Dec 06 05:43:35 crc systemd[1]: Starting Kubernetes Kubelet...
Dec 06 05:43:35 crc restorecon[4568]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 06 05:43:35 crc restorecon[4568]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc 
restorecon[4568]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 06 05:43:35 crc 
restorecon[4568]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 06 
05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 06 05:43:35 crc 
restorecon[4568]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 06 05:43:35 crc 
restorecon[4568]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35
crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 
05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 06 05:43:35 crc 
restorecon[4568]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc 
restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 05:43:35 crc restorecon[4568]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc 
restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 
06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 
crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc 
restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc 
restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc 
restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:35 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc 
restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 05:43:36 crc restorecon[4568]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 05:43:36 crc 
restorecon[4568]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 05:43:36 crc restorecon[4568]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 06 05:43:36 crc restorecon[4568]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 06 05:43:36 crc restorecon[4568]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Dec 06 05:43:36 crc kubenswrapper[4733]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 06 05:43:36 crc kubenswrapper[4733]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 06 05:43:36 crc kubenswrapper[4733]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 06 05:43:36 crc kubenswrapper[4733]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Dec 06 05:43:36 crc kubenswrapper[4733]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 06 05:43:36 crc kubenswrapper[4733]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.337871 4733 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.340888 4733 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.340909 4733 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.340914 4733 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.340918 4733 feature_gate.go:330] unrecognized feature gate: Example Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.340923 4733 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.340926 4733 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.340932 4733 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.340937 4733 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.340942 4733 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.340947 4733 feature_gate.go:330] unrecognized 
feature gate: GCPLabelsTags Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.340951 4733 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.340954 4733 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.340958 4733 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.340961 4733 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.340965 4733 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.340968 4733 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.340972 4733 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.340980 4733 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.340985 4733 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.340989 4733 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.340992 4733 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.340996 4733 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.340999 4733 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.341002 4733 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.341006 4733 
feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.341009 4733 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.341013 4733 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.341016 4733 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.341019 4733 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.341023 4733 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.341027 4733 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.341030 4733 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.341035 4733 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.341039 4733 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.341042 4733 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.341046 4733 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.341049 4733 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.341053 4733 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.341056 4733 
feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.341059 4733 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.341063 4733 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.341071 4733 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.341075 4733 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.341078 4733 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.341082 4733 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.341086 4733 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.341089 4733 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.341092 4733 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.341095 4733 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.341099 4733 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.341103 4733 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.341106 4733 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.341109 4733 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.341113 4733 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.341116 4733 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.341119 4733 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.341122 4733 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.341125 4733 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.341130 4733 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.341134 4733 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.341137 4733 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.341140 4733 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.341144 4733 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.341147 4733 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.341150 4733 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.341153 4733 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.341157 4733 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.341160 4733 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.341163 4733 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.341167 4733 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.341171 4733 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341268 4733 flags.go:64] FLAG: --address="0.0.0.0" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341277 4733 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341287 4733 flags.go:64] FLAG: --anonymous-auth="true" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341292 4733 flags.go:64] FLAG: --application-metrics-count-limit="100" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341297 4733 flags.go:64] FLAG: --authentication-token-webhook="false" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341316 4733 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341322 4733 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341327 4733 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341331 4733 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341336 4733 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341341 4733 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341345 4733 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341349 4733 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341353 4733 flags.go:64] FLAG: --cgroup-root="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341356 4733 flags.go:64] FLAG: --cgroups-per-qos="true" Dec 06 05:43:36 
crc kubenswrapper[4733]: I1206 05:43:36.341360 4733 flags.go:64] FLAG: --client-ca-file="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341363 4733 flags.go:64] FLAG: --cloud-config="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341367 4733 flags.go:64] FLAG: --cloud-provider="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341371 4733 flags.go:64] FLAG: --cluster-dns="[]" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341375 4733 flags.go:64] FLAG: --cluster-domain="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341379 4733 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341383 4733 flags.go:64] FLAG: --config-dir="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341387 4733 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341391 4733 flags.go:64] FLAG: --container-log-max-files="5" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341395 4733 flags.go:64] FLAG: --container-log-max-size="10Mi" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341398 4733 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341402 4733 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341406 4733 flags.go:64] FLAG: --containerd-namespace="k8s.io" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341410 4733 flags.go:64] FLAG: --contention-profiling="false" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341414 4733 flags.go:64] FLAG: --cpu-cfs-quota="true" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341418 4733 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341423 4733 flags.go:64] FLAG: --cpu-manager-policy="none" Dec 06 05:43:36 crc 
kubenswrapper[4733]: I1206 05:43:36.341427 4733 flags.go:64] FLAG: --cpu-manager-policy-options="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341431 4733 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341436 4733 flags.go:64] FLAG: --enable-controller-attach-detach="true" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341441 4733 flags.go:64] FLAG: --enable-debugging-handlers="true" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341445 4733 flags.go:64] FLAG: --enable-load-reader="false" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341449 4733 flags.go:64] FLAG: --enable-server="true" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341453 4733 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341458 4733 flags.go:64] FLAG: --event-burst="100" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341462 4733 flags.go:64] FLAG: --event-qps="50" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341466 4733 flags.go:64] FLAG: --event-storage-age-limit="default=0" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341470 4733 flags.go:64] FLAG: --event-storage-event-limit="default=0" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341474 4733 flags.go:64] FLAG: --eviction-hard="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341479 4733 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341483 4733 flags.go:64] FLAG: --eviction-minimum-reclaim="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341487 4733 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341491 4733 flags.go:64] FLAG: --eviction-soft="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341495 4733 flags.go:64] FLAG: --eviction-soft-grace-period="" Dec 06 
05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341498 4733 flags.go:64] FLAG: --exit-on-lock-contention="false" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341503 4733 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341506 4733 flags.go:64] FLAG: --experimental-mounter-path="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341510 4733 flags.go:64] FLAG: --fail-cgroupv1="false" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341514 4733 flags.go:64] FLAG: --fail-swap-on="true" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341518 4733 flags.go:64] FLAG: --feature-gates="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341522 4733 flags.go:64] FLAG: --file-check-frequency="20s" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341526 4733 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341530 4733 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341535 4733 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341539 4733 flags.go:64] FLAG: --healthz-port="10248" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341543 4733 flags.go:64] FLAG: --help="false" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341547 4733 flags.go:64] FLAG: --hostname-override="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341552 4733 flags.go:64] FLAG: --housekeeping-interval="10s" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341556 4733 flags.go:64] FLAG: --http-check-frequency="20s" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341560 4733 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341565 4733 flags.go:64] FLAG: --image-credential-provider-config="" Dec 06 
05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341570 4733 flags.go:64] FLAG: --image-gc-high-threshold="85" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341582 4733 flags.go:64] FLAG: --image-gc-low-threshold="80" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341587 4733 flags.go:64] FLAG: --image-service-endpoint="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341592 4733 flags.go:64] FLAG: --kernel-memcg-notification="false" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341597 4733 flags.go:64] FLAG: --kube-api-burst="100" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341601 4733 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341605 4733 flags.go:64] FLAG: --kube-api-qps="50" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341609 4733 flags.go:64] FLAG: --kube-reserved="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341613 4733 flags.go:64] FLAG: --kube-reserved-cgroup="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341616 4733 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341620 4733 flags.go:64] FLAG: --kubelet-cgroups="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341624 4733 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341627 4733 flags.go:64] FLAG: --lock-file="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341631 4733 flags.go:64] FLAG: --log-cadvisor-usage="false" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341635 4733 flags.go:64] FLAG: --log-flush-frequency="5s" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341638 4733 flags.go:64] FLAG: --log-json-info-buffer-size="0" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341644 4733 flags.go:64] FLAG: --log-json-split-stream="false" Dec 06 
05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341647 4733 flags.go:64] FLAG: --log-text-info-buffer-size="0" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341651 4733 flags.go:64] FLAG: --log-text-split-stream="false" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341655 4733 flags.go:64] FLAG: --logging-format="text" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341658 4733 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341662 4733 flags.go:64] FLAG: --make-iptables-util-chains="true" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341666 4733 flags.go:64] FLAG: --manifest-url="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341669 4733 flags.go:64] FLAG: --manifest-url-header="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341678 4733 flags.go:64] FLAG: --max-housekeeping-interval="15s" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341682 4733 flags.go:64] FLAG: --max-open-files="1000000" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341687 4733 flags.go:64] FLAG: --max-pods="110" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341691 4733 flags.go:64] FLAG: --maximum-dead-containers="-1" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341696 4733 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341700 4733 flags.go:64] FLAG: --memory-manager-policy="None" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341704 4733 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341708 4733 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341712 4733 flags.go:64] FLAG: --node-ip="192.168.126.11" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341716 4733 flags.go:64] FLAG: 
--node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341725 4733 flags.go:64] FLAG: --node-status-max-images="50" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341730 4733 flags.go:64] FLAG: --node-status-update-frequency="10s" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341733 4733 flags.go:64] FLAG: --oom-score-adj="-999" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341737 4733 flags.go:64] FLAG: --pod-cidr="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341741 4733 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341747 4733 flags.go:64] FLAG: --pod-manifest-path="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341754 4733 flags.go:64] FLAG: --pod-max-pids="-1" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341759 4733 flags.go:64] FLAG: --pods-per-core="0" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341762 4733 flags.go:64] FLAG: --port="10250" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341767 4733 flags.go:64] FLAG: --protect-kernel-defaults="false" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341771 4733 flags.go:64] FLAG: --provider-id="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341774 4733 flags.go:64] FLAG: --qos-reserved="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341778 4733 flags.go:64] FLAG: --read-only-port="10255" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341782 4733 flags.go:64] FLAG: --register-node="true" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341786 4733 flags.go:64] FLAG: --register-schedulable="true" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341790 4733 flags.go:64] FLAG: 
--register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341796 4733 flags.go:64] FLAG: --registry-burst="10" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341800 4733 flags.go:64] FLAG: --registry-qps="5" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341804 4733 flags.go:64] FLAG: --reserved-cpus="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341808 4733 flags.go:64] FLAG: --reserved-memory="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341812 4733 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341816 4733 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341820 4733 flags.go:64] FLAG: --rotate-certificates="false" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341827 4733 flags.go:64] FLAG: --rotate-server-certificates="false" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341830 4733 flags.go:64] FLAG: --runonce="false" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341834 4733 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341838 4733 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341842 4733 flags.go:64] FLAG: --seccomp-default="false" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341846 4733 flags.go:64] FLAG: --serialize-image-pulls="true" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341850 4733 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341854 4733 flags.go:64] FLAG: --storage-driver-db="cadvisor" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341857 4733 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341861 
4733 flags.go:64] FLAG: --storage-driver-password="root" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341865 4733 flags.go:64] FLAG: --storage-driver-secure="false" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341869 4733 flags.go:64] FLAG: --storage-driver-table="stats" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341872 4733 flags.go:64] FLAG: --storage-driver-user="root" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341876 4733 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341880 4733 flags.go:64] FLAG: --sync-frequency="1m0s" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341883 4733 flags.go:64] FLAG: --system-cgroups="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341887 4733 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341893 4733 flags.go:64] FLAG: --system-reserved-cgroup="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341897 4733 flags.go:64] FLAG: --tls-cert-file="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341901 4733 flags.go:64] FLAG: --tls-cipher-suites="[]" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341905 4733 flags.go:64] FLAG: --tls-min-version="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341909 4733 flags.go:64] FLAG: --tls-private-key-file="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341912 4733 flags.go:64] FLAG: --topology-manager-policy="none" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341916 4733 flags.go:64] FLAG: --topology-manager-policy-options="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341920 4733 flags.go:64] FLAG: --topology-manager-scope="container" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341925 4733 flags.go:64] FLAG: --v="2" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341931 4733 
flags.go:64] FLAG: --version="false" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341936 4733 flags.go:64] FLAG: --vmodule="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341941 4733 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.341946 4733 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342049 4733 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342054 4733 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342059 4733 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342063 4733 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342066 4733 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342070 4733 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342075 4733 feature_gate.go:330] unrecognized feature gate: Example Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342079 4733 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342083 4733 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342087 4733 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342090 4733 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342093 4733 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342097 4733 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342100 4733 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342103 4733 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342106 4733 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342109 4733 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342112 4733 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342115 4733 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342119 4733 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342123 4733 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342127 4733 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342130 4733 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342134 4733 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342138 4733 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342142 4733 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342145 4733 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342149 4733 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342152 4733 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342155 4733 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342158 4733 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342161 4733 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342164 4733 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342167 4733 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342172 4733 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 
05:43:36.342175 4733 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342178 4733 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342181 4733 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342185 4733 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342189 4733 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342192 4733 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342195 4733 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342198 4733 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342201 4733 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342204 4733 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342207 4733 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342211 4733 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342214 4733 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342217 4733 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342220 4733 feature_gate.go:330] unrecognized 
feature gate: SigstoreImageVerification
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342223 4733 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342227 4733 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342230 4733 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342233 4733 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342236 4733 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342239 4733 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342243 4733 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342246 4733 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342250 4733 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342253 4733 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342256 4733 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342260 4733 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342264 4733 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342268 4733 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342273 4733 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342277 4733 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342282 4733 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342286 4733 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342290 4733 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342296 4733 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.342324 4733 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.342336 4733 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.349833 4733 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.349880 4733 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350026 4733 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350035 4733 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350043 4733 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350051 4733 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350056 4733 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350062 4733 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350067 4733 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350071 4733 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350075 4733 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350079 4733 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350085 4733 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350090 4733 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350102 4733 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350107 4733 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350112 4733 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350118 4733 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350128 4733 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350134 4733 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350143 4733 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350152 4733 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350159 4733 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350162 4733 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350168 4733 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350172 4733 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350179 4733 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350183 4733 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350194 4733 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350197 4733 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350202 4733 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350206 4733 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350210 4733 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350215 4733 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350221 4733 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350224 4733 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350229 4733 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350233 4733 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350239 4733 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350243 4733 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350247 4733 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350250 4733 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350255 4733 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350258 4733 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350264 4733 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350268 4733 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350272 4733 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350275 4733 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350279 4733 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350283 4733 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350287 4733 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350294 4733 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350298 4733 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350316 4733 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350320 4733 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350324 4733 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350327 4733 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350334 4733 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350338 4733 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350342 4733 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350345 4733 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350348 4733 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350352 4733 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350356 4733 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350363 4733 feature_gate.go:330] unrecognized feature gate: Example
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350367 4733 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350371 4733 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350375 4733 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350380 4733 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350383 4733 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350388 4733 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350392 4733 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350396 4733 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.350406 4733 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350636 4733 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350646 4733 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350652 4733 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.350656 4733 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.355482 4733 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.355500 4733 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.355512 4733 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.355517 4733 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.355524 4733 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.355528 4733 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.355534 4733 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.355542 4733 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.355546 4733 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.355550 4733 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.355557 4733 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.355561 4733 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.355566 4733 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.355573 4733 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.355586 4733 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.355592 4733 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.355597 4733 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.355602 4733 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.355606 4733 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.355609 4733 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.355615 4733 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.355619 4733 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.355624 4733 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.355628 4733 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.355632 4733 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.355638 4733 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.355646 4733 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.355651 4733 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.355656 4733 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.355660 4733 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.355668 4733 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.355673 4733 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.355678 4733 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.355682 4733 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.355685 4733 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.355690 4733 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.355693 4733 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.355697 4733 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.355703 4733 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.355706 4733 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.355710 4733 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.355714 4733 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.355719 4733 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.355724 4733 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.355729 4733 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.355733 4733 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.355736 4733 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.355740 4733 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.355744 4733 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.355748 4733 feature_gate.go:330] unrecognized feature gate: Example
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.355753 4733 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.355757 4733 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.355762 4733 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.355767 4733 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.355772 4733 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.355776 4733 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.355780 4733 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.355784 4733 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.355788 4733 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.355792 4733 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.355795 4733 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.355800 4733 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.355805 4733 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.355811 4733 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.355815 4733 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.355818 4733 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.355822 4733 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.355829 4733 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.356134 4733 server.go:940] "Client rotation is on, will bootstrap in background"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.360509 4733 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.362774 4733 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.363676 4733 server.go:997] "Starting client certificate rotation"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.363704 4733 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.364430 4733 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2026-01-07 06:34:49.587539332 +0000 UTC
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.364512 4733 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 768h51m13.223030111s for next certificate rotation
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.376588 4733 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.378116 4733 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.387911 4733 log.go:25] "Validated CRI v1 runtime API"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.403938 4733 log.go:25] "Validated CRI v1 image API"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.405368 4733 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.408386 4733 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-06-05-40-32-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.408412 4733 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:49 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/containers/storage/overlay-containers/75d81934760b26101869fbd8e4b5954c62b019c1cc3e5a0c9f82ed8de46b3b22/userdata/shm:{mountpoint:/var/lib/containers/storage/overlay-containers/75d81934760b26101869fbd8e4b5954c62b019c1cc3e5a0c9f82ed8de46b3b22/userdata/shm major:0 minor:42 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:50 fsType:tmpfs blockSize:0} overlay_0-43:{mountpoint:/var/lib/containers/storage/overlay/94b752e0a51c0134b00ddef6dc7a933a9d7c1d9bdc88a18dae4192a0d557d623/merged major:0 minor:43 fsType:overlay blockSize:0}]
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.420539 4733 manager.go:217] Machine: {Timestamp:2025-12-06 05:43:36.418969044 +0000 UTC m=+0.284180176 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2445404 MemoryCapacity:33654120448 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:4b0d62b0-e895-479e-b261-2bd12b349187 BootID:6951a1f4-5aff-463d-98ee-6da28494341b Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827060224 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/var/lib/containers/storage/overlay-containers/75d81934760b26101869fbd8e4b5954c62b019c1cc3e5a0c9f82ed8de46b3b22/userdata/shm DeviceMajor:0 DeviceMinor:42 Capacity:65536000 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-43 DeviceMajor:0 DeviceMinor:43 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:49 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:50 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:72:c9:30 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:enp3s0 MacAddress:fa:16:3e:72:c9:30 Speed:-1 Mtu:1500} {Name:enp7s0 MacAddress:fa:16:3e:31:ce:ab Speed:-1 Mtu:1440} {Name:enp7s0.20 MacAddress:52:54:00:39:50:05 Speed:-1 Mtu:1436} {Name:enp7s0.21 MacAddress:52:54:00:18:f8:27 Speed:-1 Mtu:1436} {Name:enp7s0.22 MacAddress:52:54:00:90:f1:d7 Speed:-1 Mtu:1436} {Name:eth10 MacAddress:56:d2:1b:04:c0:7b Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:1e:4c:69:6d:c7:5d Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654120448 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:65536 Type:Data Level:1} {Id:0 Size:65536 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:65536 Type:Data Level:1} {Id:1 Size:65536 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:65536 Type:Data Level:1} {Id:10 Size:65536 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:65536 Type:Data Level:1} {Id:11 Size:65536 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:65536 Type:Data Level:1} {Id:2 Size:65536 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:65536 Type:Data Level:1} {Id:3 Size:65536 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:65536 Type:Data Level:1} {Id:4 Size:65536 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:65536 Type:Data Level:1} {Id:5 Size:65536 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:65536 Type:Data Level:1} {Id:6 Size:65536 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:65536 Type:Data Level:1} {Id:7 Size:65536 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:65536 Type:Data Level:1} {Id:8 Size:65536 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:65536 Type:Data Level:1} {Id:9 Size:65536 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.420717 4733 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.420814 4733 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.421859 4733 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.422034 4733 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.422077 4733 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.422533 4733 topology_manager.go:138] "Creating topology manager with none policy"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.422552 4733 container_manager_linux.go:303] "Creating device plugin manager"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.422848 4733 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.422868 4733 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.423046 4733 state_mem.go:36] "Initialized new in-memory state store"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.423285 4733 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.424954 4733 kubelet.go:418] "Attempting to sync node with API server"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.424977 4733 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.425002 4733 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.425013 4733 kubelet.go:324] "Adding apiserver pod source"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.425025 4733 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.429418 4733 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.25.211:6443: connect: connection refused
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.429421 4733 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 192.168.25.211:6443: connect: connection refused
Dec 06 05:43:36 crc kubenswrapper[4733]: E1206 05:43:36.429975 4733 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.25.211:6443: connect: connection refused" logger="UnhandledError"
Dec 06 05:43:36 crc kubenswrapper[4733]: E1206 05:43:36.429977 4733 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 192.168.25.211:6443: connect: connection refused" logger="UnhandledError"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.430752 4733 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.431267 4733 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.432362 4733 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.433151 4733 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.433175 4733 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.433184 4733 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.433192 4733 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.433203 4733 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.433211 4733 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.433220 4733 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.433231 4733 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.433239 4733 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.433247 4733 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.433259 4733 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.433266 4733 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.434224 4733 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/csi" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.434651 4733 server.go:1280] "Started kubelet" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.435295 4733 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.435494 4733 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 192.168.25.211:6443: connect: connection refused Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.435494 4733 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 06 05:43:36 crc systemd[1]: Started Kubernetes Kubelet. Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.435878 4733 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.437503 4733 server.go:460] "Adding debug handlers to kubelet server" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.438375 4733 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.438439 4733 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.438724 4733 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 16:58:20.181190543 +0000 UTC Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.438762 4733 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 731h14m43.742429742s for next certificate rotation Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.439007 4733 volume_manager.go:287] "The desired_state_of_world populator starts" Dec 06 05:43:36 crc 
kubenswrapper[4733]: I1206 05:43:36.439083 4733 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.439086 4733 volume_manager.go:289] "Starting Kubelet Volume Manager" Dec 06 05:43:36 crc kubenswrapper[4733]: E1206 05:43:36.439061 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.439944 4733 factory.go:153] Registering CRI-O factory Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.440010 4733 factory.go:221] Registration of the crio container factory successfully Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.440111 4733 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.440171 4733 factory.go:55] Registering systemd factory Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.440214 4733 factory.go:221] Registration of the systemd container factory successfully Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.440278 4733 factory.go:103] Registering Raw factory Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.440347 4733 manager.go:1196] Started watching for new ooms in manager Dec 06 05:43:36 crc kubenswrapper[4733]: E1206 05:43:36.439788 4733 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 192.168.25.211:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187e89fb4f372f2b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting 
kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-06 05:43:36.434626347 +0000 UTC m=+0.299837458,LastTimestamp:2025-12-06 05:43:36.434626347 +0000 UTC m=+0.299837458,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 06 05:43:36 crc kubenswrapper[4733]: E1206 05:43:36.440665 4733 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.211:6443: connect: connection refused" interval="200ms" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.443777 4733 manager.go:319] Starting recovery of all containers Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.447251 4733 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.25.211:6443: connect: connection refused Dec 06 05:43:36 crc kubenswrapper[4733]: E1206 05:43:36.447555 4733 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.25.211:6443: connect: connection refused" logger="UnhandledError" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.452659 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.452698 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.452712 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.452724 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.452736 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.452746 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.452757 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.452767 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.452780 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.452792 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.452802 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.452812 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.452823 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.452834 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" 
volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.453636 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.453734 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.453760 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.453775 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.453788 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.453800 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" 
seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.453813 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.453825 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.453839 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.454178 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.454199 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.454213 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 
05:43:36.455057 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.455103 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.455143 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.455154 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.455168 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.455179 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.455195 4733 reconstruct.go:130] "Volume 
is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.455208 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.455221 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.455235 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.455248 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.455259 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.455270 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.455281 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.455291 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.455334 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.455346 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.455359 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.455371 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.455389 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.455399 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.455410 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.455423 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.455434 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.455445 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.455456 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.455474 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.455486 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.455497 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.455510 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.455521 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" 
volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.455531 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.455540 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.455551 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.455561 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.455570 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.455591 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" 
volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.455601 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.455613 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.455624 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.455670 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.455684 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.455697 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" 
seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.455709 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.455720 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.455730 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.455741 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.455752 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.455767 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.455778 4733 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.455787 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.455796 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.455807 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.455819 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.455830 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.455841 4733 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.455851 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.455861 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.455873 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.455885 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.455894 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.455906 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" 
volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.455916 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.455926 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.455937 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.455947 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.455957 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.455982 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" 
volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.455993 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.456004 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.456017 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.456028 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.456041 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.456051 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.456061 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.456072 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.456083 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.456094 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.456112 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.456124 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" 
volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.456138 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.456150 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.456161 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.456176 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.456188 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.456198 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.456210 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.456221 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.456232 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.456244 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.456254 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.456264 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" 
seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.456273 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.456284 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.456294 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.456317 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.456330 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.456340 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.456350 4733 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.456362 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.456373 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.456384 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.456397 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.457394 4733 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Dec 06 05:43:36 crc 
kubenswrapper[4733]: I1206 05:43:36.457430 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.457449 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.457466 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.457484 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.457498 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.457511 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.457526 4733 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.457540 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.457564 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.457601 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.457615 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.457627 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.457643 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.457655 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.457674 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.457688 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.457706 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.457721 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.457735 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.457748 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.457765 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.457782 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.457795 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.457807 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.457819 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" 
seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.457831 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.457845 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.457856 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.457867 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.457880 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.457893 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.457906 
4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.457920 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.457932 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.457944 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.457957 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.457969 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.457981 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.457994 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.458007 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.458025 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.458037 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.458052 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.458065 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.458077 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.458089 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.458103 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.458115 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.458127 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.458139 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.458153 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.458167 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.458185 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.458201 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.458226 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.458239 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.458250 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.458264 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.458275 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.458289 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.458327 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.458345 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.458357 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.458369 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.458382 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.458394 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.458407 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.458419 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.458430 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.458445 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.458457 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.458470 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.458483 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.458496 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.458507 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.458519 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.458531 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.458542 4733 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.458553 4733 reconstruct.go:97] "Volume reconstruction finished"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.458561 4733 reconciler.go:26] "Reconciler: start to sync state"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.463907 4733 manager.go:324] Recovery completed
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.476108 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.478695 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.478756 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.478771 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.479953 4733 cpu_manager.go:225] "Starting CPU manager" policy="none"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.479968 4733 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.480005 4733 state_mem.go:36] "Initialized new in-memory state store"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.480685 4733 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.483193 4733 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.483226 4733 status_manager.go:217] "Starting to sync pod status with apiserver"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.483523 4733 kubelet.go:2335] "Starting kubelet main sync loop"
Dec 06 05:43:36 crc kubenswrapper[4733]: E1206 05:43:36.483598 4733 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.486083 4733 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.25.211:6443: connect: connection refused
Dec 06 05:43:36 crc kubenswrapper[4733]: E1206 05:43:36.486195 4733 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.25.211:6443: connect: connection refused" logger="UnhandledError"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.486434 4733 policy_none.go:49] "None policy: Start"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.489904 4733 memory_manager.go:170] "Starting memorymanager" policy="None"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.489956 4733 state_mem.go:35] "Initializing new in-memory state store"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.534139 4733 manager.go:334] "Starting Device Plugin manager"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.534390 4733 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.534406 4733 server.go:79] "Starting device plugin registration server"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.534773 4733 eviction_manager.go:189] "Eviction manager: starting control loop"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.534792 4733 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.535106 4733 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.535225 4733 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.535240 4733 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Dec 06 05:43:36 crc kubenswrapper[4733]: E1206 05:43:36.540238 4733 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.584160 4733 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"]
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.584315 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.585105 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.585137 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.585147 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.585263 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.585467 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.585501 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.585872 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.585895 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.585905 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.586004 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.586052 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.586074 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.586111 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.586203 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.586248 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.586488 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.586509 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.586520 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.586632 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.586714 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.586743 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.586937 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.586968 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.586979 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.587291 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.587329 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.587339 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.587375 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.587404 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.587417 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.587594 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.587670 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.587701 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.588392 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.588421 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.588399 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.588433 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.588442 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.588452 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.588644 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.588674 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.589286 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.589330 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.589343 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.635097 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.635840 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.635871 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.635882 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.635911 4733 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Dec 06 05:43:36 crc kubenswrapper[4733]: E1206 05:43:36.637206 4733 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.25.211:6443: connect: connection refused" node="crc"
Dec 06 05:43:36 crc kubenswrapper[4733]: E1206 05:43:36.641406 4733 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.211:6443: connect: connection refused" interval="400ms"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.660918 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.660957 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.660981 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.661003 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.661022 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.661041 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.661061 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.661078 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.661097 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.661117 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.661133 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.661157 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.661174 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.661190 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.661208 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.762136 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.762320 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.762430 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.762492 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.762603 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.762503 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.762442 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.762342 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.762755 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.762907 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.762917 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.762981 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.763039 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.763093 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.763118 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.763145 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 06 05:43:36 crc kubenswrapper[4733]: I1206
05:43:36.763168 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.763190 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.763191 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.763217 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.763225 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.763195 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.763244 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.763262 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.763283 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.763291 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.763270 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.763338 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod 
\"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.763295 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.763546 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.837803 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.838805 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.838840 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.838851 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.838890 4733 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 06 05:43:36 crc kubenswrapper[4733]: E1206 05:43:36.841643 4733 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.25.211:6443: connect: connection refused" node="crc" Dec 06 
05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.908812 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.914186 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.931769 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-f566a0488035dbd0190bcb2e49b8cef348e30a009cd854788345d7875ddb59a4 WatchSource:0}: Error finding container f566a0488035dbd0190bcb2e49b8cef348e30a009cd854788345d7875ddb59a4: Status 404 returned error can't find the container with id f566a0488035dbd0190bcb2e49b8cef348e30a009cd854788345d7875ddb59a4 Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.934406 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-2376c9e544a3ceafa6c3b092642851a5c8ae4e11ed1b676c4e2f15bc218cb446 WatchSource:0}: Error finding container 2376c9e544a3ceafa6c3b092642851a5c8ae4e11ed1b676c4e2f15bc218cb446: Status 404 returned error can't find the container with id 2376c9e544a3ceafa6c3b092642851a5c8ae4e11ed1b676c4e2f15bc218cb446 Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.937559 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.944741 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.959074 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-df0364d45e97f68a4fad130e53ad9f658552a5fc212d58c09704ee13d0ca2726 WatchSource:0}: Error finding container df0364d45e97f68a4fad130e53ad9f658552a5fc212d58c09704ee13d0ca2726: Status 404 returned error can't find the container with id df0364d45e97f68a4fad130e53ad9f658552a5fc212d58c09704ee13d0ca2726 Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.960127 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-87328a5c6804f620795995d6ee583271f0569ae00463b53b60a132c5769a846d WatchSource:0}: Error finding container 87328a5c6804f620795995d6ee583271f0569ae00463b53b60a132c5769a846d: Status 404 returned error can't find the container with id 87328a5c6804f620795995d6ee583271f0569ae00463b53b60a132c5769a846d Dec 06 05:43:36 crc kubenswrapper[4733]: I1206 05:43:36.961440 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 06 05:43:36 crc kubenswrapper[4733]: W1206 05:43:36.974395 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-0225af603e61b322351fa273402fb5e2e98f96e2073ffd767746e559757b90ad WatchSource:0}: Error finding container 0225af603e61b322351fa273402fb5e2e98f96e2073ffd767746e559757b90ad: Status 404 returned error can't find the container with id 0225af603e61b322351fa273402fb5e2e98f96e2073ffd767746e559757b90ad Dec 06 05:43:37 crc kubenswrapper[4733]: E1206 05:43:37.042920 4733 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.211:6443: connect: connection refused" interval="800ms" Dec 06 05:43:37 crc kubenswrapper[4733]: I1206 05:43:37.243704 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 05:43:37 crc kubenswrapper[4733]: I1206 05:43:37.244805 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:43:37 crc kubenswrapper[4733]: I1206 05:43:37.244852 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:43:37 crc kubenswrapper[4733]: I1206 05:43:37.244865 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:43:37 crc kubenswrapper[4733]: I1206 05:43:37.244895 4733 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 06 05:43:37 crc kubenswrapper[4733]: E1206 05:43:37.245498 4733 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.25.211:6443: 
connect: connection refused" node="crc" Dec 06 05:43:37 crc kubenswrapper[4733]: W1206 05:43:37.330844 4733 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.25.211:6443: connect: connection refused Dec 06 05:43:37 crc kubenswrapper[4733]: E1206 05:43:37.330940 4733 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.25.211:6443: connect: connection refused" logger="UnhandledError" Dec 06 05:43:37 crc kubenswrapper[4733]: I1206 05:43:37.436704 4733 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 192.168.25.211:6443: connect: connection refused Dec 06 05:43:37 crc kubenswrapper[4733]: I1206 05:43:37.489753 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6cd2bcad3ce23a8998a578ecc373a4e8028eefab1e056cf1081eb2406ff9398f"} Dec 06 05:43:37 crc kubenswrapper[4733]: I1206 05:43:37.490027 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"87328a5c6804f620795995d6ee583271f0569ae00463b53b60a132c5769a846d"} Dec 06 05:43:37 crc kubenswrapper[4733]: I1206 05:43:37.491459 4733 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f" exitCode=0 Dec 06 05:43:37 
crc kubenswrapper[4733]: I1206 05:43:37.491572 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f"} Dec 06 05:43:37 crc kubenswrapper[4733]: I1206 05:43:37.491644 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"df0364d45e97f68a4fad130e53ad9f658552a5fc212d58c09704ee13d0ca2726"} Dec 06 05:43:37 crc kubenswrapper[4733]: I1206 05:43:37.491809 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 05:43:37 crc kubenswrapper[4733]: I1206 05:43:37.493101 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:43:37 crc kubenswrapper[4733]: I1206 05:43:37.493158 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:43:37 crc kubenswrapper[4733]: I1206 05:43:37.493170 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:43:37 crc kubenswrapper[4733]: I1206 05:43:37.493562 4733 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="c3ab77ae3d8e58a31bedfcc49dffc42859bba5cb5615470a1f98999d53f844fa" exitCode=0 Dec 06 05:43:37 crc kubenswrapper[4733]: I1206 05:43:37.493621 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"c3ab77ae3d8e58a31bedfcc49dffc42859bba5cb5615470a1f98999d53f844fa"} Dec 06 05:43:37 crc kubenswrapper[4733]: I1206 05:43:37.493660 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f566a0488035dbd0190bcb2e49b8cef348e30a009cd854788345d7875ddb59a4"} Dec 06 05:43:37 crc kubenswrapper[4733]: I1206 05:43:37.493807 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 05:43:37 crc kubenswrapper[4733]: I1206 05:43:37.494667 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:43:37 crc kubenswrapper[4733]: I1206 05:43:37.494703 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:43:37 crc kubenswrapper[4733]: I1206 05:43:37.494714 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:43:37 crc kubenswrapper[4733]: I1206 05:43:37.495292 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 05:43:37 crc kubenswrapper[4733]: I1206 05:43:37.496013 4733 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="ef5d06b0816bbcb1c63a0598cd8cc1175582cb8072f620edac05a2f115fd7f75" exitCode=0 Dec 06 05:43:37 crc kubenswrapper[4733]: I1206 05:43:37.496065 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"ef5d06b0816bbcb1c63a0598cd8cc1175582cb8072f620edac05a2f115fd7f75"} Dec 06 05:43:37 crc kubenswrapper[4733]: I1206 05:43:37.496084 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"2376c9e544a3ceafa6c3b092642851a5c8ae4e11ed1b676c4e2f15bc218cb446"} Dec 06 05:43:37 crc kubenswrapper[4733]: I1206 05:43:37.496103 4733 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:43:37 crc kubenswrapper[4733]: I1206 05:43:37.496123 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:43:37 crc kubenswrapper[4733]: I1206 05:43:37.496133 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:43:37 crc kubenswrapper[4733]: I1206 05:43:37.496154 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 05:43:37 crc kubenswrapper[4733]: I1206 05:43:37.496791 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:43:37 crc kubenswrapper[4733]: I1206 05:43:37.496820 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:43:37 crc kubenswrapper[4733]: I1206 05:43:37.496832 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:43:37 crc kubenswrapper[4733]: I1206 05:43:37.502534 4733 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="7acc4267cfa0a489d59bdc4c37f12356e6a053e6cd477a87a38816bf71539ce1" exitCode=0 Dec 06 05:43:37 crc kubenswrapper[4733]: I1206 05:43:37.502589 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"7acc4267cfa0a489d59bdc4c37f12356e6a053e6cd477a87a38816bf71539ce1"} Dec 06 05:43:37 crc kubenswrapper[4733]: I1206 05:43:37.502679 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"0225af603e61b322351fa273402fb5e2e98f96e2073ffd767746e559757b90ad"} Dec 06 05:43:37 crc kubenswrapper[4733]: I1206 05:43:37.502827 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 05:43:37 crc kubenswrapper[4733]: I1206 05:43:37.503762 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:43:37 crc kubenswrapper[4733]: I1206 05:43:37.503793 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:43:37 crc kubenswrapper[4733]: I1206 05:43:37.503807 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:43:37 crc kubenswrapper[4733]: W1206 05:43:37.636902 4733 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.25.211:6443: connect: connection refused Dec 06 05:43:37 crc kubenswrapper[4733]: E1206 05:43:37.637000 4733 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.25.211:6443: connect: connection refused" logger="UnhandledError" Dec 06 05:43:37 crc kubenswrapper[4733]: W1206 05:43:37.653722 4733 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.25.211:6443: connect: connection refused Dec 06 05:43:37 crc kubenswrapper[4733]: E1206 05:43:37.653766 4733 reflector.go:158] 
"Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.25.211:6443: connect: connection refused" logger="UnhandledError" Dec 06 05:43:37 crc kubenswrapper[4733]: E1206 05:43:37.843802 4733 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.211:6443: connect: connection refused" interval="1.6s" Dec 06 05:43:37 crc kubenswrapper[4733]: W1206 05:43:37.902456 4733 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 192.168.25.211:6443: connect: connection refused Dec 06 05:43:37 crc kubenswrapper[4733]: E1206 05:43:37.902574 4733 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 192.168.25.211:6443: connect: connection refused" logger="UnhandledError" Dec 06 05:43:38 crc kubenswrapper[4733]: I1206 05:43:38.046284 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 05:43:38 crc kubenswrapper[4733]: I1206 05:43:38.047549 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:43:38 crc kubenswrapper[4733]: I1206 05:43:38.047594 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:43:38 crc kubenswrapper[4733]: I1206 05:43:38.047609 4733 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:43:38 crc kubenswrapper[4733]: I1206 05:43:38.047646 4733 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 06 05:43:38 crc kubenswrapper[4733]: E1206 05:43:38.048142 4733 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.25.211:6443: connect: connection refused" node="crc" Dec 06 05:43:38 crc kubenswrapper[4733]: I1206 05:43:38.507037 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5b838411bb65919138a421cd17775561b7764a006894daa8f2bed711287c1914"} Dec 06 05:43:38 crc kubenswrapper[4733]: I1206 05:43:38.507095 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"382d71a067b68d67891c063f0a4c833b7433e15db0e05b36e46f24bbbb1626ed"} Dec 06 05:43:38 crc kubenswrapper[4733]: I1206 05:43:38.507118 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5a89503b511d9f2da9fb5e41e1adb5f5c60e14909aebd4495baafc709177fa56"} Dec 06 05:43:38 crc kubenswrapper[4733]: I1206 05:43:38.507126 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 05:43:38 crc kubenswrapper[4733]: I1206 05:43:38.508283 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:43:38 crc kubenswrapper[4733]: I1206 05:43:38.508338 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:43:38 crc 
kubenswrapper[4733]: I1206 05:43:38.508355 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:43:38 crc kubenswrapper[4733]: I1206 05:43:38.511728 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"fe65f4b55b8e8ed93d424276f1fc06f31770302538e5122a5b09da36734d86dc"} Dec 06 05:43:38 crc kubenswrapper[4733]: I1206 05:43:38.511758 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e8edc1fd8220a58b6a3f6d08d6d003c6d350fa69588866d84de63f95ecd4367f"} Dec 06 05:43:38 crc kubenswrapper[4733]: I1206 05:43:38.511770 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"57cbb938bc4ae9b8a71a1e2369a50a243964fc8c683d2d1840f1f3e199f1b923"} Dec 06 05:43:38 crc kubenswrapper[4733]: I1206 05:43:38.511783 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7eeebbb46cf11d2306ad457106c3b2179039986bfdd412c4bb64791d86edb4e0"} Dec 06 05:43:38 crc kubenswrapper[4733]: I1206 05:43:38.511792 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"32c4d87738481c8df3d76e820a98f3dacfbc11edc26fab1dfe51b56d207168d2"} Dec 06 05:43:38 crc kubenswrapper[4733]: I1206 05:43:38.511928 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 05:43:38 crc kubenswrapper[4733]: I1206 05:43:38.512772 4733 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:43:38 crc kubenswrapper[4733]: I1206 05:43:38.512819 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:43:38 crc kubenswrapper[4733]: I1206 05:43:38.512831 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:43:38 crc kubenswrapper[4733]: I1206 05:43:38.513786 4733 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="fee2559f354a15e3571e080b9192e23848c5a6ea1824c36683f8378e0c0e732a" exitCode=0 Dec 06 05:43:38 crc kubenswrapper[4733]: I1206 05:43:38.513815 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"fee2559f354a15e3571e080b9192e23848c5a6ea1824c36683f8378e0c0e732a"} Dec 06 05:43:38 crc kubenswrapper[4733]: I1206 05:43:38.513963 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 05:43:38 crc kubenswrapper[4733]: I1206 05:43:38.514938 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:43:38 crc kubenswrapper[4733]: I1206 05:43:38.514961 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:43:38 crc kubenswrapper[4733]: I1206 05:43:38.514971 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:43:38 crc kubenswrapper[4733]: I1206 05:43:38.515742 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"d78f49702fa45a467671f74e635a61e2d56cd857c2844f685e12bb1e00e70a97"} Dec 06 05:43:38 crc 
kubenswrapper[4733]: I1206 05:43:38.515831 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 05:43:38 crc kubenswrapper[4733]: I1206 05:43:38.516448 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:43:38 crc kubenswrapper[4733]: I1206 05:43:38.516473 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:43:38 crc kubenswrapper[4733]: I1206 05:43:38.516486 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:43:38 crc kubenswrapper[4733]: I1206 05:43:38.518581 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"b1a658a854294c1c7b43ab8c1bd56969065a6c630a68b2c39366fd243ebd7af0"} Dec 06 05:43:38 crc kubenswrapper[4733]: I1206 05:43:38.518643 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"0e214c308f89a818305483c9dc2980b09c41c963bd5df5c91d56a1f8e47dd8ed"} Dec 06 05:43:38 crc kubenswrapper[4733]: I1206 05:43:38.518655 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c7a98fd30a5052ebe2872dd5e1c7f44e9ed9019ad8662a687a9a9a39acce3627"} Dec 06 05:43:38 crc kubenswrapper[4733]: I1206 05:43:38.518786 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 05:43:38 crc kubenswrapper[4733]: I1206 05:43:38.519495 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:43:38 crc 
kubenswrapper[4733]: I1206 05:43:38.519529 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:43:38 crc kubenswrapper[4733]: I1206 05:43:38.519542 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:43:38 crc kubenswrapper[4733]: I1206 05:43:38.997540 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 05:43:39 crc kubenswrapper[4733]: I1206 05:43:39.444647 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 05:43:39 crc kubenswrapper[4733]: I1206 05:43:39.524894 4733 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="759fac7f01a0300c0fa50ea3580eb2c5d165026c587301e95cd328351f0d9f3e" exitCode=0 Dec 06 05:43:39 crc kubenswrapper[4733]: I1206 05:43:39.524933 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"759fac7f01a0300c0fa50ea3580eb2c5d165026c587301e95cd328351f0d9f3e"} Dec 06 05:43:39 crc kubenswrapper[4733]: I1206 05:43:39.525101 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 05:43:39 crc kubenswrapper[4733]: I1206 05:43:39.525125 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 05:43:39 crc kubenswrapper[4733]: I1206 05:43:39.525157 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 05:43:39 crc kubenswrapper[4733]: I1206 05:43:39.525133 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 05:43:39 crc kubenswrapper[4733]: I1206 05:43:39.526569 4733 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:43:39 crc kubenswrapper[4733]: I1206 05:43:39.526656 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:43:39 crc kubenswrapper[4733]: I1206 05:43:39.526730 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:43:39 crc kubenswrapper[4733]: I1206 05:43:39.526580 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:43:39 crc kubenswrapper[4733]: I1206 05:43:39.526833 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:43:39 crc kubenswrapper[4733]: I1206 05:43:39.526844 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:43:39 crc kubenswrapper[4733]: I1206 05:43:39.526610 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:43:39 crc kubenswrapper[4733]: I1206 05:43:39.526983 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:43:39 crc kubenswrapper[4733]: I1206 05:43:39.526997 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:43:39 crc kubenswrapper[4733]: I1206 05:43:39.648476 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 05:43:39 crc kubenswrapper[4733]: I1206 05:43:39.649342 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:43:39 crc kubenswrapper[4733]: I1206 05:43:39.649380 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 
05:43:39 crc kubenswrapper[4733]: I1206 05:43:39.649392 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:43:39 crc kubenswrapper[4733]: I1206 05:43:39.649420 4733 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 06 05:43:39 crc kubenswrapper[4733]: I1206 05:43:39.979355 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 05:43:40 crc kubenswrapper[4733]: I1206 05:43:40.531295 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 05:43:40 crc kubenswrapper[4733]: I1206 05:43:40.531871 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4fb6c2ecd3dd74eb05dcf65ea8e8b157bd444d45359fdfdb3dfef3f11d333069"} Dec 06 05:43:40 crc kubenswrapper[4733]: I1206 05:43:40.531927 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"465e8dad046f3c609f1c2d2c241af5bae86b7fd460e0302019f92b4343f42d03"} Dec 06 05:43:40 crc kubenswrapper[4733]: I1206 05:43:40.531941 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ddd621188f3a68efef4b8ca1ab802338241c911990f7afa423323a48ba3489e3"} Dec 06 05:43:40 crc kubenswrapper[4733]: I1206 05:43:40.531955 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"87072379b69dd4281287304200f630900135fd336a7fd03158b0faabcc31e139"} Dec 06 05:43:40 crc kubenswrapper[4733]: I1206 05:43:40.531967 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"45280459c3d964d5c1d8e3037b8cccbf0896033871a3744fbe9b84bf9755ddf0"} Dec 06 05:43:40 crc kubenswrapper[4733]: I1206 05:43:40.532044 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 05:43:40 crc kubenswrapper[4733]: I1206 05:43:40.532075 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 05:43:40 crc kubenswrapper[4733]: I1206 05:43:40.532404 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:43:40 crc kubenswrapper[4733]: I1206 05:43:40.532466 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:43:40 crc kubenswrapper[4733]: I1206 05:43:40.532484 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:43:40 crc kubenswrapper[4733]: I1206 05:43:40.532955 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:43:40 crc kubenswrapper[4733]: I1206 05:43:40.532975 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:43:40 crc kubenswrapper[4733]: I1206 05:43:40.532984 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:43:40 crc kubenswrapper[4733]: I1206 05:43:40.533487 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:43:40 crc kubenswrapper[4733]: I1206 05:43:40.533582 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:43:40 crc kubenswrapper[4733]: I1206 05:43:40.533659 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Dec 06 05:43:41 crc kubenswrapper[4733]: I1206 05:43:41.510149 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Dec 06 05:43:41 crc kubenswrapper[4733]: I1206 05:43:41.532943 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 05:43:41 crc kubenswrapper[4733]: I1206 05:43:41.533797 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:43:41 crc kubenswrapper[4733]: I1206 05:43:41.533824 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:43:41 crc kubenswrapper[4733]: I1206 05:43:41.533834 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:43:42 crc kubenswrapper[4733]: I1206 05:43:42.912203 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 05:43:42 crc kubenswrapper[4733]: I1206 05:43:42.912788 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 05:43:42 crc kubenswrapper[4733]: I1206 05:43:42.914060 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:43:42 crc kubenswrapper[4733]: I1206 05:43:42.914121 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:43:42 crc kubenswrapper[4733]: I1206 05:43:42.914135 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:43:42 crc kubenswrapper[4733]: I1206 05:43:42.985103 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 06 05:43:42 crc kubenswrapper[4733]: 
I1206 05:43:42.985345 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 05:43:42 crc kubenswrapper[4733]: I1206 05:43:42.986143 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:43:42 crc kubenswrapper[4733]: I1206 05:43:42.986175 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:43:42 crc kubenswrapper[4733]: I1206 05:43:42.986186 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:43:43 crc kubenswrapper[4733]: I1206 05:43:43.516844 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Dec 06 05:43:43 crc kubenswrapper[4733]: I1206 05:43:43.517022 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 05:43:43 crc kubenswrapper[4733]: I1206 05:43:43.518282 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:43:43 crc kubenswrapper[4733]: I1206 05:43:43.518409 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:43:43 crc kubenswrapper[4733]: I1206 05:43:43.518501 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:43:44 crc kubenswrapper[4733]: I1206 05:43:44.056452 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 05:43:44 crc kubenswrapper[4733]: I1206 05:43:44.056600 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 05:43:44 crc kubenswrapper[4733]: I1206 05:43:44.058662 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 06 05:43:44 crc kubenswrapper[4733]: I1206 05:43:44.058721 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:43:44 crc kubenswrapper[4733]: I1206 05:43:44.058752 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:43:44 crc kubenswrapper[4733]: I1206 05:43:44.066200 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 05:43:44 crc kubenswrapper[4733]: I1206 05:43:44.540777 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 05:43:44 crc kubenswrapper[4733]: I1206 05:43:44.540944 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 05:43:44 crc kubenswrapper[4733]: I1206 05:43:44.541790 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:43:44 crc kubenswrapper[4733]: I1206 05:43:44.541825 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:43:44 crc kubenswrapper[4733]: I1206 05:43:44.541835 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:43:45 crc kubenswrapper[4733]: I1206 05:43:45.542600 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 05:43:45 crc kubenswrapper[4733]: I1206 05:43:45.543631 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:43:45 crc kubenswrapper[4733]: I1206 05:43:45.543663 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:43:45 crc 
kubenswrapper[4733]: I1206 05:43:45.543681 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:43:45 crc kubenswrapper[4733]: I1206 05:43:45.912940 4733 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 06 05:43:45 crc kubenswrapper[4733]: I1206 05:43:45.913030 4733 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 06 05:43:46 crc kubenswrapper[4733]: E1206 05:43:46.540387 4733 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 06 05:43:48 crc kubenswrapper[4733]: I1206 05:43:48.170496 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 05:43:48 crc kubenswrapper[4733]: I1206 05:43:48.170621 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 05:43:48 crc kubenswrapper[4733]: I1206 05:43:48.171471 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:43:48 crc kubenswrapper[4733]: I1206 05:43:48.171492 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:43:48 crc kubenswrapper[4733]: I1206 05:43:48.171500 4733 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:43:48 crc kubenswrapper[4733]: I1206 05:43:48.436842 4733 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Dec 06 05:43:48 crc kubenswrapper[4733]: I1206 05:43:48.458688 4733 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 06 05:43:48 crc kubenswrapper[4733]: I1206 05:43:48.458742 4733 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 06 05:43:48 crc kubenswrapper[4733]: I1206 05:43:48.857447 4733 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 06 05:43:48 crc kubenswrapper[4733]: I1206 05:43:48.857515 4733 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 06 05:43:48 crc kubenswrapper[4733]: I1206 05:43:48.875741 4733 
patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 06 05:43:48 crc kubenswrapper[4733]: I1206 05:43:48.875870 4733 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 06 05:43:49 crc kubenswrapper[4733]: I1206 05:43:49.449776 4733 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 06 05:43:49 crc kubenswrapper[4733]: [+]log ok Dec 06 05:43:49 crc kubenswrapper[4733]: [+]etcd ok Dec 06 05:43:49 crc kubenswrapper[4733]: [+]poststarthook/openshift.io-api-request-count-filter ok Dec 06 05:43:49 crc kubenswrapper[4733]: [+]poststarthook/openshift.io-startkubeinformers ok Dec 06 05:43:49 crc kubenswrapper[4733]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Dec 06 05:43:49 crc kubenswrapper[4733]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Dec 06 05:43:49 crc kubenswrapper[4733]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 06 05:43:49 crc kubenswrapper[4733]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Dec 06 05:43:49 crc kubenswrapper[4733]: [+]poststarthook/generic-apiserver-start-informers ok Dec 06 05:43:49 crc kubenswrapper[4733]: [+]poststarthook/priority-and-fairness-config-consumer ok Dec 06 05:43:49 crc kubenswrapper[4733]: [+]poststarthook/priority-and-fairness-filter ok Dec 06 05:43:49 crc 
kubenswrapper[4733]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 06 05:43:49 crc kubenswrapper[4733]: [+]poststarthook/start-apiextensions-informers ok Dec 06 05:43:49 crc kubenswrapper[4733]: [+]poststarthook/start-apiextensions-controllers ok Dec 06 05:43:49 crc kubenswrapper[4733]: [+]poststarthook/crd-informer-synced ok Dec 06 05:43:49 crc kubenswrapper[4733]: [+]poststarthook/start-system-namespaces-controller ok Dec 06 05:43:49 crc kubenswrapper[4733]: [+]poststarthook/start-cluster-authentication-info-controller ok Dec 06 05:43:49 crc kubenswrapper[4733]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Dec 06 05:43:49 crc kubenswrapper[4733]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Dec 06 05:43:49 crc kubenswrapper[4733]: [+]poststarthook/start-legacy-token-tracking-controller ok Dec 06 05:43:49 crc kubenswrapper[4733]: [+]poststarthook/start-service-ip-repair-controllers ok Dec 06 05:43:49 crc kubenswrapper[4733]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Dec 06 05:43:49 crc kubenswrapper[4733]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Dec 06 05:43:49 crc kubenswrapper[4733]: [+]poststarthook/priority-and-fairness-config-producer ok Dec 06 05:43:49 crc kubenswrapper[4733]: [+]poststarthook/bootstrap-controller ok Dec 06 05:43:49 crc kubenswrapper[4733]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Dec 06 05:43:49 crc kubenswrapper[4733]: [+]poststarthook/start-kube-aggregator-informers ok Dec 06 05:43:49 crc kubenswrapper[4733]: [+]poststarthook/apiservice-status-local-available-controller ok Dec 06 05:43:49 crc kubenswrapper[4733]: [+]poststarthook/apiservice-status-remote-available-controller ok Dec 06 05:43:49 crc kubenswrapper[4733]: [+]poststarthook/apiservice-registration-controller ok Dec 06 05:43:49 crc kubenswrapper[4733]: [+]poststarthook/apiservice-wait-for-first-sync ok Dec 06 05:43:49 crc 
kubenswrapper[4733]: [+]poststarthook/apiservice-discovery-controller ok Dec 06 05:43:49 crc kubenswrapper[4733]: [+]poststarthook/kube-apiserver-autoregistration ok Dec 06 05:43:49 crc kubenswrapper[4733]: [+]autoregister-completion ok Dec 06 05:43:49 crc kubenswrapper[4733]: [+]poststarthook/apiservice-openapi-controller ok Dec 06 05:43:49 crc kubenswrapper[4733]: [+]poststarthook/apiservice-openapiv3-controller ok Dec 06 05:43:49 crc kubenswrapper[4733]: livez check failed Dec 06 05:43:49 crc kubenswrapper[4733]: I1206 05:43:49.449840 4733 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 05:43:51 crc kubenswrapper[4733]: I1206 05:43:51.531575 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Dec 06 05:43:51 crc kubenswrapper[4733]: I1206 05:43:51.532096 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 05:43:51 crc kubenswrapper[4733]: I1206 05:43:51.533202 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:43:51 crc kubenswrapper[4733]: I1206 05:43:51.533235 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:43:51 crc kubenswrapper[4733]: I1206 05:43:51.533260 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:43:51 crc kubenswrapper[4733]: I1206 05:43:51.541534 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Dec 06 05:43:51 crc kubenswrapper[4733]: I1206 05:43:51.556409 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 05:43:51 crc 
kubenswrapper[4733]: I1206 05:43:51.557172 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:43:51 crc kubenswrapper[4733]: I1206 05:43:51.557217 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:43:51 crc kubenswrapper[4733]: I1206 05:43:51.557227 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:43:53 crc kubenswrapper[4733]: E1206 05:43:53.857897 4733 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="3.2s" Dec 06 05:43:53 crc kubenswrapper[4733]: I1206 05:43:53.862130 4733 trace.go:236] Trace[1863739339]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (06-Dec-2025 05:43:39.853) (total time: 14008ms): Dec 06 05:43:53 crc kubenswrapper[4733]: Trace[1863739339]: ---"Objects listed" error: 14008ms (05:43:53.862) Dec 06 05:43:53 crc kubenswrapper[4733]: Trace[1863739339]: [14.008484019s] [14.008484019s] END Dec 06 05:43:53 crc kubenswrapper[4733]: I1206 05:43:53.862176 4733 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 06 05:43:53 crc kubenswrapper[4733]: I1206 05:43:53.862947 4733 trace.go:236] Trace[1056310357]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (06-Dec-2025 05:43:39.793) (total time: 14068ms): Dec 06 05:43:53 crc kubenswrapper[4733]: Trace[1056310357]: ---"Objects listed" error: 14068ms (05:43:53.862) Dec 06 05:43:53 crc kubenswrapper[4733]: Trace[1056310357]: [14.068906835s] [14.068906835s] END Dec 06 05:43:53 crc kubenswrapper[4733]: I1206 05:43:53.862983 4733 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 06 
05:43:53 crc kubenswrapper[4733]: I1206 05:43:53.863227 4733 trace.go:236] Trace[2123786242]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (06-Dec-2025 05:43:39.857) (total time: 14006ms): Dec 06 05:43:53 crc kubenswrapper[4733]: Trace[2123786242]: ---"Objects listed" error: 14006ms (05:43:53.863) Dec 06 05:43:53 crc kubenswrapper[4733]: Trace[2123786242]: [14.006059622s] [14.006059622s] END Dec 06 05:43:53 crc kubenswrapper[4733]: I1206 05:43:53.863258 4733 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 06 05:43:53 crc kubenswrapper[4733]: I1206 05:43:53.864438 4733 trace.go:236] Trace[1963315954]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (06-Dec-2025 05:43:40.337) (total time: 13526ms): Dec 06 05:43:53 crc kubenswrapper[4733]: Trace[1963315954]: ---"Objects listed" error: 13526ms (05:43:53.864) Dec 06 05:43:53 crc kubenswrapper[4733]: Trace[1963315954]: [13.526732555s] [13.526732555s] END Dec 06 05:43:53 crc kubenswrapper[4733]: I1206 05:43:53.864470 4733 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 06 05:43:53 crc kubenswrapper[4733]: I1206 05:43:53.865145 4733 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Dec 06 05:43:53 crc kubenswrapper[4733]: E1206 05:43:53.865281 4733 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Dec 06 05:43:53 crc kubenswrapper[4733]: I1206 05:43:53.894115 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 05:43:53 crc kubenswrapper[4733]: I1206 05:43:53.897551 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 05:43:54 crc 
kubenswrapper[4733]: I1206 05:43:54.444342 4733 apiserver.go:52] "Watching apiserver" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.447377 4733 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.447643 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.447991 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.448070 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.448107 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.448281 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:43:54 crc kubenswrapper[4733]: E1206 05:43:54.448356 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.448506 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.448611 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.448624 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 06 05:43:54 crc kubenswrapper[4733]: E1206 05:43:54.448686 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.448900 4733 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.448939 4733 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 06 05:43:54 crc kubenswrapper[4733]: E1206 05:43:54.449003 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.459395 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.459454 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.459474 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.459630 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.459647 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.459705 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.459848 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.459873 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.460275 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.462824 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 05:43:54 
crc kubenswrapper[4733]: I1206 05:43:54.496283 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.507109 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.518883 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c145932d-56db-49da-ab40-1f9faeaa004e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a89503b511d9f2da9fb5e41e1adb5f5c60e14909aebd4495baafc709177fa56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd2bcad3ce23a8998a578ecc373a4e8028eefab1e056cf1081eb2406ff9398f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382d71a067b68d67891c063f0a4c833b7433e15db0e05b36e46f24bbbb1626ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b838411bb65919138a421cd17775561b7764a006894daa8f2bed711287c1914\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.525885 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-pqsfd"] Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.526210 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-pqsfd" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.527055 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-cnxdh"] Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.527717 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-cnxdh" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.528375 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.528410 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.528511 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.529389 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.529558 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.529851 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.529888 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.530397 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.534052 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.539957 4733 
desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.540540 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.548852 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.555239 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.562376 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.564016 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.565697 4733 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="fe65f4b55b8e8ed93d424276f1fc06f31770302538e5122a5b09da36734d86dc" exitCode=255 Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.565795 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"fe65f4b55b8e8ed93d424276f1fc06f31770302538e5122a5b09da36734d86dc"} Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.569190 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.569232 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.569257 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.569276 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.569293 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.569329 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.569349 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.569367 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.569387 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.569403 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.569418 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.569442 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.569485 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.569503 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.569518 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.569535 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.569543 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.569553 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.569607 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.569621 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.569705 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.569733 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.569748 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.569754 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.569769 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.569789 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.569807 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.569826 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.569879 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.569896 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.569909 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.569917 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.569947 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.569966 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.569985 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.570002 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.570034 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.570051 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.570070 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.570086 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.570101 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.570118 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.570134 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.570150 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.570170 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.570186 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.570203 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.570235 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.570254 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.570270 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.570282 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.570293 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.570320 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.570338 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.570359 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.570379 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.570396 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.570413 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.570431 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.570468 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.570486 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.570501 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.570517 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.570536 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.570554 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.570572 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.570590 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.570615 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.570632 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.570648 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.570666 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.570682 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.570699 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.570715 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.570733 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.570750 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.570767 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.570787 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.570808 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.570824 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.570840 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.570857 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.570876 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.570893 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.570914 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.570931 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.570948 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.570963 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.570980 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.571005 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.571020 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.571098 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.571119 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.571135 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.571154 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.571170 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.571186 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.571201 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.571218 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.571233 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.571250 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.571267 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.571283 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.571315 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.571333 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.571351 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.571369 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.571388 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.571405 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.571437 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.571456 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 06 05:43:54 crc 
kubenswrapper[4733]: I1206 05:43:54.571472 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.571489 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.571507 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.571525 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.571544 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.571561 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod 
\"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.571576 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.571595 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.571622 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.571638 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.571654 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.571673 4733 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.571690 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.571706 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.571724 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.571743 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.571760 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod 
\"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.571777 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.571793 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.571810 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.571828 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.571844 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.571860 4733 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.571880 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.571897 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.571914 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.571931 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.571949 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.571968 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.571986 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.572002 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.572022 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.572041 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.572060 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.572078 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.572096 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.572112 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.572127 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.575622 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.570517 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.570689 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.570715 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.580177 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.570741 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.570898 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.571069 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.571288 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.571287 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.580319 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.571293 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.571472 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.571692 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.571704 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.571726 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.571744 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.571746 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.571944 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.572000 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.572024 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.572069 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: E1206 05:43:54.572189 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 05:43:55.072129815 +0000 UTC m=+18.937340916 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.572246 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.572332 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.572485 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.572611 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.572615 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.572629 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.572837 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.572846 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.572893 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.572985 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.573016 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.573163 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.573380 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.573415 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.573470 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.573564 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.574756 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.575015 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.575253 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.575351 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.575531 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.575852 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.576035 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.576506 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.576763 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.576768 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.576935 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.577034 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.577060 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.577098 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.577426 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.577493 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.577669 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.577729 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.577888 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.577975 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.578121 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.578258 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.579535 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.579729 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.579947 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.580005 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.580078 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.580094 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.580598 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.580678 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.580711 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.580716 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.580814 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.580883 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.580906 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.580926 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.580945 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.580961 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.580982 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.580982 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.580997 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.581021 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.580095 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.581002 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.581001 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.581149 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.581222 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.581242 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.581265 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.581283 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.581315 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.581333 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.581337 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.581357 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.581368 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.581377 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.581434 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.581456 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.581476 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.581494 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.581514 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.581532 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.581549 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.581566 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.581582 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.581597 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.581625 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.581645 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.581661 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.581679 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.581689 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.581694 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.581782 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.581787 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.581818 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.582044 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.582196 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.582286 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.582245 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.582242 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.582293 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.582289 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.582417 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.582443 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.582478 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.582496 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.582512 4733 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.582533 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.584016 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.584048 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.584106 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.584133 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod 
\"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.584191 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.584214 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.584254 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.584273 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.584294 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.584341 4733 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.584360 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.584380 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.584399 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.584421 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.584438 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: 
\"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.584484 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.584512 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gb5ds\" (UniqueName: \"kubernetes.io/projected/25abcf60-fe34-446b-9df8-1ed8e5102975-kube-api-access-gb5ds\") pod \"node-ca-pqsfd\" (UID: \"25abcf60-fe34-446b-9df8-1ed8e5102975\") " pod="openshift-image-registry/node-ca-pqsfd" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.584535 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/25abcf60-fe34-446b-9df8-1ed8e5102975-host\") pod \"node-ca-pqsfd\" (UID: \"25abcf60-fe34-446b-9df8-1ed8e5102975\") " pod="openshift-image-registry/node-ca-pqsfd" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.584556 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.584576 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: 
\"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.584613 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrbr6\" (UniqueName: \"kubernetes.io/projected/3d5c4ca7-33ee-4858-948f-631753eb056e-kube-api-access-nrbr6\") pod \"node-resolver-cnxdh\" (UID: \"3d5c4ca7-33ee-4858-948f-631753eb056e\") " pod="openshift-dns/node-resolver-cnxdh" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.584633 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.584652 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.584671 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.584693 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.584710 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.584729 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.584750 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.584772 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.584791 4733 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.584808 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.584824 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3d5c4ca7-33ee-4858-948f-631753eb056e-hosts-file\") pod \"node-resolver-cnxdh\" (UID: \"3d5c4ca7-33ee-4858-948f-631753eb056e\") " pod="openshift-dns/node-resolver-cnxdh" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.584839 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/25abcf60-fe34-446b-9df8-1ed8e5102975-serviceca\") pod \"node-ca-pqsfd\" (UID: \"25abcf60-fe34-446b-9df8-1ed8e5102975\") " pod="openshift-image-registry/node-ca-pqsfd" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.584881 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 05:43:54 crc 
kubenswrapper[4733]: I1206 05:43:54.585002 4733 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.585025 4733 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.585035 4733 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.585047 4733 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.585063 4733 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.585073 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.585084 4733 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.585094 4733 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.582646 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.583034 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.583059 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.583213 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.580320 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.583268 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.583340 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.583390 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.583440 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.583481 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.583499 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.585242 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.585360 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.585638 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.585846 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.585860 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.585936 4733 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: E1206 05:43:54.585990 4733 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.585998 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.586067 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.583906 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.583942 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.584067 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.584335 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.584463 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.586340 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.584490 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.584545 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.584559 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.584562 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.586375 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.584612 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.584695 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.584903 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.584988 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.585016 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.585043 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.585140 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: E1206 05:43:54.585143 4733 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.585144 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.586240 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: E1206 05:43:54.586489 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-12-06 05:43:55.086466032 +0000 UTC m=+18.951677143 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.586439 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.586869 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.586883 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.586894 4733 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.586907 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.586917 4733 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.587002 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 06 05:43:54 crc kubenswrapper[4733]: E1206 05:43:54.587131 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 05:43:55.087121659 +0000 UTC m=+18.952332770 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.586926 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.587191 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.587232 4733 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.587245 4733 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.587272 4733 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.587282 4733 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.587293 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.587426 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.587445 4733 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.587455 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Dec 
06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.587551 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0700e329-54b6-4cfe-b2de-5cee58cf1aa5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32c4d87738481c8df3d76e820a98f3dacfbc11edc26fab1dfe51b56d207168d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57cbb938bc4ae9b8a71a1e2369a50a243964fc8c683d2d1840f1f3e199f1b923\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7eeebbb46cf11d2306ad457106c3b2179039986bfdd412c4bb64791d86edb4e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe65f4b55b8e8ed93d424276f1fc06f31770302538e5122a5b09da36734d86dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe65f4b55b8e8ed93d424276f1fc06f31770302538e5122a5b09da36734d86dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 05:43:48.722254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 05:43:48.730728 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3849141372/tls.crt::/tmp/serving-cert-3849141372/tls.key\\\\\\\"\\\\nI1206 05:43:54.083506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 05:43:54.085960 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 05:43:54.085979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 05:43:54.086001 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 05:43:54.086006 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 05:43:54.089093 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 05:43:54.089162 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089190 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089211 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 05:43:54.089229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 05:43:54.089245 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 05:43:54.089261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 05:43:54.089103 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 05:43:54.090706 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8edc1fd8220a58b6a3f6d08d6d003c6d350fa69588866d84de63f95ecd4367f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.588204 4733 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.588127 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.588222 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.588353 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.588368 4733 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.588400 4733 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.588412 4733 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.588424 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.588435 4733 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.588446 4733 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.588456 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.588465 4733 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.588474 4733 reconciler_common.go:293] "Volume detached for volume 
\"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.588485 4733 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.588495 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.588504 4733 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.588514 4733 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.588523 4733 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.588533 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.588542 4733 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.588552 4733 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.588561 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.588571 4733 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.588581 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.588597 4733 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.588612 4733 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.588622 4733 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 
crc kubenswrapper[4733]: I1206 05:43:54.588630 4733 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.588640 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.588650 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.588660 4733 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.588668 4733 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.588677 4733 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.588686 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.588697 4733 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" 
(UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.588706 4733 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.588716 4733 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.588724 4733 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.588733 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.588742 4733 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.588751 4733 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.588759 4733 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on 
node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.588770 4733 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.588780 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.588789 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.588797 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.588807 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.588815 4733 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.588824 4733 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 
05:43:54.588847 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.588856 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.588865 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.588907 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.588916 4733 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.588925 4733 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.588933 4733 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.588942 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: 
\"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.588973 4733 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.588983 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.588991 4733 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.589000 4733 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.589150 4733 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.589194 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.589008 4733 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.589220 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.589233 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.589267 4733 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.589869 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.590326 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.590498 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.590548 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.590626 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.590814 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.590732 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.590871 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.590965 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.590985 4733 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.591004 4733 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.591019 4733 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.591019 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.591033 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.591046 4733 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.591129 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.591338 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.591466 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.591796 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.592960 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.595783 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.596092 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.597242 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.597197 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.597368 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: E1206 05:43:54.597464 4733 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 05:43:54 crc kubenswrapper[4733]: E1206 05:43:54.597714 4733 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 05:43:54 crc kubenswrapper[4733]: E1206 05:43:54.597729 4733 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 05:43:54 crc kubenswrapper[4733]: E1206 05:43:54.597780 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-06 05:43:55.097763862 +0000 UTC m=+18.962974973 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.598095 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.598108 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: E1206 05:43:54.598240 4733 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.598317 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: E1206 05:43:54.598325 4733 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 05:43:54 crc kubenswrapper[4733]: E1206 05:43:54.598363 4733 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 05:43:54 crc kubenswrapper[4733]: E1206 05:43:54.598412 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-06 05:43:55.098395043 +0000 UTC m=+18.963606154 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.598913 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.599335 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.599728 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.599773 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.600123 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.600502 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.600777 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.600886 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.600681 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.601178 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.600404 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.602043 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.602119 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.603528 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.603691 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.603816 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.604226 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.604283 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.604426 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c145932d-56db-49da-ab40-1f9faeaa004e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a89503b511d9f2da9fb5e41e1adb5f5c60e14909aebd4495baafc709177fa56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-polic
y-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd2bcad3ce23a8998a578ecc373a4e8028eefab1e056cf1081eb2406ff9398f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382d71a067b68d67891c063f0a4c833b7433e15db0e05b36e46f24bbbb1626ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/s
tatic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b838411bb65919138a421cd17775561b7764a006894daa8f2bed711287c1914\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.604667 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.604990 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.605963 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.608557 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.608580 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.608887 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.608889 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.608982 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.609100 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.609780 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.609279 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.610127 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.610738 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.610872 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.610979 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.610937 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.611045 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.611133 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.611173 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.611195 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.611315 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.611347 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.612358 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.619809 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 05:43:54 crc kubenswrapper[4733]: E1206 05:43:54.620689 4733 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-apiserver-crc\" already exists" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.621177 4733 scope.go:117] "RemoveContainer" containerID="fe65f4b55b8e8ed93d424276f1fc06f31770302538e5122a5b09da36734d86dc" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.621676 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: E1206 05:43:54.622166 4733 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.634150 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.637105 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.645092 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.646232 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.648572 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.662168 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.672042 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.679347 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqsfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25abcf60-fe34-446b-9df8-1ed8e5102975\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb5ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqsfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.691382 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.691443 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.691463 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3d5c4ca7-33ee-4858-948f-631753eb056e-hosts-file\") pod \"node-resolver-cnxdh\" (UID: \"3d5c4ca7-33ee-4858-948f-631753eb056e\") " pod="openshift-dns/node-resolver-cnxdh" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.691481 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/25abcf60-fe34-446b-9df8-1ed8e5102975-serviceca\") pod \"node-ca-pqsfd\" (UID: \"25abcf60-fe34-446b-9df8-1ed8e5102975\") " pod="openshift-image-registry/node-ca-pqsfd" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.691508 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gb5ds\" (UniqueName: \"kubernetes.io/projected/25abcf60-fe34-446b-9df8-1ed8e5102975-kube-api-access-gb5ds\") pod \"node-ca-pqsfd\" (UID: \"25abcf60-fe34-446b-9df8-1ed8e5102975\") " pod="openshift-image-registry/node-ca-pqsfd" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.691527 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/25abcf60-fe34-446b-9df8-1ed8e5102975-host\") pod \"node-ca-pqsfd\" (UID: \"25abcf60-fe34-446b-9df8-1ed8e5102975\") " pod="openshift-image-registry/node-ca-pqsfd" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.691546 4733 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrbr6\" (UniqueName: \"kubernetes.io/projected/3d5c4ca7-33ee-4858-948f-631753eb056e-kube-api-access-nrbr6\") pod \"node-resolver-cnxdh\" (UID: \"3d5c4ca7-33ee-4858-948f-631753eb056e\") " pod="openshift-dns/node-resolver-cnxdh" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.691620 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3d5c4ca7-33ee-4858-948f-631753eb056e-hosts-file\") pod \"node-resolver-cnxdh\" (UID: \"3d5c4ca7-33ee-4858-948f-631753eb056e\") " pod="openshift-dns/node-resolver-cnxdh" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.691663 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.691680 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/25abcf60-fe34-446b-9df8-1ed8e5102975-host\") pod \"node-ca-pqsfd\" (UID: \"25abcf60-fe34-446b-9df8-1ed8e5102975\") " pod="openshift-image-registry/node-ca-pqsfd" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.692174 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnxdh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d5c4ca7-33ee-4858-948f-631753eb056e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrbr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnxdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.692376 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/25abcf60-fe34-446b-9df8-1ed8e5102975-serviceca\") pod \"node-ca-pqsfd\" (UID: \"25abcf60-fe34-446b-9df8-1ed8e5102975\") " pod="openshift-image-registry/node-ca-pqsfd" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.691571 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " 
pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.692556 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.692590 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.692607 4733 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.692616 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.692625 4733 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.692638 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.692648 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.692670 4733 
reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.692679 4733 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.692689 4733 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.692699 4733 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.692709 4733 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.692719 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.692728 4733 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.692737 4733 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.692746 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.692757 4733 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.692767 4733 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.692776 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.692786 4733 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.692794 4733 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.692803 4733 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.692811 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.692820 4733 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.692829 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.692838 4733 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.692847 4733 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.692867 4733 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.692876 4733 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" 
DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.692886 4733 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.692899 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.692907 4733 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.692916 4733 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.692926 4733 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.692934 4733 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.692943 4733 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: 
I1206 05:43:54.692952 4733 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.692962 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.692975 4733 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.692984 4733 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.692993 4733 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.693002 4733 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.693011 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.693020 4733 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.693031 4733 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.693038 4733 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.693047 4733 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.693056 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.693065 4733 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.693074 4733 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.693083 4733 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.693108 4733 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.693118 4733 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.693126 4733 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.693136 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.693145 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.693154 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.693162 4733 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.693172 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.693180 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.693189 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.693197 4733 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.693205 4733 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.693213 4733 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.693222 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.693230 4733 
reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.693238 4733 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.693246 4733 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.693255 4733 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.693263 4733 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.693274 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.693281 4733 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.693291 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: 
\"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.693315 4733 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.693326 4733 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.693336 4733 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.693346 4733 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.693355 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.693363 4733 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.693371 4733 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") 
on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.693380 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.693388 4733 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.693396 4733 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.693404 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.693412 4733 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.693420 4733 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.693428 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.693436 4733 reconciler_common.go:293] "Volume detached 
for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.693444 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.693451 4733 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.693460 4733 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.693468 4733 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.693475 4733 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.693483 4733 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.693491 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: 
\"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.693499 4733 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.693507 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.693514 4733 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.693523 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.693532 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.706756 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gb5ds\" (UniqueName: \"kubernetes.io/projected/25abcf60-fe34-446b-9df8-1ed8e5102975-kube-api-access-gb5ds\") pod \"node-ca-pqsfd\" (UID: \"25abcf60-fe34-446b-9df8-1ed8e5102975\") " pod="openshift-image-registry/node-ca-pqsfd" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.711517 4733 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-nrbr6\" (UniqueName: \"kubernetes.io/projected/3d5c4ca7-33ee-4858-948f-631753eb056e-kube-api-access-nrbr6\") pod \"node-resolver-cnxdh\" (UID: \"3d5c4ca7-33ee-4858-948f-631753eb056e\") " pod="openshift-dns/node-resolver-cnxdh" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.760707 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.765439 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.768698 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 06 05:43:54 crc kubenswrapper[4733]: W1206 05:43:54.772633 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-70b66ba768bd56157815a46e2288ca873403d60ab0c267cb37040df5e0f116b0 WatchSource:0}: Error finding container 70b66ba768bd56157815a46e2288ca873403d60ab0c267cb37040df5e0f116b0: Status 404 returned error can't find the container with id 70b66ba768bd56157815a46e2288ca873403d60ab0c267cb37040df5e0f116b0 Dec 06 05:43:54 crc kubenswrapper[4733]: W1206 05:43:54.780606 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-52d3012d085e48c982afbf4574d3d3dd42233a840179bf6a91abcc5561025386 WatchSource:0}: Error finding container 52d3012d085e48c982afbf4574d3d3dd42233a840179bf6a91abcc5561025386: Status 404 returned error can't find the container with id 52d3012d085e48c982afbf4574d3d3dd42233a840179bf6a91abcc5561025386 Dec 06 05:43:54 crc 
kubenswrapper[4733]: W1206 05:43:54.783985 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-edcbe19a5de3877e6f06f31d56a7101ca1a8aee52ff60cfc9a59b29c227cfce9 WatchSource:0}: Error finding container edcbe19a5de3877e6f06f31d56a7101ca1a8aee52ff60cfc9a59b29c227cfce9: Status 404 returned error can't find the container with id edcbe19a5de3877e6f06f31d56a7101ca1a8aee52ff60cfc9a59b29c227cfce9 Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.842848 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-pqsfd" Dec 06 05:43:54 crc kubenswrapper[4733]: I1206 05:43:54.842890 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-cnxdh" Dec 06 05:43:54 crc kubenswrapper[4733]: W1206 05:43:54.861537 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d5c4ca7_33ee_4858_948f_631753eb056e.slice/crio-d7e32fe579de4d8223499aff7f58167696c6e78a18f58887d9195944d95a64d8 WatchSource:0}: Error finding container d7e32fe579de4d8223499aff7f58167696c6e78a18f58887d9195944d95a64d8: Status 404 returned error can't find the container with id d7e32fe579de4d8223499aff7f58167696c6e78a18f58887d9195944d95a64d8 Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.096655 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.096737 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" 
(UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:43:55 crc kubenswrapper[4733]: E1206 05:43:55.096799 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 05:43:56.096767604 +0000 UTC m=+19.961978716 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.096859 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:43:55 crc kubenswrapper[4733]: E1206 05:43:55.096863 4733 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 05:43:55 crc kubenswrapper[4733]: E1206 05:43:55.096982 4733 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 05:43:56.096974983 +0000 UTC m=+19.962186094 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 05:43:55 crc kubenswrapper[4733]: E1206 05:43:55.096897 4733 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 05:43:55 crc kubenswrapper[4733]: E1206 05:43:55.097093 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 05:43:56.097075992 +0000 UTC m=+19.962287093 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.197961 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.198008 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 05:43:55 crc kubenswrapper[4733]: E1206 05:43:55.198140 4733 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 05:43:55 crc kubenswrapper[4733]: E1206 05:43:55.198163 4733 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 05:43:55 crc kubenswrapper[4733]: E1206 05:43:55.198177 4733 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 05:43:55 crc kubenswrapper[4733]: E1206 05:43:55.198205 4733 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 05:43:55 crc kubenswrapper[4733]: E1206 05:43:55.198240 4733 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 05:43:55 crc kubenswrapper[4733]: E1206 05:43:55.198255 4733 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 05:43:55 crc kubenswrapper[4733]: E1206 05:43:55.198224 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-06 05:43:56.198211358 +0000 UTC m=+20.063422468 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 05:43:55 crc kubenswrapper[4733]: E1206 05:43:55.198367 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2025-12-06 05:43:56.19834635 +0000 UTC m=+20.063557462 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.247096 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-684r5"] Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.247473 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-g7qjx"] Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.247643 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-5mf9m"] Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.247655 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-684r5" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.247778 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.248548 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5mf9m" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.250178 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.250930 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-2gb79"] Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.251411 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.251543 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.251565 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.251732 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.251936 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.252159 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.252246 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.252368 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.252391 4733 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.252550 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.252650 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.252759 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.254932 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.259605 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.259760 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.259815 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.259904 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.259980 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.260011 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.264161 4733 
status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0700e329-54b6-4cfe-b2de-5cee58cf1aa5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32c4d87738481c8df3d76e820a98f3dacfbc11edc26fab1dfe51b56d207168d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mo
untPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57cbb938bc4ae9b8a71a1e2369a50a243964fc8c683d2d1840f1f3e199f1b923\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eeebbb46cf11d2306ad457106c3b2179039986bfdd412c4bb64791d86edb4e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe65f4b55b8e8ed93d424276f1fc06f31770302538e5122a5b09da36734d86dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f89
45c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe65f4b55b8e8ed93d424276f1fc06f31770302538e5122a5b09da36734d86dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 05:43:48.722254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 05:43:48.730728 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3849141372/tls.crt::/tmp/serving-cert-3849141372/tls.key\\\\\\\"\\\\nI1206 05:43:54.083506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 05:43:54.085960 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 05:43:54.085979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 05:43:54.086001 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 05:43:54.086006 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 05:43:54.089093 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 05:43:54.089162 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089190 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 
05:43:54.089211 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 05:43:54.089229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 05:43:54.089245 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 05:43:54.089261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 05:43:54.089103 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 05:43:54.090706 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8edc1fd8220a58b6a3f6d08d6d003c6d350fa69588866d84de63f95ecd4367f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.283865 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:55Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.293113 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:55Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.302294 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-684r5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc59542d-ee4a-414d-b096-86716cb56db5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbfjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-684r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:55Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.311919 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\
\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g7qjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:55Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.320659 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c145932d-56db-49da-ab40-1f9faeaa004e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a89503b511d9f2da9fb5e41e1adb5f5c60e14909aebd4495baafc709177fa56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"
cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd2bcad3ce23a8998a578ecc373a4e8028eefab1e056cf1081eb2406ff9398f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382d71a067b68d67891c063f0a4c833b7433e15db0e05b36e46f24bbbb1626ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b838411bb65919138a421cd17775561b7764a006894daa8f2bed711287c1914\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:55Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.332342 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:55Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.344719 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnxdh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d5c4ca7-33ee-4858-948f-631753eb056e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrbr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnxdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:55Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.354918 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:55Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.364066 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:55Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.372710 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:55Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.382515 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqsfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25abcf60-fe34-446b-9df8-1ed8e5102975\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb5ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqsfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:55Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.392188 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:55Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.399470 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/cc59542d-ee4a-414d-b096-86716cb56db5-multus-daemon-config\") pod \"multus-684r5\" (UID: \"cc59542d-ee4a-414d-b096-86716cb56db5\") " pod="openshift-multus/multus-684r5" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.399509 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2gb79\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:43:55 crc 
kubenswrapper[4733]: I1206 05:43:55.399531 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/171aa174-9338-4421-8393-9e23fbab7f1e-ovnkube-config\") pod \"ovnkube-node-2gb79\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.399601 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2zzq\" (UniqueName: \"kubernetes.io/projected/94d7ccbf-e88d-4045-8d89-633470de7aca-kube-api-access-s2zzq\") pod \"multus-additional-cni-plugins-5mf9m\" (UID: \"94d7ccbf-e88d-4045-8d89-633470de7aca\") " pod="openshift-multus/multus-additional-cni-plugins-5mf9m" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.399640 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/cc59542d-ee4a-414d-b096-86716cb56db5-host-var-lib-cni-multus\") pod \"multus-684r5\" (UID: \"cc59542d-ee4a-414d-b096-86716cb56db5\") " pod="openshift-multus/multus-684r5" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.399727 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-run-openvswitch\") pod \"ovnkube-node-2gb79\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.399794 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cc59542d-ee4a-414d-b096-86716cb56db5-etc-kubernetes\") pod \"multus-684r5\" (UID: \"cc59542d-ee4a-414d-b096-86716cb56db5\") " 
pod="openshift-multus/multus-684r5" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.399826 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-host-cni-bin\") pod \"ovnkube-node-2gb79\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.399845 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/171aa174-9338-4421-8393-9e23fbab7f1e-env-overrides\") pod \"ovnkube-node-2gb79\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.399868 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cc59542d-ee4a-414d-b096-86716cb56db5-multus-conf-dir\") pod \"multus-684r5\" (UID: \"cc59542d-ee4a-414d-b096-86716cb56db5\") " pod="openshift-multus/multus-684r5" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.399900 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-run-ovn\") pod \"ovnkube-node-2gb79\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.399930 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cc59542d-ee4a-414d-b096-86716cb56db5-cnibin\") pod \"multus-684r5\" (UID: \"cc59542d-ee4a-414d-b096-86716cb56db5\") " pod="openshift-multus/multus-684r5" Dec 06 05:43:55 crc 
kubenswrapper[4733]: I1206 05:43:55.399958 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-host-kubelet\") pod \"ovnkube-node-2gb79\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.399989 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/94d7ccbf-e88d-4045-8d89-633470de7aca-os-release\") pod \"multus-additional-cni-plugins-5mf9m\" (UID: \"94d7ccbf-e88d-4045-8d89-633470de7aca\") " pod="openshift-multus/multus-additional-cni-plugins-5mf9m" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.400017 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/cc59542d-ee4a-414d-b096-86716cb56db5-host-run-k8s-cni-cncf-io\") pod \"multus-684r5\" (UID: \"cc59542d-ee4a-414d-b096-86716cb56db5\") " pod="openshift-multus/multus-684r5" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.400043 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cc59542d-ee4a-414d-b096-86716cb56db5-host-var-lib-cni-bin\") pod \"multus-684r5\" (UID: \"cc59542d-ee4a-414d-b096-86716cb56db5\") " pod="openshift-multus/multus-684r5" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.400082 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cc59542d-ee4a-414d-b096-86716cb56db5-host-var-lib-kubelet\") pod \"multus-684r5\" (UID: \"cc59542d-ee4a-414d-b096-86716cb56db5\") " pod="openshift-multus/multus-684r5" Dec 06 
05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.400104 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbfjs\" (UniqueName: \"kubernetes.io/projected/cc59542d-ee4a-414d-b096-86716cb56db5-kube-api-access-mbfjs\") pod \"multus-684r5\" (UID: \"cc59542d-ee4a-414d-b096-86716cb56db5\") " pod="openshift-multus/multus-684r5" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.400142 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/94d7ccbf-e88d-4045-8d89-633470de7aca-system-cni-dir\") pod \"multus-additional-cni-plugins-5mf9m\" (UID: \"94d7ccbf-e88d-4045-8d89-633470de7aca\") " pod="openshift-multus/multus-additional-cni-plugins-5mf9m" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.400160 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/94d7ccbf-e88d-4045-8d89-633470de7aca-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5mf9m\" (UID: \"94d7ccbf-e88d-4045-8d89-633470de7aca\") " pod="openshift-multus/multus-additional-cni-plugins-5mf9m" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.400178 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9c5fg\" (UniqueName: \"kubernetes.io/projected/171aa174-9338-4421-8393-9e23fbab7f1e-kube-api-access-9c5fg\") pod \"ovnkube-node-2gb79\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.400236 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/94d7ccbf-e88d-4045-8d89-633470de7aca-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5mf9m\" 
(UID: \"94d7ccbf-e88d-4045-8d89-633470de7aca\") " pod="openshift-multus/multus-additional-cni-plugins-5mf9m" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.400253 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b9ab6d12-6a30-4bf0-a5a1-5a661b82f448-proxy-tls\") pod \"machine-config-daemon-g7qjx\" (UID: \"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448\") " pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.400326 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-systemd-units\") pod \"ovnkube-node-2gb79\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.400343 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-host-slash\") pod \"ovnkube-node-2gb79\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.400361 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-var-lib-openvswitch\") pod \"ovnkube-node-2gb79\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.400378 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/cc59542d-ee4a-414d-b096-86716cb56db5-host-run-multus-certs\") pod \"multus-684r5\" (UID: \"cc59542d-ee4a-414d-b096-86716cb56db5\") " pod="openshift-multus/multus-684r5" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.400403 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/171aa174-9338-4421-8393-9e23fbab7f1e-ovnkube-script-lib\") pod \"ovnkube-node-2gb79\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.400420 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b9ab6d12-6a30-4bf0-a5a1-5a661b82f448-mcd-auth-proxy-config\") pod \"machine-config-daemon-g7qjx\" (UID: \"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448\") " pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.400437 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-run-systemd\") pod \"ovnkube-node-2gb79\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.400451 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-host-cni-netd\") pod \"ovnkube-node-2gb79\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.400470 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cc59542d-ee4a-414d-b096-86716cb56db5-os-release\") pod \"multus-684r5\" (UID: \"cc59542d-ee4a-414d-b096-86716cb56db5\") " pod="openshift-multus/multus-684r5" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.400487 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-host-run-ovn-kubernetes\") pod \"ovnkube-node-2gb79\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.400505 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cc59542d-ee4a-414d-b096-86716cb56db5-host-run-netns\") pod \"multus-684r5\" (UID: \"cc59542d-ee4a-414d-b096-86716cb56db5\") " pod="openshift-multus/multus-684r5" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.400520 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/b9ab6d12-6a30-4bf0-a5a1-5a661b82f448-rootfs\") pod \"machine-config-daemon-g7qjx\" (UID: \"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448\") " pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.400543 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-etc-openvswitch\") pod \"ovnkube-node-2gb79\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.400562 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cc59542d-ee4a-414d-b096-86716cb56db5-multus-cni-dir\") pod \"multus-684r5\" (UID: \"cc59542d-ee4a-414d-b096-86716cb56db5\") " pod="openshift-multus/multus-684r5" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.400615 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-host-run-netns\") pod \"ovnkube-node-2gb79\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.400657 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-node-log\") pod \"ovnkube-node-2gb79\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.400672 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-log-socket\") pod \"ovnkube-node-2gb79\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.400687 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/cc59542d-ee4a-414d-b096-86716cb56db5-hostroot\") pod \"multus-684r5\" (UID: \"cc59542d-ee4a-414d-b096-86716cb56db5\") " pod="openshift-multus/multus-684r5" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.400760 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/94d7ccbf-e88d-4045-8d89-633470de7aca-cni-binary-copy\") pod \"multus-additional-cni-plugins-5mf9m\" (UID: \"94d7ccbf-e88d-4045-8d89-633470de7aca\") " pod="openshift-multus/multus-additional-cni-plugins-5mf9m" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.400787 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/cc59542d-ee4a-414d-b096-86716cb56db5-multus-socket-dir-parent\") pod \"multus-684r5\" (UID: \"cc59542d-ee4a-414d-b096-86716cb56db5\") " pod="openshift-multus/multus-684r5" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.400849 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hq86l\" (UniqueName: \"kubernetes.io/projected/b9ab6d12-6a30-4bf0-a5a1-5a661b82f448-kube-api-access-hq86l\") pod \"machine-config-daemon-g7qjx\" (UID: \"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448\") " pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.400915 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/171aa174-9338-4421-8393-9e23fbab7f1e-ovn-node-metrics-cert\") pod \"ovnkube-node-2gb79\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.400939 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/94d7ccbf-e88d-4045-8d89-633470de7aca-cnibin\") pod \"multus-additional-cni-plugins-5mf9m\" (UID: \"94d7ccbf-e88d-4045-8d89-633470de7aca\") " pod="openshift-multus/multus-additional-cni-plugins-5mf9m" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.400963 4733 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cc59542d-ee4a-414d-b096-86716cb56db5-system-cni-dir\") pod \"multus-684r5\" (UID: \"cc59542d-ee4a-414d-b096-86716cb56db5\") " pod="openshift-multus/multus-684r5" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.400979 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cc59542d-ee4a-414d-b096-86716cb56db5-cni-binary-copy\") pod \"multus-684r5\" (UID: \"cc59542d-ee4a-414d-b096-86716cb56db5\") " pod="openshift-multus/multus-684r5" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.414435 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:55Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.430407 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:55Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.452723 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqsfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25abcf60-fe34-446b-9df8-1ed8e5102975\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb5ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqsfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:55Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.468156 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnxdh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d5c4ca7-33ee-4858-948f-631753eb056e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrbr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnxdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:55Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.498999 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0700e329-54b6-4cfe-b2de-5cee58cf1aa5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32c4d87738481c8df3d76e820a98f3dacfbc11edc26fab1dfe51b56d207168d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57cbb938bc4ae9b8a71a1e2369a50a243964fc8c683d2d1840f1f3e199f1b923\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eeebbb46cf11d2306ad457106c3b2179039986bfdd412c4bb64791d86edb4e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe65f4b55b8e8ed93d424276f1fc06f31770302538e5122a5b09da36734d86dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe65f4b55b8e8ed93d424276f1fc06f31770302538e5122a5b09da36734d86dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 05:43:48.722254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 05:43:48.730728 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3849141372/tls.crt::/tmp/serving-cert-3849141372/tls.key\\\\\\\"\\\\nI1206 05:43:54.083506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 05:43:54.085960 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 05:43:54.085979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 05:43:54.086001 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 05:43:54.086006 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 05:43:54.089093 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 05:43:54.089162 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089190 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089211 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 05:43:54.089229 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 05:43:54.089245 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 05:43:54.089261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 05:43:54.089103 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 05:43:54.090706 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8edc1fd8220a58b6a3f6d08d6d003c6d350fa69588866d84de63f95ecd4367f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:55Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.501424 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbfjs\" (UniqueName: \"kubernetes.io/projected/cc59542d-ee4a-414d-b096-86716cb56db5-kube-api-access-mbfjs\") pod \"multus-684r5\" (UID: \"cc59542d-ee4a-414d-b096-86716cb56db5\") " pod="openshift-multus/multus-684r5" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.501472 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cc59542d-ee4a-414d-b096-86716cb56db5-host-var-lib-kubelet\") pod \"multus-684r5\" (UID: \"cc59542d-ee4a-414d-b096-86716cb56db5\") " pod="openshift-multus/multus-684r5" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.501494 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/94d7ccbf-e88d-4045-8d89-633470de7aca-system-cni-dir\") pod \"multus-additional-cni-plugins-5mf9m\" (UID: \"94d7ccbf-e88d-4045-8d89-633470de7aca\") " pod="openshift-multus/multus-additional-cni-plugins-5mf9m" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.501512 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/94d7ccbf-e88d-4045-8d89-633470de7aca-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5mf9m\" (UID: \"94d7ccbf-e88d-4045-8d89-633470de7aca\") " pod="openshift-multus/multus-additional-cni-plugins-5mf9m" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.501533 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9c5fg\" (UniqueName: \"kubernetes.io/projected/171aa174-9338-4421-8393-9e23fbab7f1e-kube-api-access-9c5fg\") pod \"ovnkube-node-2gb79\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.501555 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/94d7ccbf-e88d-4045-8d89-633470de7aca-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5mf9m\" (UID: \"94d7ccbf-e88d-4045-8d89-633470de7aca\") " pod="openshift-multus/multus-additional-cni-plugins-5mf9m" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.501567 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cc59542d-ee4a-414d-b096-86716cb56db5-host-var-lib-kubelet\") pod \"multus-684r5\" (UID: \"cc59542d-ee4a-414d-b096-86716cb56db5\") " pod="openshift-multus/multus-684r5" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.501573 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" 
(UniqueName: \"kubernetes.io/secret/b9ab6d12-6a30-4bf0-a5a1-5a661b82f448-proxy-tls\") pod \"machine-config-daemon-g7qjx\" (UID: \"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448\") " pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.501637 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/94d7ccbf-e88d-4045-8d89-633470de7aca-system-cni-dir\") pod \"multus-additional-cni-plugins-5mf9m\" (UID: \"94d7ccbf-e88d-4045-8d89-633470de7aca\") " pod="openshift-multus/multus-additional-cni-plugins-5mf9m" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.501669 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-systemd-units\") pod \"ovnkube-node-2gb79\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.501650 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-systemd-units\") pod \"ovnkube-node-2gb79\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.501725 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-host-slash\") pod \"ovnkube-node-2gb79\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.501759 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-var-lib-openvswitch\") pod \"ovnkube-node-2gb79\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.501778 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/cc59542d-ee4a-414d-b096-86716cb56db5-host-run-multus-certs\") pod \"multus-684r5\" (UID: \"cc59542d-ee4a-414d-b096-86716cb56db5\") " pod="openshift-multus/multus-684r5" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.501796 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/171aa174-9338-4421-8393-9e23fbab7f1e-ovnkube-script-lib\") pod \"ovnkube-node-2gb79\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.501814 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b9ab6d12-6a30-4bf0-a5a1-5a661b82f448-mcd-auth-proxy-config\") pod \"machine-config-daemon-g7qjx\" (UID: \"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448\") " pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.501828 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-var-lib-openvswitch\") pod \"ovnkube-node-2gb79\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.501836 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-run-systemd\") pod \"ovnkube-node-2gb79\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.501854 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-host-cni-netd\") pod \"ovnkube-node-2gb79\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.501859 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-host-slash\") pod \"ovnkube-node-2gb79\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.501870 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cc59542d-ee4a-414d-b096-86716cb56db5-os-release\") pod \"multus-684r5\" (UID: \"cc59542d-ee4a-414d-b096-86716cb56db5\") " pod="openshift-multus/multus-684r5" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.501894 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-host-run-ovn-kubernetes\") pod \"ovnkube-node-2gb79\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.501918 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cc59542d-ee4a-414d-b096-86716cb56db5-host-run-netns\") pod \"multus-684r5\" (UID: 
\"cc59542d-ee4a-414d-b096-86716cb56db5\") " pod="openshift-multus/multus-684r5" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.501936 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/b9ab6d12-6a30-4bf0-a5a1-5a661b82f448-rootfs\") pod \"machine-config-daemon-g7qjx\" (UID: \"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448\") " pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.501977 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-etc-openvswitch\") pod \"ovnkube-node-2gb79\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.502001 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-host-run-netns\") pod \"ovnkube-node-2gb79\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.502017 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-node-log\") pod \"ovnkube-node-2gb79\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.502033 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-log-socket\") pod \"ovnkube-node-2gb79\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 
05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.502049 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cc59542d-ee4a-414d-b096-86716cb56db5-multus-cni-dir\") pod \"multus-684r5\" (UID: \"cc59542d-ee4a-414d-b096-86716cb56db5\") " pod="openshift-multus/multus-684r5" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.502068 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/cc59542d-ee4a-414d-b096-86716cb56db5-hostroot\") pod \"multus-684r5\" (UID: \"cc59542d-ee4a-414d-b096-86716cb56db5\") " pod="openshift-multus/multus-684r5" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.502087 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/94d7ccbf-e88d-4045-8d89-633470de7aca-cni-binary-copy\") pod \"multus-additional-cni-plugins-5mf9m\" (UID: \"94d7ccbf-e88d-4045-8d89-633470de7aca\") " pod="openshift-multus/multus-additional-cni-plugins-5mf9m" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.502105 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/cc59542d-ee4a-414d-b096-86716cb56db5-multus-socket-dir-parent\") pod \"multus-684r5\" (UID: \"cc59542d-ee4a-414d-b096-86716cb56db5\") " pod="openshift-multus/multus-684r5" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.502120 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hq86l\" (UniqueName: \"kubernetes.io/projected/b9ab6d12-6a30-4bf0-a5a1-5a661b82f448-kube-api-access-hq86l\") pod \"machine-config-daemon-g7qjx\" (UID: \"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448\") " pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.502138 4733 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/171aa174-9338-4421-8393-9e23fbab7f1e-ovn-node-metrics-cert\") pod \"ovnkube-node-2gb79\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.502155 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/94d7ccbf-e88d-4045-8d89-633470de7aca-cnibin\") pod \"multus-additional-cni-plugins-5mf9m\" (UID: \"94d7ccbf-e88d-4045-8d89-633470de7aca\") " pod="openshift-multus/multus-additional-cni-plugins-5mf9m" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.502172 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cc59542d-ee4a-414d-b096-86716cb56db5-system-cni-dir\") pod \"multus-684r5\" (UID: \"cc59542d-ee4a-414d-b096-86716cb56db5\") " pod="openshift-multus/multus-684r5" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.502189 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cc59542d-ee4a-414d-b096-86716cb56db5-cni-binary-copy\") pod \"multus-684r5\" (UID: \"cc59542d-ee4a-414d-b096-86716cb56db5\") " pod="openshift-multus/multus-684r5" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.502220 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2gb79\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.502243 4733 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/171aa174-9338-4421-8393-9e23fbab7f1e-ovnkube-config\") pod \"ovnkube-node-2gb79\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.502264 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2zzq\" (UniqueName: \"kubernetes.io/projected/94d7ccbf-e88d-4045-8d89-633470de7aca-kube-api-access-s2zzq\") pod \"multus-additional-cni-plugins-5mf9m\" (UID: \"94d7ccbf-e88d-4045-8d89-633470de7aca\") " pod="openshift-multus/multus-additional-cni-plugins-5mf9m" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.502283 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/cc59542d-ee4a-414d-b096-86716cb56db5-host-var-lib-cni-multus\") pod \"multus-684r5\" (UID: \"cc59542d-ee4a-414d-b096-86716cb56db5\") " pod="openshift-multus/multus-684r5" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.502315 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/cc59542d-ee4a-414d-b096-86716cb56db5-multus-daemon-config\") pod \"multus-684r5\" (UID: \"cc59542d-ee4a-414d-b096-86716cb56db5\") " pod="openshift-multus/multus-684r5" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.502334 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/94d7ccbf-e88d-4045-8d89-633470de7aca-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5mf9m\" (UID: \"94d7ccbf-e88d-4045-8d89-633470de7aca\") " pod="openshift-multus/multus-additional-cni-plugins-5mf9m" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.502368 4733 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-run-systemd\") pod \"ovnkube-node-2gb79\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.502373 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-run-openvswitch\") pod \"ovnkube-node-2gb79\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.502338 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-run-openvswitch\") pod \"ovnkube-node-2gb79\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.502396 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-host-cni-netd\") pod \"ovnkube-node-2gb79\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.502409 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cc59542d-ee4a-414d-b096-86716cb56db5-etc-kubernetes\") pod \"multus-684r5\" (UID: \"cc59542d-ee4a-414d-b096-86716cb56db5\") " pod="openshift-multus/multus-684r5" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.502432 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-host-cni-bin\") pod \"ovnkube-node-2gb79\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.502448 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/171aa174-9338-4421-8393-9e23fbab7f1e-env-overrides\") pod \"ovnkube-node-2gb79\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.502464 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cc59542d-ee4a-414d-b096-86716cb56db5-multus-conf-dir\") pod \"multus-684r5\" (UID: \"cc59542d-ee4a-414d-b096-86716cb56db5\") " pod="openshift-multus/multus-684r5" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.502469 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/94d7ccbf-e88d-4045-8d89-633470de7aca-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5mf9m\" (UID: \"94d7ccbf-e88d-4045-8d89-633470de7aca\") " pod="openshift-multus/multus-additional-cni-plugins-5mf9m" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.502487 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-host-run-ovn-kubernetes\") pod \"ovnkube-node-2gb79\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.502501 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-run-ovn\") 
pod \"ovnkube-node-2gb79\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.502512 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cc59542d-ee4a-414d-b096-86716cb56db5-host-run-netns\") pod \"multus-684r5\" (UID: \"cc59542d-ee4a-414d-b096-86716cb56db5\") " pod="openshift-multus/multus-684r5" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.502523 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cc59542d-ee4a-414d-b096-86716cb56db5-etc-kubernetes\") pod \"multus-684r5\" (UID: \"cc59542d-ee4a-414d-b096-86716cb56db5\") " pod="openshift-multus/multus-684r5" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.502544 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/b9ab6d12-6a30-4bf0-a5a1-5a661b82f448-rootfs\") pod \"machine-config-daemon-g7qjx\" (UID: \"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448\") " pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.502549 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-etc-openvswitch\") pod \"ovnkube-node-2gb79\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.502558 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-host-run-netns\") pod \"ovnkube-node-2gb79\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:43:55 crc 
kubenswrapper[4733]: I1206 05:43:55.502483 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-run-ovn\") pod \"ovnkube-node-2gb79\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.502570 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b9ab6d12-6a30-4bf0-a5a1-5a661b82f448-mcd-auth-proxy-config\") pod \"machine-config-daemon-g7qjx\" (UID: \"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448\") " pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.502610 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cc59542d-ee4a-414d-b096-86716cb56db5-cnibin\") pod \"multus-684r5\" (UID: \"cc59542d-ee4a-414d-b096-86716cb56db5\") " pod="openshift-multus/multus-684r5" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.502630 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-node-log\") pod \"ovnkube-node-2gb79\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.502641 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/94d7ccbf-e88d-4045-8d89-633470de7aca-os-release\") pod \"multus-additional-cni-plugins-5mf9m\" (UID: \"94d7ccbf-e88d-4045-8d89-633470de7aca\") " pod="openshift-multus/multus-additional-cni-plugins-5mf9m" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.502659 4733 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-log-socket\") pod \"ovnkube-node-2gb79\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.502659 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/cc59542d-ee4a-414d-b096-86716cb56db5-host-run-k8s-cni-cncf-io\") pod \"multus-684r5\" (UID: \"cc59542d-ee4a-414d-b096-86716cb56db5\") " pod="openshift-multus/multus-684r5" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.502677 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/cc59542d-ee4a-414d-b096-86716cb56db5-host-run-k8s-cni-cncf-io\") pod \"multus-684r5\" (UID: \"cc59542d-ee4a-414d-b096-86716cb56db5\") " pod="openshift-multus/multus-684r5" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.502700 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cc59542d-ee4a-414d-b096-86716cb56db5-host-var-lib-cni-bin\") pod \"multus-684r5\" (UID: \"cc59542d-ee4a-414d-b096-86716cb56db5\") " pod="openshift-multus/multus-684r5" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.502722 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-host-kubelet\") pod \"ovnkube-node-2gb79\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.502738 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/cc59542d-ee4a-414d-b096-86716cb56db5-system-cni-dir\") pod \"multus-684r5\" (UID: \"cc59542d-ee4a-414d-b096-86716cb56db5\") " pod="openshift-multus/multus-684r5" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.502779 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2gb79\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.503111 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/171aa174-9338-4421-8393-9e23fbab7f1e-env-overrides\") pod \"ovnkube-node-2gb79\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.503155 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cc59542d-ee4a-414d-b096-86716cb56db5-multus-conf-dir\") pod \"multus-684r5\" (UID: \"cc59542d-ee4a-414d-b096-86716cb56db5\") " pod="openshift-multus/multus-684r5" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.502461 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cc59542d-ee4a-414d-b096-86716cb56db5-os-release\") pod \"multus-684r5\" (UID: \"cc59542d-ee4a-414d-b096-86716cb56db5\") " pod="openshift-multus/multus-684r5" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.503295 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cc59542d-ee4a-414d-b096-86716cb56db5-cni-binary-copy\") pod \"multus-684r5\" (UID: 
\"cc59542d-ee4a-414d-b096-86716cb56db5\") " pod="openshift-multus/multus-684r5" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.503356 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cc59542d-ee4a-414d-b096-86716cb56db5-host-var-lib-cni-bin\") pod \"multus-684r5\" (UID: \"cc59542d-ee4a-414d-b096-86716cb56db5\") " pod="openshift-multus/multus-684r5" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.502705 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/94d7ccbf-e88d-4045-8d89-633470de7aca-cnibin\") pod \"multus-additional-cni-plugins-5mf9m\" (UID: \"94d7ccbf-e88d-4045-8d89-633470de7aca\") " pod="openshift-multus/multus-additional-cni-plugins-5mf9m" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.503394 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-host-kubelet\") pod \"ovnkube-node-2gb79\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.503441 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/171aa174-9338-4421-8393-9e23fbab7f1e-ovnkube-config\") pod \"ovnkube-node-2gb79\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.503473 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cc59542d-ee4a-414d-b096-86716cb56db5-multus-cni-dir\") pod \"multus-684r5\" (UID: \"cc59542d-ee4a-414d-b096-86716cb56db5\") " pod="openshift-multus/multus-684r5" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 
05:43:55.503491 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cc59542d-ee4a-414d-b096-86716cb56db5-cnibin\") pod \"multus-684r5\" (UID: \"cc59542d-ee4a-414d-b096-86716cb56db5\") " pod="openshift-multus/multus-684r5" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.503506 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/cc59542d-ee4a-414d-b096-86716cb56db5-hostroot\") pod \"multus-684r5\" (UID: \"cc59542d-ee4a-414d-b096-86716cb56db5\") " pod="openshift-multus/multus-684r5" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.502582 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/cc59542d-ee4a-414d-b096-86716cb56db5-host-run-multus-certs\") pod \"multus-684r5\" (UID: \"cc59542d-ee4a-414d-b096-86716cb56db5\") " pod="openshift-multus/multus-684r5" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.503536 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/94d7ccbf-e88d-4045-8d89-633470de7aca-os-release\") pod \"multus-additional-cni-plugins-5mf9m\" (UID: \"94d7ccbf-e88d-4045-8d89-633470de7aca\") " pod="openshift-multus/multus-additional-cni-plugins-5mf9m" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.503577 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/cc59542d-ee4a-414d-b096-86716cb56db5-multus-socket-dir-parent\") pod \"multus-684r5\" (UID: \"cc59542d-ee4a-414d-b096-86716cb56db5\") " pod="openshift-multus/multus-684r5" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.503807 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/cc59542d-ee4a-414d-b096-86716cb56db5-multus-daemon-config\") pod \"multus-684r5\" (UID: \"cc59542d-ee4a-414d-b096-86716cb56db5\") " pod="openshift-multus/multus-684r5" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.503853 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/cc59542d-ee4a-414d-b096-86716cb56db5-host-var-lib-cni-multus\") pod \"multus-684r5\" (UID: \"cc59542d-ee4a-414d-b096-86716cb56db5\") " pod="openshift-multus/multus-684r5" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.502574 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-host-cni-bin\") pod \"ovnkube-node-2gb79\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.503970 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/94d7ccbf-e88d-4045-8d89-633470de7aca-cni-binary-copy\") pod \"multus-additional-cni-plugins-5mf9m\" (UID: \"94d7ccbf-e88d-4045-8d89-633470de7aca\") " pod="openshift-multus/multus-additional-cni-plugins-5mf9m" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.504263 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/171aa174-9338-4421-8393-9e23fbab7f1e-ovnkube-script-lib\") pod \"ovnkube-node-2gb79\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.505400 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b9ab6d12-6a30-4bf0-a5a1-5a661b82f448-proxy-tls\") pod \"machine-config-daemon-g7qjx\" 
(UID: \"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448\") " pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.508639 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/171aa174-9338-4421-8393-9e23fbab7f1e-ovn-node-metrics-cert\") pod \"ovnkube-node-2gb79\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.523299 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbfjs\" (UniqueName: \"kubernetes.io/projected/cc59542d-ee4a-414d-b096-86716cb56db5-kube-api-access-mbfjs\") pod \"multus-684r5\" (UID: \"cc59542d-ee4a-414d-b096-86716cb56db5\") " pod="openshift-multus/multus-684r5" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.523708 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:55Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.524165 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2zzq\" (UniqueName: \"kubernetes.io/projected/94d7ccbf-e88d-4045-8d89-633470de7aca-kube-api-access-s2zzq\") pod \"multus-additional-cni-plugins-5mf9m\" (UID: \"94d7ccbf-e88d-4045-8d89-633470de7aca\") " pod="openshift-multus/multus-additional-cni-plugins-5mf9m" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.527156 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-9c5fg\" (UniqueName: \"kubernetes.io/projected/171aa174-9338-4421-8393-9e23fbab7f1e-kube-api-access-9c5fg\") pod \"ovnkube-node-2gb79\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.527674 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hq86l\" (UniqueName: \"kubernetes.io/projected/b9ab6d12-6a30-4bf0-a5a1-5a661b82f448-kube-api-access-hq86l\") pod \"machine-config-daemon-g7qjx\" (UID: \"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448\") " pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.534988 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:55Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.546805 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-684r5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc59542d-ee4a-414d-b096-86716cb56db5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbfjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-684r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:55Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.557883 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g7qjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:55Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.562785 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-684r5" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.570994 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-cnxdh" event={"ID":"3d5c4ca7-33ee-4858-948f-631753eb056e","Type":"ContainerStarted","Data":"d5f4a50e7cb4197e088c193a3bedc8acb2720a885e588e56051fbfa1e102099e"} Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.571034 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-cnxdh" event={"ID":"3d5c4ca7-33ee-4858-948f-631753eb056e","Type":"ContainerStarted","Data":"d7e32fe579de4d8223499aff7f58167696c6e78a18f58887d9195944d95a64d8"} Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.572849 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-pqsfd" event={"ID":"25abcf60-fe34-446b-9df8-1ed8e5102975","Type":"ContainerStarted","Data":"163c90ba7e6470fb31049cd650d1384d35d87b94a9193184bfe3ea16feddf307"} Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.572910 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-pqsfd" event={"ID":"25abcf60-fe34-446b-9df8-1ed8e5102975","Type":"ContainerStarted","Data":"24990e2bf294532879d840d91ea746b589d2d405af1b19af423782f5b1a8d55d"} Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.573651 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" Dec 06 05:43:55 crc kubenswrapper[4733]: W1206 05:43:55.574342 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc59542d_ee4a_414d_b096_86716cb56db5.slice/crio-ddcdd166ec2bfd3432ba8e3b722ec759e5b6d8a1b486d01b41de9fa4dc84fc89 WatchSource:0}: Error finding container ddcdd166ec2bfd3432ba8e3b722ec759e5b6d8a1b486d01b41de9fa4dc84fc89: Status 404 returned error can't find the container with id ddcdd166ec2bfd3432ba8e3b722ec759e5b6d8a1b486d01b41de9fa4dc84fc89 Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.576583 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"77d63bf154094eece4d04d42186bc7f957f0b1ab0315c496bb8a785269184ed0"} Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.576632 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"cdccf2a58baf2a39276908ed60c86219657d8780a50630c20be6f8bc4c256fbb"} Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.576645 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"52d3012d085e48c982afbf4574d3d3dd42233a840179bf6a91abcc5561025386"} Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.577111 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5mf9m" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.578642 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"b8e8d7d591deb47598776511be462724fabc5543e82b6a74edfc29fb01ccb977"} Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.578672 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"70b66ba768bd56157815a46e2288ca873403d60ab0c267cb37040df5e0f116b0"} Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.580104 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"171aa174-9338-4421-8393-9e23fbab7f1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2gb79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:55Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.581243 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.582387 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.585102 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"801ea1b9ed221d20f0d729436b8f5f1946df6e66f06aa86db5764f18da3f0b1f"} Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.585732 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.594534 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"edcbe19a5de3877e6f06f31d56a7101ca1a8aee52ff60cfc9a59b29c227cfce9"} Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.596422 4733 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c145932d-56db-49da-ab40-1f9faeaa004e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a89503b511d9f2da9fb5e41e1adb5f5c60e14909aebd4495baafc709177fa56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd2bcad3ce23a8998a578ecc373a4e8028eefab1e056cf1081eb2406ff9398f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d3472024
3b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382d71a067b68d67891c063f0a4c833b7433e15db0e05b36e46f24bbbb1626ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b838411bb65919138a421cd17775561b7764a006894daa8f2bed711287c1914\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:55Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.605658 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:55Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.618566 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5mf9m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d7ccbf-e88d-4045-8d89-633470de7aca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5mf9m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:55Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:55 crc kubenswrapper[4733]: W1206 05:43:55.618878 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod171aa174_9338_4421_8393_9e23fbab7f1e.slice/crio-46e99ae8bab74c84d954e6100aa560275149b69e8155a7fcfa37a9ca61d66241 WatchSource:0}: Error finding container 46e99ae8bab74c84d954e6100aa560275149b69e8155a7fcfa37a9ca61d66241: Status 404 returned error can't find the container with id 46e99ae8bab74c84d954e6100aa560275149b69e8155a7fcfa37a9ca61d66241 Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.628284 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g7qjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:55Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.646348 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"171aa174-9338-4421-8393-9e23fbab7f1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2gb79\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:55Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.656297 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:55Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.669088 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-684r5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc59542d-ee4a-414d-b096-86716cb56db5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbfjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-684r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:55Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.680997 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5mf9m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d7ccbf-e88d-4045-8d89-633470de7aca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5mf9m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:55Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.692281 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c145932d-56db-49da-ab40-1f9faeaa004e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a89503b511d9f2da9fb5e41e1adb5f5c60e14909aebd4495baafc709177fa56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd2bcad3ce23a8998a578ecc373a4e8028eefab1e056cf1081eb2406ff9398f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382d71a067b68d67891c063f0a4c833b7433e15db0e05b36e46f24bbbb1626ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b838411bb65919138a421cd17775561b7764a006894daa8f2bed711287c1914\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:55Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.711223 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:55Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.732528 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8e8d7d591deb47598776511be462724fabc5543e82b6a74edfc29fb01ccb977\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:55Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.745189 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77d63bf154094eece4d04d42186bc7f957f0b1ab0315c496bb8a785269184ed0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://cdccf2a58baf2a39276908ed60c86219657d8780a50630c20be6f8bc4c256fbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:55Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.755468 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:55Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.769008 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqsfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25abcf60-fe34-446b-9df8-1ed8e5102975\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://163c90ba7e6470fb31049cd650d1384d35d87b94a9193184bfe3ea16feddf307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb5ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqsfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:55Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.778894 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnxdh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d5c4ca7-33ee-4858-948f-631753eb056e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f4a50e7cb4197e088c193a3bedc8acb2720a885e588e56051fbfa1e102099e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrbr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnxdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:55Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.809455 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0700e329-54b6-4cfe-b2de-5cee58cf1aa5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32c4d87738481c8df3d76e820a98f3dacfbc11edc26fab1dfe51b56d207168d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57cbb938bc4ae9b8a71a1e2369a50a243964fc8c683d2d1840f1f3e199f1b923\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eeebbb46cf11d2306ad457106c3b2179039986bfdd412c4bb64791d86edb4e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://801ea1b9ed221d20f0d729436b8f5f1946df6e66f06aa86db5764f18da3f0b1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe65f4b55b8e8ed93d424276f1fc06f31770302538e5122a5b09da36734d86dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 05:43:48.722254 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 05:43:48.730728 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3849141372/tls.crt::/tmp/serving-cert-3849141372/tls.key\\\\\\\"\\\\nI1206 05:43:54.083506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 05:43:54.085960 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 05:43:54.085979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 05:43:54.086001 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 05:43:54.086006 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 05:43:54.089093 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 05:43:54.089162 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089190 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089211 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 05:43:54.089229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 05:43:54.089245 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 05:43:54.089261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 05:43:54.089103 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 05:43:54.090706 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8edc1fd8220a58b6a3f6d08d6d003c6d350fa69588866d84de63f95ecd4367f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:55Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:55 crc kubenswrapper[4733]: I1206 05:43:55.844784 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:55Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.112260 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 05:43:56 crc kubenswrapper[4733]: E1206 05:43:56.112453 4733 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 05:43:58.112430553 +0000 UTC m=+21.977641664 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.112546 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.112577 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:43:56 crc kubenswrapper[4733]: E1206 05:43:56.112683 4733 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 05:43:56 crc kubenswrapper[4733]: E1206 05:43:56.112727 4733 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 05:43:58.112720195 +0000 UTC m=+21.977931296 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 05:43:56 crc kubenswrapper[4733]: E1206 05:43:56.112759 4733 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 05:43:56 crc kubenswrapper[4733]: E1206 05:43:56.112850 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 05:43:58.112830822 +0000 UTC m=+21.978041934 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.213385 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.213447 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:43:56 crc kubenswrapper[4733]: E1206 05:43:56.213544 4733 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 05:43:56 crc kubenswrapper[4733]: E1206 05:43:56.213563 4733 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 05:43:56 crc kubenswrapper[4733]: E1206 05:43:56.213575 4733 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 05:43:56 crc kubenswrapper[4733]: E1206 05:43:56.213613 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-06 05:43:58.213605372 +0000 UTC m=+22.078816482 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 05:43:56 crc kubenswrapper[4733]: E1206 05:43:56.213829 4733 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 05:43:56 crc kubenswrapper[4733]: E1206 05:43:56.213848 4733 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 05:43:56 crc kubenswrapper[4733]: E1206 05:43:56.213857 4733 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 05:43:56 crc kubenswrapper[4733]: E1206 05:43:56.213899 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2025-12-06 05:43:58.213892048 +0000 UTC m=+22.079103160 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.484230 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.484241 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:43:56 crc kubenswrapper[4733]: E1206 05:43:56.484435 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.484576 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:43:56 crc kubenswrapper[4733]: E1206 05:43:56.484719 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 05:43:56 crc kubenswrapper[4733]: E1206 05:43:56.484786 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.488564 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.489244 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.489947 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.490552 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.491146 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.491699 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.492280 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.492840 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.493450 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.493987 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.494514 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.495128 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.495634 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.496104 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.496616 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.497048 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:56Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.497107 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.497665 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.498056 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.501415 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.501953 
4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.503897 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.504505 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.504930 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.506024 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.506488 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.507461 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.508059 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.509127 
4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.509746 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.510800 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.511250 4733 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.511368 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.512189 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-684r5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc59542d-ee4a-414d-b096-86716cb56db5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbfjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-684r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:56Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.513211 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.513812 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.514227 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.516087 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.517181 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 06 
05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.517724 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.518898 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.519651 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.520522 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.521131 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.522609 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.523289 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.524174 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 06 
05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.524728 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.525601 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.526324 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.527562 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.529053 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.530430 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.531083 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.531943 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g7qjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:56Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.532481 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.533008 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.553853 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"171aa174-9338-4421-8393-9e23fbab7f1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2gb79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:56Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.565941 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c145932d-56db-49da-ab40-1f9faeaa004e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a89503b511d9f2da9fb5e41e1adb5f5c60e14909aebd4495baafc709177fa56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd2bcad3ce23a8998a578ecc373a4e8028eefab1e056cf1081eb2406ff9398f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382d71a067b68d67891c063f0a4c833b7433e15db0e05b36e46f24bbbb1626ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\
\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b838411bb65919138a421cd17775561b7764a006894daa8f2bed711287c1914\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:56Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.577254 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:56Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.589527 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5mf9m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d7ccbf-e88d-4045-8d89-633470de7aca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni 
whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\
\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5mf9m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:56Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.601368 4733 generic.go:334] "Generic (PLEG): container finished" podID="171aa174-9338-4421-8393-9e23fbab7f1e" containerID="667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236" exitCode=0 Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.601454 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" event={"ID":"171aa174-9338-4421-8393-9e23fbab7f1e","Type":"ContainerDied","Data":"667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236"} Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.601509 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" 
event={"ID":"171aa174-9338-4421-8393-9e23fbab7f1e","Type":"ContainerStarted","Data":"46e99ae8bab74c84d954e6100aa560275149b69e8155a7fcfa37a9ca61d66241"} Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.606627 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8e8d7d591deb47598776511be462724fabc5543e82b6a74edfc29fb01ccb977\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:56Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.609238 4733 generic.go:334] "Generic (PLEG): container finished" podID="94d7ccbf-e88d-4045-8d89-633470de7aca" containerID="2609f7ad60b4f90d844d4f4d8573587826cbdf4c0b76f6b8a1b5cddec86ad7d0" exitCode=0 Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.609337 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5mf9m" event={"ID":"94d7ccbf-e88d-4045-8d89-633470de7aca","Type":"ContainerDied","Data":"2609f7ad60b4f90d844d4f4d8573587826cbdf4c0b76f6b8a1b5cddec86ad7d0"} Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.609361 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5mf9m" event={"ID":"94d7ccbf-e88d-4045-8d89-633470de7aca","Type":"ContainerStarted","Data":"0c6b021fcc371ecc11c91554d4094399f7f8d117e686159a0ae9cc6ca8fcaf24"} Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.612795 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" event={"ID":"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448","Type":"ContainerStarted","Data":"77ebef5bd728c37a6b74ab523c480048959280fdfc9afd8c60b2aca9cd05336d"} Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.612827 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" 
event={"ID":"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448","Type":"ContainerStarted","Data":"61a23652af66be599ba9357cb31709e7b4a3f0e4767c758617e6cc5cd9b43941"} Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.612838 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" event={"ID":"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448","Type":"ContainerStarted","Data":"c6a08e14224ad2cf79805aad13621e39074f4f7cfa20bbafe3a6a8b950ae25d2"} Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.616015 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-684r5" event={"ID":"cc59542d-ee4a-414d-b096-86716cb56db5","Type":"ContainerStarted","Data":"d7128ab1b2f48b8ce3ecf3a2154cb1b1dc93a58cdfed2c11e7724201a5675ea3"} Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.616042 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-684r5" event={"ID":"cc59542d-ee4a-414d-b096-86716cb56db5","Type":"ContainerStarted","Data":"ddcdd166ec2bfd3432ba8e3b722ec759e5b6d8a1b486d01b41de9fa4dc84fc89"} Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.624029 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77d63bf154094eece4d04d42186bc7f957f0b1ab0315c496bb8a785269184ed0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdccf2a58baf2a39276908ed60c86219657d8780a50630c20be6f8bc4c256fbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:56Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.638289 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:56Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.648084 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqsfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25abcf60-fe34-446b-9df8-1ed8e5102975\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://163c90ba7e6470fb31049cd650d1384d35d87b94a9193184bfe3ea16feddf307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb5ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqsfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:56Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.657409 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnxdh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d5c4ca7-33ee-4858-948f-631753eb056e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f4a50e7cb4197e088c193a3bedc8acb2720a885e588e56051fbfa1e102099e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrbr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnxdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:56Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.670128 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0700e329-54b6-4cfe-b2de-5cee58cf1aa5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32c4d87738481c8df3d76e820a98f3dacfbc11edc26fab1dfe51b56d207168d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57cbb938bc4ae9b8a71a1e2369a50a243964fc8c683d2d1840f1f3e199f1b923\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eeebbb46cf11d2306ad457106c3b2179039986bfdd412c4bb64791d86edb4e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://801ea1b9ed221d20f0d729436b8f5f1946df6e66f06aa86db5764f18da3f0b1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe65f4b55b8e8ed93d424276f1fc06f31770302538e5122a5b09da36734d86dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 05:43:48.722254 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 05:43:48.730728 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3849141372/tls.crt::/tmp/serving-cert-3849141372/tls.key\\\\\\\"\\\\nI1206 05:43:54.083506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 05:43:54.085960 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 05:43:54.085979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 05:43:54.086001 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 05:43:54.086006 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 05:43:54.089093 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 05:43:54.089162 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089190 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089211 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 05:43:54.089229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 05:43:54.089245 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 05:43:54.089261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 05:43:54.089103 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 05:43:54.090706 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8edc1fd8220a58b6a3f6d08d6d003c6d350fa69588866d84de63f95ecd4367f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:56Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.683047 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:56Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.695235 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0700e329-54b6-4cfe-b2de-5cee58cf1aa5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32c4d87738481c8df3d76e820a98f3dacfbc11edc26fab1dfe51b56d207168d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57cbb938bc4ae9b8a71a1e2369a50a243964fc8c683d2d1840f1f3e199f1b923\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eeebbb46cf11d2306ad457106c3b2179039986bfdd412c4bb64791d86edb4e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://801ea1b9ed221d20f0d729436b8f5f1946df6e66f06aa86db5764f18da3f0b1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe65f4b55b8e8ed93d424276f1fc06f31770302538e5122a5b09da36734d86dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 05:43:48.722254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 05:43:48.730728 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3849141372/tls.crt::/tmp/serving-cert-3849141372/tls.key\\\\\\\"\\\\nI1206 05:43:54.083506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 05:43:54.085960 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 05:43:54.085979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 05:43:54.086001 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 05:43:54.086006 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 05:43:54.089093 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 05:43:54.089162 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089190 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089211 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 05:43:54.089229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 05:43:54.089245 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 05:43:54.089261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 05:43:54.089103 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 05:43:54.090706 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8edc1fd8220a58b6a3f6d08d6d003c6d350fa69588866d84de63f95ecd4367f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:56Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.706125 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:56Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.718872 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:56Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.730589 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-684r5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc59542d-ee4a-414d-b096-86716cb56db5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7128ab1b2f48b8ce3ecf3a2154cb1b1dc93a58cdfed2c11e7724201a5675ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbfjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-684r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:56Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.739917 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77ebef5bd728c37a6b74ab523c480048959280fdfc9afd8c60b2aca9cd05336d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61a23652af66
be599ba9357cb31709e7b4a3f0e4767c758617e6cc5cd9b43941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g7qjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:56Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.755917 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"171aa174-9338-4421-8393-9e23fbab7f1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2gb79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:56Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.769167 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c145932d-56db-49da-ab40-1f9faeaa004e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a89503b511d9f2da9fb5e41e1adb5f5c60e14909aebd4495baafc709177fa56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd2bcad3ce23a8998a578ecc373a4e8028eefab1e056cf1081eb2406ff9398f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382d71a067b68d67891c063f0a4c833b7433e15db0e05b36e46f24bbbb1626ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b838411bb65919138a421cd17775561b7764a006894daa8f2bed711287c1914\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:56Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.783790 4733 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:56Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.796843 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5mf9m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d7ccbf-e88d-4045-8d89-633470de7aca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2609f7ad60b4f90d844d4f4d8573587826cbdf4c0b76f6b8a1b5cddec86ad7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://2609f7ad60b4f90d844d4f4d8573587826cbdf4c0b76f6b8a1b5cddec86ad7d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5mf9m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:56Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.809061 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8e8d7d591deb47598776511be462724fabc5543e82b6a74edfc29fb01ccb977\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:56Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.846633 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77d63bf154094eece4d04d42186bc7f957f0b1ab0315c496bb8a785269184ed0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://cdccf2a58baf2a39276908ed60c86219657d8780a50630c20be6f8bc4c256fbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:56Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.890335 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:56Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.924591 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqsfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25abcf60-fe34-446b-9df8-1ed8e5102975\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://163c90ba7e6470fb31049cd650d1384d35d87b94a9193184bfe3ea16feddf307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb5ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqsfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:56Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:56 crc kubenswrapper[4733]: I1206 05:43:56.964501 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnxdh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d5c4ca7-33ee-4858-948f-631753eb056e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f4a50e7cb4197e088c193a3bedc8acb2720a885e588e56051fbfa1e102099e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrbr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnxdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:56Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.065961 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.067954 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.068001 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.068014 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.068113 4733 kubelet_node_status.go:76] "Attempting to register node" 
node="crc" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.075478 4733 kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.075654 4733 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.076850 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.076895 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.076907 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.076924 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.076938 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:43:57Z","lastTransitionTime":"2025-12-06T05:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:43:57 crc kubenswrapper[4733]: E1206 05:43:57.092472 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6951a1f4-5aff-463d-98ee-6da28494341b\\\",\\\"systemUUID\\\":\\\"4b0d62b0-e895-479e-b261-2bd12b349187\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:57Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.095685 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.095726 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.095737 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.095755 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.095769 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:43:57Z","lastTransitionTime":"2025-12-06T05:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:43:57 crc kubenswrapper[4733]: E1206 05:43:57.106280 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6951a1f4-5aff-463d-98ee-6da28494341b\\\",\\\"systemUUID\\\":\\\"4b0d62b0-e895-479e-b261-2bd12b349187\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:57Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.109322 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.109362 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.109371 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.109386 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.109397 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:43:57Z","lastTransitionTime":"2025-12-06T05:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:43:57 crc kubenswrapper[4733]: E1206 05:43:57.119539 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6951a1f4-5aff-463d-98ee-6da28494341b\\\",\\\"systemUUID\\\":\\\"4b0d62b0-e895-479e-b261-2bd12b349187\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:57Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.122361 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.122390 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.122402 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.122418 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.122431 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:43:57Z","lastTransitionTime":"2025-12-06T05:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:43:57 crc kubenswrapper[4733]: E1206 05:43:57.137547 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6951a1f4-5aff-463d-98ee-6da28494341b\\\",\\\"systemUUID\\\":\\\"4b0d62b0-e895-479e-b261-2bd12b349187\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:57Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.141913 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.141956 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.141966 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.141983 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.141995 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:43:57Z","lastTransitionTime":"2025-12-06T05:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:43:57 crc kubenswrapper[4733]: E1206 05:43:57.151168 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6951a1f4-5aff-463d-98ee-6da28494341b\\\",\\\"systemUUID\\\":\\\"4b0d62b0-e895-479e-b261-2bd12b349187\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:57Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:57 crc kubenswrapper[4733]: E1206 05:43:57.151277 4733 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.152425 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.152467 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.152478 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.152495 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.152505 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:43:57Z","lastTransitionTime":"2025-12-06T05:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.254468 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.254504 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.254515 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.254532 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.254545 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:43:57Z","lastTransitionTime":"2025-12-06T05:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.356493 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.356534 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.356544 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.356560 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.356571 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:43:57Z","lastTransitionTime":"2025-12-06T05:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.458683 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.458717 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.458727 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.458741 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.458750 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:43:57Z","lastTransitionTime":"2025-12-06T05:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.560392 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.560431 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.560447 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.560460 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.560470 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:43:57Z","lastTransitionTime":"2025-12-06T05:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.620824 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" event={"ID":"171aa174-9338-4421-8393-9e23fbab7f1e","Type":"ContainerStarted","Data":"456b5bd863b30c044246c6c8fe15ee7344ad053861724b5c42b88479578b9adb"} Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.620870 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" event={"ID":"171aa174-9338-4421-8393-9e23fbab7f1e","Type":"ContainerStarted","Data":"77216800c2b9bc04724591a5d5c5d4c9ddb9a75fcbc198c60800199a92db6f45"} Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.620880 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" event={"ID":"171aa174-9338-4421-8393-9e23fbab7f1e","Type":"ContainerStarted","Data":"532faf6ec4021a35746a236a1ded78eccc9d71728c149f73c4263068b6951490"} Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.620889 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" event={"ID":"171aa174-9338-4421-8393-9e23fbab7f1e","Type":"ContainerStarted","Data":"a697c5a28f2c415b6f133c1c3bdaff0915418e3fcf0c889af0a822e1bdcbcc88"} Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.620897 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" event={"ID":"171aa174-9338-4421-8393-9e23fbab7f1e","Type":"ContainerStarted","Data":"d985f342be7dff38ee8a2264a8dae534857e6cb0e7d0cf79b137d2ed6289bf80"} Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.620906 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" event={"ID":"171aa174-9338-4421-8393-9e23fbab7f1e","Type":"ContainerStarted","Data":"88a99335c4d7fca93428173f7e0e096e418e0599ab030dfda10d8da0a5dc17a5"} Dec 06 05:43:57 crc kubenswrapper[4733]: 
I1206 05:43:57.622437 4733 generic.go:334] "Generic (PLEG): container finished" podID="94d7ccbf-e88d-4045-8d89-633470de7aca" containerID="93ef7c618da4d94a4956f082f96b9be994042458ff524e9e1172f526a4135e1d" exitCode=0 Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.622508 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5mf9m" event={"ID":"94d7ccbf-e88d-4045-8d89-633470de7aca","Type":"ContainerDied","Data":"93ef7c618da4d94a4956f082f96b9be994042458ff524e9e1172f526a4135e1d"} Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.624649 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"faa17b3f3dd91488b73e0e7f3101c5e9932dd0c1573946bbd91819f1ec51202e"} Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.645912 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c145932d-56db-49da-ab40-1f9faeaa004e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a89503b511d9f2da9fb5e41e1adb5f5c60e14909aebd4495baafc709177fa56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd2bcad3ce23a8998a578ecc373a4e8028eefab1e056cf1081eb2406ff9398f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382d71a067b68d67891c063f0a4c833b7433e15db0e05b36e46f24bbbb1626ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b838411bb65919138a421cd17775561b7764a006894daa8f2bed711287c1914\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:57Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.662512 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.662550 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.662564 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.662589 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.662605 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:43:57Z","lastTransitionTime":"2025-12-06T05:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.670709 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:57Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.693656 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5mf9m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d7ccbf-e88d-4045-8d89-633470de7aca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2609f7ad60b4f90d844d4f4d8573587826cbdf4c0b76f6b8a1b5cddec86ad7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://2609f7ad60b4f90d844d4f4d8573587826cbdf4c0b76f6b8a1b5cddec86ad7d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ef7c618da4d94a4956f082f96b9be994042458ff524e9e1172f526a4135e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93ef7c618da4d94a4956f082f96b9be994042458ff524e9e1172f526a4135e1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5mf9m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-12-06T05:43:57Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.706204 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:57Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.715700 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqsfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25abcf60-fe34-446b-9df8-1ed8e5102975\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://163c90ba7e6470fb31049cd650d1384d35d87b94a9193184bfe3ea16feddf307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb5ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqsfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:57Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.723191 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnxdh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d5c4ca7-33ee-4858-948f-631753eb056e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f4a50e7cb4197e088c193a3bedc8acb2720a885e588e56051fbfa1e102099e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrbr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnxdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:57Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.732714 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8e8d7d591deb47598776511be462724fabc5543e82b6a74edfc29fb01ccb977\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:57Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.741783 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77d63bf154094eece4d04d42186bc7f957f0b1ab0315c496bb8a785269184ed0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://cdccf2a58baf2a39276908ed60c86219657d8780a50630c20be6f8bc4c256fbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:57Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.752795 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0700e329-54b6-4cfe-b2de-5cee58cf1aa5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32c4d87738481c8df3d76e820a98f3dacfbc11edc26fab1dfe51b56d207168d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57cbb938bc4ae9b8a71a1e2369a50a243964fc8c683d2d1840f1f3e199f1b923\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eeebbb46cf11d2306ad457106c3b2179039986bfdd412c4bb64791d86edb4e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://801ea1b9ed221d20f0d729436b8f5f1946df6e66f06aa86db5764f18da3f0b1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe65f4b55b8e8ed93d424276f1fc06f31770302538e5122a5b09da36734d86dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 05:43:48.722254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 05:43:48.730728 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3849141372/tls.crt::/tmp/serving-cert-3849141372/tls.key\\\\\\\"\\\\nI1206 05:43:54.083506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 05:43:54.085960 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 05:43:54.085979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 05:43:54.086001 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 05:43:54.086006 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 05:43:54.089093 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 05:43:54.089162 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089190 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089211 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 05:43:54.089229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 05:43:54.089245 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 05:43:54.089261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 05:43:54.089103 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 05:43:54.090706 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8edc1fd8220a58b6a3f6d08d6d003c6d350fa69588866d84de63f95ecd4367f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:57Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.762801 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:57Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.764363 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.764398 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.764407 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.764422 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.764456 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:43:57Z","lastTransitionTime":"2025-12-06T05:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.772184 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:57Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.782279 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-684r5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc59542d-ee4a-414d-b096-86716cb56db5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7128ab1b2f48b8ce3ecf3a2154cb1b1dc93a58cdfed2c11e7724201a5675ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbfjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-684r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:57Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.791273 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77ebef5bd728c37a6b74ab523c480048959280fdfc9afd8c60b2aca9cd05336d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61a23652af66
be599ba9357cb31709e7b4a3f0e4767c758617e6cc5cd9b43941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g7qjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:57Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.804551 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"171aa174-9338-4421-8393-9e23fbab7f1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2gb79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:57Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.815156 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0700e329-54b6-4cfe-b2de-5cee58cf1aa5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32c4d87738481c8df3d76e820a98f3dacfbc11edc26fab1dfe51b56d207168d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57cbb938bc4ae9b8a71a1e2369a50a243964fc8c683d2d1840f1f3e199f1b923\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7eeebbb46cf11d2306ad457106c3b2179039986bfdd412c4bb64791d86edb4e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://801ea1b9ed221d20f0d729436b8f5f1946df6e66f06aa86db5764f18da3f0b1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe65f4b55b8e8ed93d424276f1fc06f31770302538e5122a5b09da36734d86dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 05:43:48.722254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 05:43:48.730728 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3849141372/tls.crt::/tmp/serving-cert-3849141372/tls.key\\\\\\\"\\\\nI1206 05:43:54.083506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 05:43:54.085960 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 05:43:54.085979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 05:43:54.086001 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 05:43:54.086006 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 05:43:54.089093 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 05:43:54.089162 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089190 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089211 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 05:43:54.089229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 05:43:54.089245 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 05:43:54.089261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 05:43:54.089103 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 05:43:54.090706 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8edc1fd8220a58b6a3f6d08d6d003c6d350fa69588866d84de63f95ecd4367f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:57Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.824202 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:57Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.832830 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77ebef5bd728c37a6b74ab523c480048959280fdfc9afd8c60b2aca9cd05336d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61a23652af66be599ba9357cb31709e7b4a3f0e4
767c758617e6cc5cd9b43941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g7qjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:57Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.846926 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"171aa174-9338-4421-8393-9e23fbab7f1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2gb79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:57Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.856667 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:57Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.867321 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.867367 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.867381 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 
05:43:57.867403 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.867417 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:43:57Z","lastTransitionTime":"2025-12-06T05:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.867783 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-684r5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc59542d-ee4a-414d-b096-86716cb56db5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7128ab1b2f48b8ce3ecf3a2154cb1b1dc93a58cdfed2c11e7724201a5675ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afb
a93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbfjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\
":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-684r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:57Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.879071 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5mf9m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d7ccbf-e88d-4045-8d89-633470de7aca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2609f7ad60b4f90d844d4f4d8573587826cbdf4c0b76f6b8a1b5cddec86ad7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2609f7ad60b4f90d844d4f4d8573587826cbdf4c0b76f6b8a1b5cddec86ad7d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ef7c618da4d94a4956f082f96b9be994042458ff524e9e1172f526a4135e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93ef7c618da4d94a4956f082f96b9be994042458ff524e9e1172f526a4135e1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"re
ason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5mf9m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:57Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.889015 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c145932d-56db-49da-ab40-1f9faeaa004e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a89503b511d9f2da9fb5e41e1adb5f5c60e14909aebd4495baafc709177fa56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd2bcad3ce23a8998a578ecc373a4e8028eefab1e056cf1081eb2406ff9398f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382d71a067b68d67891c063f0a4c833b7433e15db0e05b36e46f24bbbb1626ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b838411bb65919138a421cd17775561b7764a006894daa8f2bed711287c1914\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\
\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:57Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.924283 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:57Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.965267 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8e8d7d591deb47598776511be462724fabc5543e82b6a74edfc29fb01ccb977\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:57Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.970282 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.970342 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.970356 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.970376 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:43:57 crc kubenswrapper[4733]: I1206 05:43:57.970389 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:43:57Z","lastTransitionTime":"2025-12-06T05:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.006008 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77d63bf154094eece4d04d42186bc7f957f0b1ab0315c496bb8a785269184ed0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdccf2a58baf2a39276908ed60c86219657d8780a50630c20be6f8bc4c256fbb\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:58Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.046000 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faa17b3f3dd91488b73e0e7f3101c5e9932dd0c1573946bbd91819f1ec51202e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T05:43:58Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.075012 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.075058 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.075069 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.075085 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.075105 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:43:58Z","lastTransitionTime":"2025-12-06T05:43:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.084530 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqsfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25abcf60-fe34-446b-9df8-1ed8e5102975\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://163c90ba7e6470fb31049cd650d1384d35d87b94a9193184bfe3ea16feddf307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb5ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqsfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:58Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.124591 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnxdh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d5c4ca7-33ee-4858-948f-631753eb056e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f4a50e7cb4197e088c193a3bedc8acb2720a8
85e588e56051fbfa1e102099e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrbr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnxdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:58Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.129943 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.130047 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.130085 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:43:58 crc kubenswrapper[4733]: E1206 05:43:58.130132 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 05:44:02.130108209 +0000 UTC m=+25.995319320 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:43:58 crc kubenswrapper[4733]: E1206 05:43:58.130165 4733 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 05:43:58 crc kubenswrapper[4733]: E1206 05:43:58.130224 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 05:44:02.130211602 +0000 UTC m=+25.995422704 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 05:43:58 crc kubenswrapper[4733]: E1206 05:43:58.130236 4733 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 05:43:58 crc kubenswrapper[4733]: E1206 05:43:58.130298 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-12-06 05:44:02.130282966 +0000 UTC m=+25.995494077 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.177385 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.177423 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.177436 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.177455 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.177468 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:43:58Z","lastTransitionTime":"2025-12-06T05:43:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.230834 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.230883 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 05:43:58 crc kubenswrapper[4733]: E1206 05:43:58.231017 4733 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 05:43:58 crc kubenswrapper[4733]: E1206 05:43:58.231048 4733 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 05:43:58 crc kubenswrapper[4733]: E1206 05:43:58.231062 4733 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 05:43:58 crc kubenswrapper[4733]: E1206 05:43:58.231101 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl 
podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-06 05:44:02.231090739 +0000 UTC m=+26.096301849 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 05:43:58 crc kubenswrapper[4733]: E1206 05:43:58.231021 4733 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 05:43:58 crc kubenswrapper[4733]: E1206 05:43:58.231150 4733 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 05:43:58 crc kubenswrapper[4733]: E1206 05:43:58.231164 4733 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 05:43:58 crc kubenswrapper[4733]: E1206 05:43:58.231220 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-06 05:44:02.231205212 +0000 UTC m=+26.096416323 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.279422 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.279456 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.279467 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.279483 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.279506 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:43:58Z","lastTransitionTime":"2025-12-06T05:43:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.381979 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.382016 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.382026 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.382045 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.382056 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:43:58Z","lastTransitionTime":"2025-12-06T05:43:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.484139 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.484202 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:43:58 crc kubenswrapper[4733]: E1206 05:43:58.484319 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.484414 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 05:43:58 crc kubenswrapper[4733]: E1206 05:43:58.484470 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 05:43:58 crc kubenswrapper[4733]: E1206 05:43:58.484609 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.484752 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.484784 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.484796 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.484811 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.484830 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:43:58Z","lastTransitionTime":"2025-12-06T05:43:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.586811 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.586845 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.586855 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.586869 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.586880 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:43:58Z","lastTransitionTime":"2025-12-06T05:43:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.629206 4733 generic.go:334] "Generic (PLEG): container finished" podID="94d7ccbf-e88d-4045-8d89-633470de7aca" containerID="7c91a8199b1f8ede480f2bd92335fe3c8dc0d0e11caa2cf3bd213c234d0779f7" exitCode=0 Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.629295 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5mf9m" event={"ID":"94d7ccbf-e88d-4045-8d89-633470de7aca","Type":"ContainerDied","Data":"7c91a8199b1f8ede480f2bd92335fe3c8dc0d0e11caa2cf3bd213c234d0779f7"} Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.643531 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:58Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.655551 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-684r5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc59542d-ee4a-414d-b096-86716cb56db5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7128ab1b2f48b8ce3ecf3a2154cb1b1dc93a58cdfed2c11e7724201a5675ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbfjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-684r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:58Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.664446 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77ebef5bd728c37a6b74ab523c480048959280fdfc9afd8c60b2aca9cd05336d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61a23652af66
be599ba9357cb31709e7b4a3f0e4767c758617e6cc5cd9b43941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g7qjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:58Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.680096 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"171aa174-9338-4421-8393-9e23fbab7f1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2gb79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:58Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.689822 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.689852 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.689861 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.689893 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.689906 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:43:58Z","lastTransitionTime":"2025-12-06T05:43:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.695244 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c145932d-56db-49da-ab40-1f9faeaa004e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a89503b511d9f2da9fb5e41e1adb5f5c60e14909aebd4495baafc709177fa56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd2bcad3ce
23a8998a578ecc373a4e8028eefab1e056cf1081eb2406ff9398f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382d71a067b68d67891c063f0a4c833b7433e15db0e05b36e46f24bbbb1626ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b838411bb65919138a421cd17775561b7764a006894daa8f2bed711287c1914\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:58Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.706111 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:58Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.717360 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5mf9m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d7ccbf-e88d-4045-8d89-633470de7aca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2609f7ad60b4f90d844d4f4d8573587826cbdf4c0b76f6b8a1b5cddec86ad7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2609f7ad60b4f90d844d4f4d8573587826cbdf4c0b76f6b8a1b5cddec86ad7d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ef7c618da4d94a4956f082f96b9be994042458ff524e9e1172f526a4135e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93ef7c618da4d94a4956f082f96b9be994042458ff524e9e1172f526a4135e1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c91a8199b1f8ede480f2bd92335fe3c8dc0d0e11caa2cf3bd213c234d0779f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c91a8199b1f8ede480f2bd92335fe3c8dc0d0e11caa2cf3bd213c234d0779f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5mf9m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:58Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 
05:43:58.729633 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77d63bf154094eece4d04d42186bc7f957f0b1ab0315c496bb8a785269184ed0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdccf2a58baf2a39276908ed60c86219657d8780a50630c20be6f8bc4c256fbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:58Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.739411 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faa17b3f3dd91488b73e0e7f3101c5e9932dd0c1573946bbd91819f1ec51202e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T05:43:58Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.748539 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqsfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25abcf60-fe34-446b-9df8-1ed8e5102975\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://163c90ba7e6470fb31049cd650d1384d35d87b94a9193184bfe3ea16feddf307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb5ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqsfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:58Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.756878 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnxdh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d5c4ca7-33ee-4858-948f-631753eb056e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f4a50e7cb4197e088c193a3bedc8acb2720a885e588e56051fbfa1e102099e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrbr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnxdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:58Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.769264 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8e8d7d591deb47598776511be462724fabc5543e82b6a74edfc29fb01ccb977\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:58Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.781040 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0700e329-54b6-4cfe-b2de-5cee58cf1aa5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32c4d87738481c8df3d76e820a98f3dacfbc11edc26fab1dfe51b56d207168d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57cbb938bc4ae9b8a71a1e2369a50a243964fc8c683d2d1840f1f3e199f1b923\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7eeebbb46cf11d2306ad457106c3b2179039986bfdd412c4bb64791d86edb4e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://801ea1b9ed221d20f0d729436b8f5f1946df6e66f06aa86db5764f18da3f0b1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe65f4b55b8e8ed93d424276f1fc06f31770302538e5122a5b09da36734d86dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 05:43:48.722254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 05:43:48.730728 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3849141372/tls.crt::/tmp/serving-cert-3849141372/tls.key\\\\\\\"\\\\nI1206 05:43:54.083506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 05:43:54.085960 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 05:43:54.085979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 05:43:54.086001 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 05:43:54.086006 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 05:43:54.089093 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 05:43:54.089162 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089190 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089211 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 05:43:54.089229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 05:43:54.089245 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 05:43:54.089261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 05:43:54.089103 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 05:43:54.090706 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8edc1fd8220a58b6a3f6d08d6d003c6d350fa69588866d84de63f95ecd4367f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:58Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.792441 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:58Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.792613 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.792639 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.792651 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:43:58 crc 
kubenswrapper[4733]: I1206 05:43:58.792667 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.792679 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:43:58Z","lastTransitionTime":"2025-12-06T05:43:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.895028 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.895064 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.895074 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.895089 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.895100 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:43:58Z","lastTransitionTime":"2025-12-06T05:43:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.997417 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.997580 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.997649 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.997735 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:43:58 crc kubenswrapper[4733]: I1206 05:43:58.997806 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:43:58Z","lastTransitionTime":"2025-12-06T05:43:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:43:59 crc kubenswrapper[4733]: I1206 05:43:59.100429 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:43:59 crc kubenswrapper[4733]: I1206 05:43:59.100642 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:43:59 crc kubenswrapper[4733]: I1206 05:43:59.100663 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:43:59 crc kubenswrapper[4733]: I1206 05:43:59.100810 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:43:59 crc kubenswrapper[4733]: I1206 05:43:59.100838 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:43:59Z","lastTransitionTime":"2025-12-06T05:43:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:43:59 crc kubenswrapper[4733]: I1206 05:43:59.202810 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:43:59 crc kubenswrapper[4733]: I1206 05:43:59.202843 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:43:59 crc kubenswrapper[4733]: I1206 05:43:59.202854 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:43:59 crc kubenswrapper[4733]: I1206 05:43:59.202868 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:43:59 crc kubenswrapper[4733]: I1206 05:43:59.202877 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:43:59Z","lastTransitionTime":"2025-12-06T05:43:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:43:59 crc kubenswrapper[4733]: I1206 05:43:59.305053 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:43:59 crc kubenswrapper[4733]: I1206 05:43:59.305095 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:43:59 crc kubenswrapper[4733]: I1206 05:43:59.305107 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:43:59 crc kubenswrapper[4733]: I1206 05:43:59.305125 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:43:59 crc kubenswrapper[4733]: I1206 05:43:59.305137 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:43:59Z","lastTransitionTime":"2025-12-06T05:43:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:43:59 crc kubenswrapper[4733]: I1206 05:43:59.407823 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:43:59 crc kubenswrapper[4733]: I1206 05:43:59.407873 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:43:59 crc kubenswrapper[4733]: I1206 05:43:59.407886 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:43:59 crc kubenswrapper[4733]: I1206 05:43:59.407905 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:43:59 crc kubenswrapper[4733]: I1206 05:43:59.407917 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:43:59Z","lastTransitionTime":"2025-12-06T05:43:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:43:59 crc kubenswrapper[4733]: I1206 05:43:59.510622 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:43:59 crc kubenswrapper[4733]: I1206 05:43:59.510657 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:43:59 crc kubenswrapper[4733]: I1206 05:43:59.510666 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:43:59 crc kubenswrapper[4733]: I1206 05:43:59.510680 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:43:59 crc kubenswrapper[4733]: I1206 05:43:59.510691 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:43:59Z","lastTransitionTime":"2025-12-06T05:43:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:43:59 crc kubenswrapper[4733]: I1206 05:43:59.612962 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:43:59 crc kubenswrapper[4733]: I1206 05:43:59.613010 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:43:59 crc kubenswrapper[4733]: I1206 05:43:59.613020 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:43:59 crc kubenswrapper[4733]: I1206 05:43:59.613042 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:43:59 crc kubenswrapper[4733]: I1206 05:43:59.613054 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:43:59Z","lastTransitionTime":"2025-12-06T05:43:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:43:59 crc kubenswrapper[4733]: I1206 05:43:59.636665 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" event={"ID":"171aa174-9338-4421-8393-9e23fbab7f1e","Type":"ContainerStarted","Data":"9980ec9b2b1a751a691d1f657a2176d49a7583906d741adbe3754ec4c73b152c"} Dec 06 05:43:59 crc kubenswrapper[4733]: I1206 05:43:59.639328 4733 generic.go:334] "Generic (PLEG): container finished" podID="94d7ccbf-e88d-4045-8d89-633470de7aca" containerID="047dc4e7f8f30d1f9cf824ee4059c99c07cd9f29bd985e0e00ac22febb297f1f" exitCode=0 Dec 06 05:43:59 crc kubenswrapper[4733]: I1206 05:43:59.639378 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5mf9m" event={"ID":"94d7ccbf-e88d-4045-8d89-633470de7aca","Type":"ContainerDied","Data":"047dc4e7f8f30d1f9cf824ee4059c99c07cd9f29bd985e0e00ac22febb297f1f"} Dec 06 05:43:59 crc kubenswrapper[4733]: I1206 05:43:59.650090 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqsfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25abcf60-fe34-446b-9df8-1ed8e5102975\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://163c90ba7e6470fb31049cd650d1384d35d87b94a9193184bfe3ea16feddf307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb5ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqsfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:59Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:59 crc kubenswrapper[4733]: I1206 05:43:59.660449 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnxdh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d5c4ca7-33ee-4858-948f-631753eb056e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f4a50e7cb4197e088c193a3bedc8acb2720a885e588e56051fbfa1e102099e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrbr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnxdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:59Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:59 crc kubenswrapper[4733]: I1206 05:43:59.672654 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8e8d7d591deb47598776511be462724fabc5543e82b6a74edfc29fb01ccb977\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:59Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:59 crc kubenswrapper[4733]: I1206 05:43:59.684464 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77d63bf154094eece4d04d42186bc7f957f0b1ab0315c496bb8a785269184ed0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://cdccf2a58baf2a39276908ed60c86219657d8780a50630c20be6f8bc4c256fbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:59Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:59 crc kubenswrapper[4733]: I1206 05:43:59.693163 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faa17b3f3dd91488b73e0e7f3101c5e9932dd0c1573946bbd91819f1ec51202e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T05:43:59Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:59 crc kubenswrapper[4733]: I1206 05:43:59.703368 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0700e329-54b6-4cfe-b2de-5cee58cf1aa5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32c4d87738481c8df3d76e820a98f3dacfbc11edc26fab1dfe51b56d207168d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57cbb938bc4ae9b8a71a1e2369a50a243964fc8c683d2d1840f1f3e199f1b923\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7eeebbb46cf11d2306ad457106c3b2179039986bfdd412c4bb64791d86edb4e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://801ea1b9ed221d20f0d729436b8f5f1946df6e66f06aa86db5764f18da3f0b1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe65f4b55b8e8ed93d424276f1fc06f31770302538e5122a5b09da36734d86dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 05:43:48.722254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 05:43:48.730728 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3849141372/tls.crt::/tmp/serving-cert-3849141372/tls.key\\\\\\\"\\\\nI1206 05:43:54.083506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 05:43:54.085960 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 05:43:54.085979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 05:43:54.086001 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 05:43:54.086006 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 05:43:54.089093 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 05:43:54.089162 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089190 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089211 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 05:43:54.089229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 05:43:54.089245 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 05:43:54.089261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 05:43:54.089103 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 05:43:54.090706 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8edc1fd8220a58b6a3f6d08d6d003c6d350fa69588866d84de63f95ecd4367f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:59Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:59 crc kubenswrapper[4733]: I1206 05:43:59.713507 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:59Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:59 crc kubenswrapper[4733]: I1206 05:43:59.715469 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:43:59 crc kubenswrapper[4733]: I1206 05:43:59.715514 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:43:59 crc kubenswrapper[4733]: I1206 05:43:59.715530 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:43:59 crc 
kubenswrapper[4733]: I1206 05:43:59.715550 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:43:59 crc kubenswrapper[4733]: I1206 05:43:59.715572 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:43:59Z","lastTransitionTime":"2025-12-06T05:43:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:43:59 crc kubenswrapper[4733]: I1206 05:43:59.722860 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:59Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:59 crc kubenswrapper[4733]: I1206 05:43:59.732733 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-684r5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc59542d-ee4a-414d-b096-86716cb56db5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7128ab1b2f48b8ce3ecf3a2154cb1b1dc93a58cdfed2c11e7724201a5675ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbfjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-684r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:59Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:59 crc kubenswrapper[4733]: I1206 05:43:59.741369 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77ebef5bd728c37a6b74ab523c480048959280fdfc9afd8c60b2aca9cd05336d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61a23652af66
be599ba9357cb31709e7b4a3f0e4767c758617e6cc5cd9b43941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g7qjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:59Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:59 crc kubenswrapper[4733]: I1206 05:43:59.755690 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"171aa174-9338-4421-8393-9e23fbab7f1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2gb79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:59Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:59 crc kubenswrapper[4733]: I1206 05:43:59.766428 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c145932d-56db-49da-ab40-1f9faeaa004e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a89503b511d9f2da9fb5e41e1adb5f5c60e14909aebd4495baafc709177fa56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd2bcad3ce23a8998a578ecc373a4e8028eefab1e056cf1081eb2406ff9398f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382d71a067b68d67891c063f0a4c833b7433e15db0e05b36e46f24bbbb1626ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b838411bb65919138a421cd17775561b7764a006894daa8f2bed711287c1914\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:59Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:59 crc kubenswrapper[4733]: I1206 05:43:59.775780 4733 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:59Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:59 crc kubenswrapper[4733]: I1206 05:43:59.787434 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5mf9m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d7ccbf-e88d-4045-8d89-633470de7aca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2609f7ad60b4f90d844d4f4d8573587826cbdf4c0b76f6b8a1b5cddec86ad7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://2609f7ad60b4f90d844d4f4d8573587826cbdf4c0b76f6b8a1b5cddec86ad7d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ef7c618da4d94a4956f082f96b9be994042458ff524e9e1172f526a4135e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93ef7c618da4d94a4956f082f96b9be994042458ff524e9e1172f526a4135e1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c91a8199b1f8ede480f2bd92335fe3c8dc0d0e11caa2cf3bd213c234d0779f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c91a8199b1f8ede480f2bd92335fe3c8dc0d0e11caa2cf3bd213c234d0779f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://047dc4e7f8f30d1f9cf824ee4059c99c07cd9f29bd985e0e00ac22febb297f1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://047dc4e7f8f30d1f9cf824ee4059c99c07cd9f29bd985e0e00ac22febb297f1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5mf9m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:43:59Z is after 2025-08-24T17:21:41Z" Dec 06 05:43:59 crc kubenswrapper[4733]: I1206 05:43:59.818296 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:43:59 crc kubenswrapper[4733]: I1206 05:43:59.818355 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:43:59 crc kubenswrapper[4733]: I1206 05:43:59.818367 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:43:59 crc kubenswrapper[4733]: I1206 05:43:59.818390 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:43:59 crc kubenswrapper[4733]: I1206 05:43:59.818406 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:43:59Z","lastTransitionTime":"2025-12-06T05:43:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:43:59 crc kubenswrapper[4733]: I1206 05:43:59.920574 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:43:59 crc kubenswrapper[4733]: I1206 05:43:59.920618 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:43:59 crc kubenswrapper[4733]: I1206 05:43:59.920628 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:43:59 crc kubenswrapper[4733]: I1206 05:43:59.920646 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:43:59 crc kubenswrapper[4733]: I1206 05:43:59.920661 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:43:59Z","lastTransitionTime":"2025-12-06T05:43:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 05:44:00.024085 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 05:44:00.024130 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 05:44:00.024141 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 05:44:00.024159 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 05:44:00.024175 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:00Z","lastTransitionTime":"2025-12-06T05:44:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 05:44:00.127247 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 05:44:00.127289 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 05:44:00.127298 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 05:44:00.127328 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 05:44:00.127342 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:00Z","lastTransitionTime":"2025-12-06T05:44:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 05:44:00.229599 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 05:44:00.229643 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 05:44:00.229655 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 05:44:00.229668 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 05:44:00.229679 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:00Z","lastTransitionTime":"2025-12-06T05:44:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 05:44:00.331464 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 05:44:00.331522 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 05:44:00.331537 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 05:44:00.331564 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 05:44:00.331576 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:00Z","lastTransitionTime":"2025-12-06T05:44:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 05:44:00.433878 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 05:44:00.433919 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 05:44:00.433929 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 05:44:00.433948 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 05:44:00.433960 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:00Z","lastTransitionTime":"2025-12-06T05:44:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 05:44:00.483948 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 05:44:00.483979 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 05:44:00.483949 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 05:44:00 crc kubenswrapper[4733]: E1206 05:44:00.484084 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 05:44:00 crc kubenswrapper[4733]: E1206 05:44:00.484188 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 05:44:00 crc kubenswrapper[4733]: E1206 05:44:00.484275 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 05:44:00.535624 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 05:44:00.535667 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 05:44:00.535677 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 05:44:00.535695 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 05:44:00.535707 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:00Z","lastTransitionTime":"2025-12-06T05:44:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 05:44:00.638038 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 05:44:00.638073 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 05:44:00.638082 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 05:44:00.638094 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 05:44:00.638107 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:00Z","lastTransitionTime":"2025-12-06T05:44:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 05:44:00.645284 4733 generic.go:334] "Generic (PLEG): container finished" podID="94d7ccbf-e88d-4045-8d89-633470de7aca" containerID="cadff80e27f4e0103110c153c52936b931bfd70ca4363a3caa44ec4f746d01dc" exitCode=0 Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 05:44:00.645341 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5mf9m" event={"ID":"94d7ccbf-e88d-4045-8d89-633470de7aca","Type":"ContainerDied","Data":"cadff80e27f4e0103110c153c52936b931bfd70ca4363a3caa44ec4f746d01dc"} Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 05:44:00.660632 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:00Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 05:44:00.674137 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5mf9m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d7ccbf-e88d-4045-8d89-633470de7aca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2609f7ad60b4f90d844d4f4d8573587826cbdf4c0b76f6b8a1b5cddec86ad7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2609f7ad60b4f90d844d4f4d8573587826cbdf4c0b76f6b8a1b5cddec86ad7d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ef7c618da4d94a4956f082f96b9be994042458ff524e9e1172f526a4135e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93ef7c618da4d94a4956f082f96b9be994042458ff524e9e1172f526a4135e1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c91a8199b1f8ede480f2bd92335fe3c8dc0d0e11caa2cf3bd213c234d0779f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c91a8199b1f8ede480f2bd92335fe3c8dc0d0e11caa2cf3bd213c234d0779f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://047dc4e7f8f30d1f9cf824ee4059c99c07cd9f29bd985e0e00ac22febb297f1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://047dc4e7f8f30d1f9cf824ee4059c99c07cd9f29bd985e0e00ac22febb297f1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cadff80e27f4e0103110c153c52936b931bfd70ca4363a3caa44ec4f746d01dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadff80e27f4e0103110c153c52936b931bfd70ca4363a3caa44ec4f746d01dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5mf9m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:00Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 05:44:00.685022 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c145932d-56db-49da-ab40-1f9faeaa004e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a89503b511d9f2da9fb5e41e1adb5f5c60e14909aebd4495baafc709177fa56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd2bcad3ce23a8998a578ecc373a4e8028eefab1e056cf1081eb2406ff9398f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382d71a067b68d67891c063f0a4c833b7433e15db0e05b36e46f24bbbb1626ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b838411bb65919138a421cd17775561b7764a006894daa8f2bed711287c1914\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:00Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 05:44:00.695876 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8e8d7d591deb47598776511be462724fabc5543e82b6a74edfc29fb01ccb977\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:00Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 05:44:00.705376 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77d63bf154094eece4d04d42186bc7f957f0b1ab0315c496bb8a785269184ed0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://cdccf2a58baf2a39276908ed60c86219657d8780a50630c20be6f8bc4c256fbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:00Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 05:44:00.715384 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faa17b3f3dd91488b73e0e7f3101c5e9932dd0c1573946bbd91819f1ec51202e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T05:44:00Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 05:44:00.722952 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqsfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25abcf60-fe34-446b-9df8-1ed8e5102975\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://163c90ba7e6470fb31049cd650d1384d35d87b94a9193184bfe3ea16feddf307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb5ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqsfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:00Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 05:44:00.730478 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnxdh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d5c4ca7-33ee-4858-948f-631753eb056e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f4a50e7cb4197e088c193a3bedc8acb2720a885e588e56051fbfa1e102099e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrbr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnxdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:00Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 05:44:00.739021 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:00Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 05:44:00.740842 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 05:44:00.740883 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 05:44:00.740910 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 05:44:00.740931 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 05:44:00.740942 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:00Z","lastTransitionTime":"2025-12-06T05:44:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 05:44:00.748773 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0700e329-54b6-4cfe-b2de-5cee58cf1aa5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32c4d87738481c8df3d76e820a98f3dacfbc11edc26fab1dfe51b56d207168d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57cbb938bc4ae9b8a71a1e2369a50a243964fc8c683d2d1840f1f3e199f1b923\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7eeebbb46cf11d2306ad457106c3b2179039986bfdd412c4bb64791d86edb4e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://801ea1b9ed221d20f0d729436b8f5f1946df6e66f06aa86db5764f18da3f0b1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe65f4b55b8e8ed93d424276f1fc06f31770302538e5122a5b09da36734d86dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 05:43:48.722254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 05:43:48.730728 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3849141372/tls.crt::/tmp/serving-cert-3849141372/tls.key\\\\\\\"\\\\nI1206 05:43:54.083506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 05:43:54.085960 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 05:43:54.085979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 05:43:54.086001 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 05:43:54.086006 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 05:43:54.089093 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 05:43:54.089162 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089190 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089211 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 05:43:54.089229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 05:43:54.089245 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 05:43:54.089261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 05:43:54.089103 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 05:43:54.090706 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8edc1fd8220a58b6a3f6d08d6d003c6d350fa69588866d84de63f95ecd4367f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:00Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 05:44:00.758058 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-684r5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc59542d-ee4a-414d-b096-86716cb56db5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7128ab1b2f48b8ce3ecf3a2154cb1b1dc93a58cdfed2c11e7724201a5675ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbfjs\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-684r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:00Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 05:44:00.766078 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77ebef5bd728c37a6b74ab523c480048959280fdfc9afd8c60b2aca9cd05336d\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61a23652af66be599ba9357cb31709e7b4a3f0e4767c758617e6cc5cd9b43941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-g7qjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:00Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 05:44:00.779105 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"171aa174-9338-4421-8393-9e23fbab7f1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2gb79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:00Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 05:44:00.788687 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:00Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 05:44:00.843607 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 05:44:00.843643 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 05:44:00.843655 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 
05:44:00.843671 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 05:44:00.843683 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:00Z","lastTransitionTime":"2025-12-06T05:44:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 05:44:00.946512 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 05:44:00.946542 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 05:44:00.946562 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 05:44:00.946580 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:00 crc kubenswrapper[4733]: I1206 05:44:00.946594 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:00Z","lastTransitionTime":"2025-12-06T05:44:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.048850 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.048887 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.048897 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.048911 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.048923 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:01Z","lastTransitionTime":"2025-12-06T05:44:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.150791 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.150830 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.150839 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.150852 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.150861 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:01Z","lastTransitionTime":"2025-12-06T05:44:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.252746 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.252789 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.252800 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.252820 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.252831 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:01Z","lastTransitionTime":"2025-12-06T05:44:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.355266 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.355325 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.355339 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.355355 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.355366 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:01Z","lastTransitionTime":"2025-12-06T05:44:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.457330 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.457355 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.457365 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.457378 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.457388 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:01Z","lastTransitionTime":"2025-12-06T05:44:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.559169 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.559443 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.559453 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.559467 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.559477 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:01Z","lastTransitionTime":"2025-12-06T05:44:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.651333 4733 generic.go:334] "Generic (PLEG): container finished" podID="94d7ccbf-e88d-4045-8d89-633470de7aca" containerID="16e4034c91b0b19898468eccdc22e059ad7e830ef9e4ff0bea88d447f6a09c64" exitCode=0 Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.651338 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5mf9m" event={"ID":"94d7ccbf-e88d-4045-8d89-633470de7aca","Type":"ContainerDied","Data":"16e4034c91b0b19898468eccdc22e059ad7e830ef9e4ff0bea88d447f6a09c64"} Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.657508 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" event={"ID":"171aa174-9338-4421-8393-9e23fbab7f1e","Type":"ContainerStarted","Data":"345d07bba8fa5983992d4b09291a48c8b22337706229e2c7a3f4c15cad5d4591"} Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.657802 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.661426 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.661455 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.661464 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.661477 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.661487 4733 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:01Z","lastTransitionTime":"2025-12-06T05:44:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.665866 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0700e329-54b6-4cfe-b2de-5cee58cf1aa5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32c4d87738481c8df3d76e820a98f3dacfbc11edc26fab1dfe51b56d207168d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57cbb938bc4ae9b8a71a1e2369a50a243964fc8c683d2d1840f1f3e199f1b923\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7eeebbb46cf11d2306ad457106c3b2179039986bfdd412c4bb64791d86edb4e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://801ea1b9ed221d20f0d729436b8f5f1946df6e66f06aa86db5764f18da3f0b1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe65f4b55b8e8ed93d424276f1fc06f31770302538e5122a5b09da36734d86dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 05:43:48.722254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 05:43:48.730728 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3849141372/tls.crt::/tmp/serving-cert-3849141372/tls.key\\\\\\\"\\\\nI1206 05:43:54.083506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 05:43:54.085960 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 05:43:54.085979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 05:43:54.086001 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 05:43:54.086006 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 05:43:54.089093 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 05:43:54.089162 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089190 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089211 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 05:43:54.089229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 05:43:54.089245 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 05:43:54.089261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 05:43:54.089103 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 05:43:54.090706 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8edc1fd8220a58b6a3f6d08d6d003c6d350fa69588866d84de63f95ecd4367f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:01Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.675978 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:01Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.678073 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.687724 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:01Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.700821 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-684r5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc59542d-ee4a-414d-b096-86716cb56db5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7128ab1b2f48b8ce3ecf3a2154cb1b1dc93a58cdfed2c11e7724201a5675ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbfjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-684r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:01Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.709781 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77ebef5bd728c37a6b74ab523c480048959280fdfc9afd8c60b2aca9cd05336d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61a23652af66
be599ba9357cb31709e7b4a3f0e4767c758617e6cc5cd9b43941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g7qjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:01Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.724505 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"171aa174-9338-4421-8393-9e23fbab7f1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2gb79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:01Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.735336 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c145932d-56db-49da-ab40-1f9faeaa004e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a89503b511d9f2da9fb5e41e1adb5f5c60e14909aebd4495baafc709177fa56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd2bcad3ce23a8998a578ecc373a4e8028eefab1e056cf1081eb2406ff9398f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382d71a067b68d67891c063f0a4c833b7433e15db0e05b36e46f24bbbb1626ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b838411bb65919138a421cd17775561b7764a006894daa8f2bed711287c1914\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:01Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.745090 4733 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:01Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.758337 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5mf9m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d7ccbf-e88d-4045-8d89-633470de7aca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2609f7ad60b4f90d844d4f4d8573587826cbdf4c0b76f6b8a1b5cddec86ad7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2609f7ad60b4f90d844d4f4d8573587826cbdf4c0b76f6b8a1b5cddec86ad7d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ef7c618da4d94a4956f082f96b9be994042458ff524e9e1172f526a4135e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93ef7c618da4d94a4956f082f96b9be994042458ff524e9e1172f526a4135e1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c91a8199b1f8ede480f2bd92335fe3c8dc0d0e11caa2cf3bd213c234d0779f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c91a8199b1f8ede480f2bd92335fe3c8dc0d0e11caa2cf3bd213c234d0779f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://047dc4e7f8f30d1f9cf824ee4059c99c07cd9f29bd985e0e00ac22febb297f1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://047dc4e7f8f30d1f9cf824ee4059c99c07cd9f29bd985e0e00ac22febb297f1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cadff80e27f4e0103110c153c52936b931bfd70ca4363a3caa44ec4f746d01dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadff80e27f4e0103110c153c52936b931bfd70ca4363a3caa44ec4f746d01dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e4034c91b0b19898468eccdc22e059ad7e830ef9e4ff0bea88d447f6a09c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16e4034c91b0b19898468eccdc22e059ad7e830ef9e4ff0bea88d447f6a09c64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5mf9m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:01Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.763280 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.763334 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.763345 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.763429 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.763445 4733 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:01Z","lastTransitionTime":"2025-12-06T05:44:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.769190 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77d63bf154094eece4d04d42186bc7f957f0b1ab0315c496bb8a785269184ed0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrid
es\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdccf2a58baf2a39276908ed60c86219657d8780a50630c20be6f8bc4c256fbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:01Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.778435 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faa17b3f3dd91488b73e0e7f3101c5e9932dd0c1573946bbd91819f1ec51202e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T05:44:01Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.787042 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqsfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25abcf60-fe34-446b-9df8-1ed8e5102975\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://163c90ba7e6470fb31049cd650d1384d35d87b94a9193184bfe3ea16feddf307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb5ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqsfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:01Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.795816 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnxdh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d5c4ca7-33ee-4858-948f-631753eb056e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f4a50e7cb4197e088c193a3bedc8acb2720a885e588e56051fbfa1e102099e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrbr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnxdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:01Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.805617 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8e8d7d591deb47598776511be462724fabc5543e82b6a74edfc29fb01ccb977\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:01Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.814594 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:01Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.823470 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-684r5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc59542d-ee4a-414d-b096-86716cb56db5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7128ab1b2f48b8ce3ecf3a2154cb1b1dc93a58cdfed2c11e7724201a5675ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbfjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-684r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:01Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.831170 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77ebef5bd728c37a6b74ab523c480048959280fdfc9afd8c60b2aca9cd05336d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61a23652af66
be599ba9357cb31709e7b4a3f0e4767c758617e6cc5cd9b43941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g7qjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:01Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.844161 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"171aa174-9338-4421-8393-9e23fbab7f1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a697c5a28f2c415b6f133c1c3bdaff0915418e3fcf0c889af0a822e1bdcbcc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://532faf6ec4021a35746a236a1ded78eccc9d71728c149f73c4263068b6951490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://456b5bd863b30c044246c6c8fe15ee7344ad053861724b5c42b88479578b9adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77216800c2b9bc04724591a5d5c5d4c9ddb9a75fcbc198c60800199a92db6f45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d985f342be7dff38ee8a2264a8dae534857e6cb0e7d0cf79b137d2ed6289bf80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88a99335c4d7fca93428173f7e0e096e418e0599ab030dfda10d8da0a5dc17a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://345d07bba8fa5983992d4b09291a48c8b22337706229e2c7a3f4c15cad5d4591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9980ec9b2b1a751a691d1f657a2176d49a7583906d741adbe3754ec4c73b152c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2gb79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:01Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.853351 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c145932d-56db-49da-ab40-1f9faeaa004e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a89503b511d9f2da9fb5e41e1adb5f5c60e14909aebd4495baafc709177fa56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026
b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd2bcad3ce23a8998a578ecc373a4e8028eefab1e056cf1081eb2406ff9398f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382d71a067b68d67891c063f0a4c833b7433e15db0e05b36e46f24bbbb1626ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"
name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b838411bb65919138a421cd17775561b7764a006894daa8f2bed711287c1914\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:01Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.861752 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:01Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.865195 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.865232 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.865242 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.865262 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.865272 4733 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:01Z","lastTransitionTime":"2025-12-06T05:44:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.871710 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5mf9m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d7ccbf-e88d-4045-8d89-633470de7aca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2609f7ad60b4f90d844d4f4d8573587826cbdf4c0b76f6b8a1b5cddec86ad7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2609f7ad60b4f90d844d4f4d8573587826cbdf4c0b76f6b8a1b5cddec86ad7d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ef7c618da4d94a4956f082f96b9be994042458ff524e9e1172f526a4135e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93ef7c618da4d94a4956f082f96b9be994042458ff524e9e1172f526a4135e1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c91a8199b1f8ede480f2bd92335fe3c8dc0d0e11caa2cf3bd213c234d0779f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c91a8199b1f8ede480f2bd92335fe3c8dc0d0e11caa2cf3bd213c234d0779f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://047dc4e7f8f30d1f9cf824ee4059c99c07cd9f29bd985e0e00ac22febb297f1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://047dc4e7f8f30d1f9cf824ee4059c99c07cd9f29bd985e0e00ac22febb297f1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cadff80e27f4e0103110c153c52936b931bfd70ca4363a3caa44ec4f746d01dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadff80e27f4e0103110c153c52936b931bfd70ca4363a3caa44ec4f746d01dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e4034c91b0b19898468eccdc22e059ad7e830ef9e4ff0bea88d447f6a09c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16e4034c91b0b19898468eccdc22e059ad7e830ef9e4ff0bea88d447f6a09c64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5mf9m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:01Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.881418 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8e8d7d591deb47598776511be462724fabc5543e82b6a74edfc29fb01ccb977\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:01Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.890891 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77d63bf154094eece4d04d42186bc7f957f0b1ab0315c496bb8a785269184ed0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://cdccf2a58baf2a39276908ed60c86219657d8780a50630c20be6f8bc4c256fbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:01Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.899623 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faa17b3f3dd91488b73e0e7f3101c5e9932dd0c1573946bbd91819f1ec51202e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T05:44:01Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.906909 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqsfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25abcf60-fe34-446b-9df8-1ed8e5102975\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://163c90ba7e6470fb31049cd650d1384d35d87b94a9193184bfe3ea16feddf307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb5ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqsfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:01Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.913956 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnxdh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d5c4ca7-33ee-4858-948f-631753eb056e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f4a50e7cb4197e088c193a3bedc8acb2720a885e588e56051fbfa1e102099e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrbr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnxdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:01Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.923848 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0700e329-54b6-4cfe-b2de-5cee58cf1aa5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32c4d87738481c8df3d76e820a98f3dacfbc11edc26fab1dfe51b56d207168d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57cbb938bc4ae9b8a71a1e2369a50a243964fc8c683d2d1840f1f3e199f1b923\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7eeebbb46cf11d2306ad457106c3b2179039986bfdd412c4bb64791d86edb4e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://801ea1b9ed221d20f0d729436b8f5f1946df6e66f06aa86db5764f18da3f0b1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe65f4b55b8e8ed93d424276f1fc06f31770302538e5122a5b09da36734d86dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 05:43:48.722254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 05:43:48.730728 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3849141372/tls.crt::/tmp/serving-cert-3849141372/tls.key\\\\\\\"\\\\nI1206 05:43:54.083506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 05:43:54.085960 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 05:43:54.085979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 05:43:54.086001 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 05:43:54.086006 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 05:43:54.089093 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 05:43:54.089162 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089190 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089211 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 05:43:54.089229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 05:43:54.089245 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 05:43:54.089261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 05:43:54.089103 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 05:43:54.090706 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8edc1fd8220a58b6a3f6d08d6d003c6d350fa69588866d84de63f95ecd4367f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:01Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.931525 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:01Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.967078 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.967112 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.967124 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:01 crc 
kubenswrapper[4733]: I1206 05:44:01.967138 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:01 crc kubenswrapper[4733]: I1206 05:44:01.967147 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:01Z","lastTransitionTime":"2025-12-06T05:44:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.069083 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.069116 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.069130 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.069146 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.069156 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:02Z","lastTransitionTime":"2025-12-06T05:44:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.166682 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.166742 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:44:02 crc kubenswrapper[4733]: E1206 05:44:02.166787 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 05:44:10.166768584 +0000 UTC m=+34.031979695 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:44:02 crc kubenswrapper[4733]: E1206 05:44:02.166846 4733 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 05:44:02 crc kubenswrapper[4733]: E1206 05:44:02.166925 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 05:44:10.166908537 +0000 UTC m=+34.032119658 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.166865 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:44:02 crc kubenswrapper[4733]: E1206 05:44:02.166948 4733 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 05:44:02 crc kubenswrapper[4733]: E1206 05:44:02.166984 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 05:44:10.166977595 +0000 UTC m=+34.032188706 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.171189 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.171224 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.171236 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.171251 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.171262 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:02Z","lastTransitionTime":"2025-12-06T05:44:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.267777 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.267815 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 05:44:02 crc kubenswrapper[4733]: E1206 05:44:02.267909 4733 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 05:44:02 crc kubenswrapper[4733]: E1206 05:44:02.267922 4733 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 05:44:02 crc kubenswrapper[4733]: E1206 05:44:02.267933 4733 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 05:44:02 crc kubenswrapper[4733]: E1206 05:44:02.267969 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl 
podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-06 05:44:10.267961387 +0000 UTC m=+34.133172498 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 05:44:02 crc kubenswrapper[4733]: E1206 05:44:02.268036 4733 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 05:44:02 crc kubenswrapper[4733]: E1206 05:44:02.268071 4733 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 05:44:02 crc kubenswrapper[4733]: E1206 05:44:02.268088 4733 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 05:44:02 crc kubenswrapper[4733]: E1206 05:44:02.268147 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-06 05:44:10.268129772 +0000 UTC m=+34.133340893 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.272922 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.272955 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.272965 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.272982 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.272993 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:02Z","lastTransitionTime":"2025-12-06T05:44:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.374976 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.375010 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.375020 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.375034 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.375043 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:02Z","lastTransitionTime":"2025-12-06T05:44:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.476670 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.476703 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.476712 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.476725 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.476739 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:02Z","lastTransitionTime":"2025-12-06T05:44:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.484204 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.484224 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.484230 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 05:44:02 crc kubenswrapper[4733]: E1206 05:44:02.484336 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 05:44:02 crc kubenswrapper[4733]: E1206 05:44:02.484401 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 05:44:02 crc kubenswrapper[4733]: E1206 05:44:02.484476 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.578429 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.578461 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.578470 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.578483 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.578495 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:02Z","lastTransitionTime":"2025-12-06T05:44:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.664068 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5mf9m" event={"ID":"94d7ccbf-e88d-4045-8d89-633470de7aca","Type":"ContainerStarted","Data":"dffbae27a10ae2e00933637da0e30fc5b8574f2ee8edb5b4b09c37a2d05e980a"} Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.664116 4733 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.667619 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.677910 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77d63bf154094eece4d04d42186bc7f957f0b1ab0315c496bb8a785269184ed0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdccf2a58baf2a39276908ed60c86219657d8780a50630c20be6f8bc4c256fbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:02Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:02 
crc kubenswrapper[4733]: I1206 05:44:02.680188 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.680219 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.680228 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.680243 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.680253 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:02Z","lastTransitionTime":"2025-12-06T05:44:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.683906 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.686710 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faa17b3f3dd91488b73e0e7f3101c5e9932dd0c1573946bbd91819f1ec51202e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.i
o/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:02Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.694807 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqsfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25abcf60-fe34-446b-9df8-1ed8e5102975\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://163c90ba7e6470fb31049cd650d1384d35d87b94a9193184bfe3ea16feddf307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb5ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqsfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:02Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.703052 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnxdh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d5c4ca7-33ee-4858-948f-631753eb056e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f4a50e7cb4197e088c193a3bedc8acb2720a885e588e56051fbfa1e102099e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrbr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnxdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:02Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.712057 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8e8d7d591deb47598776511be462724fabc5543e82b6a74edfc29fb01ccb977\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:02Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.721123 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0700e329-54b6-4cfe-b2de-5cee58cf1aa5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32c4d87738481c8df3d76e820a98f3dacfbc11edc26fab1dfe51b56d207168d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57cbb938bc4ae9b8a71a1e2369a50a243964fc8c683d2d1840f1f3e199f1b923\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7eeebbb46cf11d2306ad457106c3b2179039986bfdd412c4bb64791d86edb4e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://801ea1b9ed221d20f0d729436b8f5f1946df6e66f06aa86db5764f18da3f0b1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe65f4b55b8e8ed93d424276f1fc06f31770302538e5122a5b09da36734d86dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 05:43:48.722254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 05:43:48.730728 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3849141372/tls.crt::/tmp/serving-cert-3849141372/tls.key\\\\\\\"\\\\nI1206 05:43:54.083506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 05:43:54.085960 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 05:43:54.085979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 05:43:54.086001 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 05:43:54.086006 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 05:43:54.089093 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 05:43:54.089162 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089190 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089211 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 05:43:54.089229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 05:43:54.089245 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 05:43:54.089261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 05:43:54.089103 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 05:43:54.090706 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8edc1fd8220a58b6a3f6d08d6d003c6d350fa69588866d84de63f95ecd4367f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:02Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.729662 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:02Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.738116 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:02Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.746648 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-684r5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc59542d-ee4a-414d-b096-86716cb56db5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7128ab1b2f48b8ce3ecf3a2154cb1b1dc93a58cdfed2c11e7724201a5675ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbfjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-684r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:02Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.753926 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77ebef5bd728c37a6b74ab523c480048959280fdfc9afd8c60b2aca9cd05336d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61a23652af66
be599ba9357cb31709e7b4a3f0e4767c758617e6cc5cd9b43941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g7qjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:02Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.767142 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"171aa174-9338-4421-8393-9e23fbab7f1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a697c5a28f2c415b6f133c1c3bdaff0915418e3fcf0c889af0a822e1bdcbcc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://532faf6ec4021a35746a236a1ded78eccc9d71728c149f73c4263068b6951490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://456b5bd863b30c044246c6c8fe15ee7344ad053861724b5c42b88479578b9adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77216800c2b9bc04724591a5d5c5d4c9ddb9a75fcbc198c60800199a92db6f45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d985f342be7dff38ee8a2264a8dae534857e6cb0e7d0cf79b137d2ed6289bf80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88a99335c4d7fca93428173f7e0e096e418e0599ab030dfda10d8da0a5dc17a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://345d07bba8fa5983992d4b09291a48c8b22337706229e2c7a3f4c15cad5d4591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9980ec9b2b1a751a691d1f657a2176d49a7583906d741adbe3754ec4c73b152c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2gb79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:02Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.776414 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c145932d-56db-49da-ab40-1f9faeaa004e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a89503b511d9f2da9fb5e41e1adb5f5c60e14909aebd4495baafc709177fa56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026
b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd2bcad3ce23a8998a578ecc373a4e8028eefab1e056cf1081eb2406ff9398f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382d71a067b68d67891c063f0a4c833b7433e15db0e05b36e46f24bbbb1626ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"
name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b838411bb65919138a421cd17775561b7764a006894daa8f2bed711287c1914\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:02Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.782781 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.782812 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.782823 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.782837 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.782849 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:02Z","lastTransitionTime":"2025-12-06T05:44:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.786952 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:02Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.797703 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5mf9m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d7ccbf-e88d-4045-8d89-633470de7aca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dffbae27a10ae2e00933637da0e30fc5b8574f2ee8edb5b4b09c37a2d05e980a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2609f7ad60b4f90d844d4f4d8573587826cbdf4c0b76f6b8a1b5cddec86ad7d0\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2609f7ad60b4f90d844d4f4d8573587826cbdf4c0b76f6b8a1b5cddec86ad7d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ef7c618da4d94a4956f082f96b9be994042458ff524e9e1172f526a4135e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93ef7c618da4d94a4956f082f96b9be994042458ff524e9e1172f526a4135e1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c91a8199b1f8ede480f2bd92335fe3c8dc0d0e11caa2cf3bd213c234d0779f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c91a8199b1f8ede480f2bd92335fe3c8dc0d0e11caa2cf3bd213c234d0779f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://047dc
4e7f8f30d1f9cf824ee4059c99c07cd9f29bd985e0e00ac22febb297f1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://047dc4e7f8f30d1f9cf824ee4059c99c07cd9f29bd985e0e00ac22febb297f1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cadff80e27f4e0103110c153c52936b931bfd70ca4363a3caa44ec4f746d01dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadff80e27f4e0103110c153c52936b931bfd70ca4363a3caa44ec4f746d01dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:59Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e4034c91b0b19898468eccdc22e059ad7e830ef9e4ff0bea88d447f6a09c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16e4034c91b0b19898468eccdc22e059ad7e830ef9e4ff0bea88d447f6a09c64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5mf9m\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:02Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.806630 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77d63bf154094eece4d04d42186bc7f957f0b1ab0315c496bb8a785269184ed0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdccf2a58baf2a39276908ed60c86219657d8780a50630c20be6f8bc4c256fbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:02Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.816454 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faa17b3f3dd91488b73e0e7f3101c5e9932dd0c1573946bbd91819f1ec51202e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T05:44:02Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.824798 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqsfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25abcf60-fe34-446b-9df8-1ed8e5102975\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://163c90ba7e6470fb31049cd650d1384d35d87b94a9193184bfe3ea16feddf307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb5ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqsfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:02Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.834641 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnxdh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d5c4ca7-33ee-4858-948f-631753eb056e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f4a50e7cb4197e088c193a3bedc8acb2720a885e588e56051fbfa1e102099e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrbr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnxdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:02Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.849866 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8e8d7d591deb47598776511be462724fabc5543e82b6a74edfc29fb01ccb977\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:02Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.865117 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0700e329-54b6-4cfe-b2de-5cee58cf1aa5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32c4d87738481c8df3d76e820a98f3dacfbc11edc26fab1dfe51b56d207168d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57cbb938bc4ae9b8a71a1e2369a50a243964fc8c683d2d1840f1f3e199f1b923\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7eeebbb46cf11d2306ad457106c3b2179039986bfdd412c4bb64791d86edb4e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://801ea1b9ed221d20f0d729436b8f5f1946df6e66f06aa86db5764f18da3f0b1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe65f4b55b8e8ed93d424276f1fc06f31770302538e5122a5b09da36734d86dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 05:43:48.722254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 05:43:48.730728 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3849141372/tls.crt::/tmp/serving-cert-3849141372/tls.key\\\\\\\"\\\\nI1206 05:43:54.083506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 05:43:54.085960 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 05:43:54.085979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 05:43:54.086001 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 05:43:54.086006 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 05:43:54.089093 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 05:43:54.089162 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089190 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089211 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 05:43:54.089229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 05:43:54.089245 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 05:43:54.089261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 05:43:54.089103 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 05:43:54.090706 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8edc1fd8220a58b6a3f6d08d6d003c6d350fa69588866d84de63f95ecd4367f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:02Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.875129 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:02Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.884580 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:02Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.885010 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.885040 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.885050 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.885070 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.885082 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:02Z","lastTransitionTime":"2025-12-06T05:44:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.895079 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-684r5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc59542d-ee4a-414d-b096-86716cb56db5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7128ab1b2f48b8ce3ecf3a2154cb1b1dc93a58cdfed2c11e7724201a5675ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbfjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-684r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:02Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.902997 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77ebef5bd728c37a6b74ab523c480048959280fdfc9afd8c60b2aca9cd05336d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"}
,{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61a23652af66be599ba9357cb31709e7b4a3f0e4767c758617e6cc5cd9b43941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g7qjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:02Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.915526 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"171aa174-9338-4421-8393-9e23fbab7f1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a697c5a28f2c415b6f133c1c3bdaff0915418e3fcf0c889af0a822e1bdcbcc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://532faf6ec4021a35746a236a1ded78eccc9d71728c149f73c4263068b6951490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://456b5bd863b30c044246c6c8fe15ee7344ad053861724b5c42b88479578b9adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77216800c2b9bc04724591a5d5c5d4c9ddb9a75fcbc198c60800199a92db6f45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d985f342be7dff38ee8a2264a8dae534857e6cb0e7d0cf79b137d2ed6289bf80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88a99335c4d7fca93428173f7e0e096e418e0599ab030dfda10d8da0a5dc17a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://345d07bba8fa5983992d4b09291a48c8b22337706229e2c7a3f4c15cad5d4591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9980ec9b2b1a751a691d1f657a2176d49a7583906d741adbe3754ec4c73b152c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2gb79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:02Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.924477 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c145932d-56db-49da-ab40-1f9faeaa004e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a89503b511d9f2da9fb5e41e1adb5f5c60e14909aebd4495baafc709177fa56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026
b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd2bcad3ce23a8998a578ecc373a4e8028eefab1e056cf1081eb2406ff9398f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382d71a067b68d67891c063f0a4c833b7433e15db0e05b36e46f24bbbb1626ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"
name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b838411bb65919138a421cd17775561b7764a006894daa8f2bed711287c1914\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:02Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.932876 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:02Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.943098 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5mf9m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d7ccbf-e88d-4045-8d89-633470de7aca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dffbae27a10ae2e00933637da0e30fc5b8574f2ee8edb5b4b09c37a2d05e980a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2609f7ad60b4f90d844d4f4d8573587826cbdf4c0b76f6b8a1b5cddec86ad7d0\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2609f7ad60b4f90d844d4f4d8573587826cbdf4c0b76f6b8a1b5cddec86ad7d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ef7c618da4d94a4956f082f96b9be994042458ff524e9e1172f526a4135e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93ef7c618da4d94a4956f082f96b9be994042458ff524e9e1172f526a4135e1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c91a8199b1f8ede480f2bd92335fe3c8dc0d0e11caa2cf3bd213c234d0779f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c91a8199b1f8ede480f2bd92335fe3c8dc0d0e11caa2cf3bd213c234d0779f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://047dc
4e7f8f30d1f9cf824ee4059c99c07cd9f29bd985e0e00ac22febb297f1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://047dc4e7f8f30d1f9cf824ee4059c99c07cd9f29bd985e0e00ac22febb297f1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cadff80e27f4e0103110c153c52936b931bfd70ca4363a3caa44ec4f746d01dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadff80e27f4e0103110c153c52936b931bfd70ca4363a3caa44ec4f746d01dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:59Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e4034c91b0b19898468eccdc22e059ad7e830ef9e4ff0bea88d447f6a09c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16e4034c91b0b19898468eccdc22e059ad7e830ef9e4ff0bea88d447f6a09c64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5mf9m\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:02Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.986798 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.986846 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.986856 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.986868 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:02 crc kubenswrapper[4733]: I1206 05:44:02.986884 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:02Z","lastTransitionTime":"2025-12-06T05:44:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:03 crc kubenswrapper[4733]: I1206 05:44:03.088458 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:03 crc kubenswrapper[4733]: I1206 05:44:03.088493 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:03 crc kubenswrapper[4733]: I1206 05:44:03.088503 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:03 crc kubenswrapper[4733]: I1206 05:44:03.088515 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:03 crc kubenswrapper[4733]: I1206 05:44:03.088524 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:03Z","lastTransitionTime":"2025-12-06T05:44:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:03 crc kubenswrapper[4733]: I1206 05:44:03.191279 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:03 crc kubenswrapper[4733]: I1206 05:44:03.191317 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:03 crc kubenswrapper[4733]: I1206 05:44:03.191328 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:03 crc kubenswrapper[4733]: I1206 05:44:03.191340 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:03 crc kubenswrapper[4733]: I1206 05:44:03.191349 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:03Z","lastTransitionTime":"2025-12-06T05:44:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:03 crc kubenswrapper[4733]: I1206 05:44:03.294182 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:03 crc kubenswrapper[4733]: I1206 05:44:03.294210 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:03 crc kubenswrapper[4733]: I1206 05:44:03.294219 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:03 crc kubenswrapper[4733]: I1206 05:44:03.294235 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:03 crc kubenswrapper[4733]: I1206 05:44:03.294245 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:03Z","lastTransitionTime":"2025-12-06T05:44:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:03 crc kubenswrapper[4733]: I1206 05:44:03.396349 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:03 crc kubenswrapper[4733]: I1206 05:44:03.396380 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:03 crc kubenswrapper[4733]: I1206 05:44:03.396389 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:03 crc kubenswrapper[4733]: I1206 05:44:03.396403 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:03 crc kubenswrapper[4733]: I1206 05:44:03.396415 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:03Z","lastTransitionTime":"2025-12-06T05:44:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:03 crc kubenswrapper[4733]: I1206 05:44:03.498911 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:03 crc kubenswrapper[4733]: I1206 05:44:03.498951 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:03 crc kubenswrapper[4733]: I1206 05:44:03.498961 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:03 crc kubenswrapper[4733]: I1206 05:44:03.498976 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:03 crc kubenswrapper[4733]: I1206 05:44:03.498987 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:03Z","lastTransitionTime":"2025-12-06T05:44:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:03 crc kubenswrapper[4733]: I1206 05:44:03.601408 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:03 crc kubenswrapper[4733]: I1206 05:44:03.601461 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:03 crc kubenswrapper[4733]: I1206 05:44:03.601471 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:03 crc kubenswrapper[4733]: I1206 05:44:03.601486 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:03 crc kubenswrapper[4733]: I1206 05:44:03.601496 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:03Z","lastTransitionTime":"2025-12-06T05:44:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:03 crc kubenswrapper[4733]: I1206 05:44:03.667998 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2gb79_171aa174-9338-4421-8393-9e23fbab7f1e/ovnkube-controller/0.log" Dec 06 05:44:03 crc kubenswrapper[4733]: I1206 05:44:03.670267 4733 generic.go:334] "Generic (PLEG): container finished" podID="171aa174-9338-4421-8393-9e23fbab7f1e" containerID="345d07bba8fa5983992d4b09291a48c8b22337706229e2c7a3f4c15cad5d4591" exitCode=1 Dec 06 05:44:03 crc kubenswrapper[4733]: I1206 05:44:03.670350 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" event={"ID":"171aa174-9338-4421-8393-9e23fbab7f1e","Type":"ContainerDied","Data":"345d07bba8fa5983992d4b09291a48c8b22337706229e2c7a3f4c15cad5d4591"} Dec 06 05:44:03 crc kubenswrapper[4733]: I1206 05:44:03.670846 4733 scope.go:117] "RemoveContainer" containerID="345d07bba8fa5983992d4b09291a48c8b22337706229e2c7a3f4c15cad5d4591" Dec 06 05:44:03 crc kubenswrapper[4733]: I1206 05:44:03.687161 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"171aa174-9338-4421-8393-9e23fbab7f1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with 
unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a697c5a28f2c415b6f133c1c3bdaff0915418e3fcf0c889af0a822e1bdcbcc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://532faf6ec4021a35746a236a1ded78eccc9d71728c149f73c4263068b6951490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://456b5bd863b30c044246c6c8fe15ee7344ad053861724b5c42b88479578b9adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77216800c2b9bc04724591a5d5c5d4c9ddb9a75fcbc198c60800199a92db6f45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd
47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d985f342be7dff38ee8a2264a8dae534857e6cb0e7d0cf79b137d2ed6289bf80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88a99335c4d7fca93428173f7e0e096e418e0599ab030dfda10d8da0a5dc17a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://345d07bba8fa5983992d4b09291a48c8b22337706229e2c7a3f4c15cad5d4591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://345d07bba8fa5983992d4b09291a48c8b22337706229e2c7a3f4c15cad5d4591\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\
\\"2025-12-06T05:44:03Z\\\",\\\"message\\\":\\\"s/externalversions/factory.go:140\\\\nI1206 05:44:03.304430 6054 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1206 05:44:03.304508 6054 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 05:44:03.304756 6054 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1206 05:44:03.304782 6054 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1206 05:44:03.304790 6054 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1206 05:44:03.304824 6054 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1206 05:44:03.304828 6054 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1206 05:44:03.304849 6054 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1206 05:44:03.304850 6054 factory.go:656] Stopping watch factory\\\\nI1206 05:44:03.304860 6054 handler.go:208] Removed *v1.Node event handler 7\\\\nI1206 05:44:03.304865 6054 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1206 05:44:03.304870 6054 handler.go:208] Removed *v1.Node 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9980ec9b2b1a751a691d1f657a2176d49a7583906d741adbe3754ec4c73b152c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2gb79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:03Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:03 crc kubenswrapper[4733]: I1206 05:44:03.697233 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:03Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:03 crc kubenswrapper[4733]: I1206 05:44:03.703287 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:03 crc kubenswrapper[4733]: I1206 05:44:03.703339 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:03 crc kubenswrapper[4733]: I1206 05:44:03.703350 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:03 crc kubenswrapper[4733]: I1206 
05:44:03.703366 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:03 crc kubenswrapper[4733]: I1206 05:44:03.703375 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:03Z","lastTransitionTime":"2025-12-06T05:44:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:03 crc kubenswrapper[4733]: I1206 05:44:03.707593 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-684r5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc59542d-ee4a-414d-b096-86716cb56db5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7128ab1b2f48b8ce3ecf3a2154cb1b1dc93a58cdfed2c11e7724201a5675ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afb
a93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbfjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\
":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-684r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:03Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:03 crc kubenswrapper[4733]: I1206 05:44:03.717239 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77ebef5bd728c37a6b74ab523c480048959280fdfc9afd8c60b2aca9cd05336d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bde
af3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61a23652af66be599ba9357cb31709e7b4a3f0e4767c758617e6cc5cd9b43941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g7qjx\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:03Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:03 crc kubenswrapper[4733]: I1206 05:44:03.727038 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c145932d-56db-49da-ab40-1f9faeaa004e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a89503b511d9f2da9fb5e41e1adb5f5c60e14909aebd4495baafc709177fa56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd2bcad3ce23a8998a578ecc373a4e8028eefab1e056cf1081eb2406ff9398f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382d71a067b68d67891c063f0a4c833b7433e15db0e05b36e46f24bbbb1626ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b838411bb65919138a421cd17775561b7764a006894daa8f2bed711287c1914\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:03Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:03 crc kubenswrapper[4733]: I1206 05:44:03.736919 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:03Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:03 crc kubenswrapper[4733]: I1206 05:44:03.749548 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5mf9m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d7ccbf-e88d-4045-8d89-633470de7aca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dffbae27a10ae2e00933637da0e30fc5b8574f2ee8edb5b4b09c37a2d05e980a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2609f7ad60b4f90d844d4f4d8573587826cbdf4c0b76f6b8a1b5cddec86ad7d0\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2609f7ad60b4f90d844d4f4d8573587826cbdf4c0b76f6b8a1b5cddec86ad7d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ef7c618da4d94a4956f082f96b9be994042458ff524e9e1172f526a4135e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93ef7c618da4d94a4956f082f96b9be994042458ff524e9e1172f526a4135e1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c91a8199b1f8ede480f2bd92335fe3c8dc0d0e11caa2cf3bd213c234d0779f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c91a8199b1f8ede480f2bd92335fe3c8dc0d0e11caa2cf3bd213c234d0779f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://047dc
4e7f8f30d1f9cf824ee4059c99c07cd9f29bd985e0e00ac22febb297f1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://047dc4e7f8f30d1f9cf824ee4059c99c07cd9f29bd985e0e00ac22febb297f1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cadff80e27f4e0103110c153c52936b931bfd70ca4363a3caa44ec4f746d01dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadff80e27f4e0103110c153c52936b931bfd70ca4363a3caa44ec4f746d01dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:59Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e4034c91b0b19898468eccdc22e059ad7e830ef9e4ff0bea88d447f6a09c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16e4034c91b0b19898468eccdc22e059ad7e830ef9e4ff0bea88d447f6a09c64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5mf9m\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:03Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:03 crc kubenswrapper[4733]: I1206 05:44:03.759025 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8e8d7d591deb47598776511be462724fabc5543e82b6a74edfc29fb01ccb977\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:03Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:03 crc kubenswrapper[4733]: I1206 05:44:03.768820 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77d63bf154094eece4d04d42186bc7f957f0b1ab0315c496bb8a785269184ed0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"s
tarted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdccf2a58baf2a39276908ed60c86219657d8780a50630c20be6f8bc4c256fbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:03Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:03 crc kubenswrapper[4733]: I1206 05:44:03.777445 4733 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faa17b3f3dd91488b73e0e7f3101c5e9932dd0c1573946bbd91819f1ec51202e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:03Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:03 crc kubenswrapper[4733]: I1206 05:44:03.786295 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqsfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25abcf60-fe34-446b-9df8-1ed8e5102975\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://163c90ba7e6470fb31049cd650d1384d35d87b94a9193184bfe3ea16feddf307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tm
p/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb5ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqsfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:03Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:03 crc kubenswrapper[4733]: I1206 05:44:03.794437 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnxdh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d5c4ca7-33ee-4858-948f-631753eb056e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f4a50e7cb4197e088c193a3bedc8acb2720a885e588e56051fbfa1e102099e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrbr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnxdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:03Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:03 crc kubenswrapper[4733]: I1206 05:44:03.804019 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0700e329-54b6-4cfe-b2de-5cee58cf1aa5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32c4d87738481c8df3d76e820a98f3dacfbc11edc26fab1dfe51b56d207168d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57cbb938bc4ae9b8a71a1e2369a50a243964fc8c683d2d1840f1f3e199f1b923\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7eeebbb46cf11d2306ad457106c3b2179039986bfdd412c4bb64791d86edb4e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://801ea1b9ed221d20f0d729436b8f5f1946df6e66f06aa86db5764f18da3f0b1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe65f4b55b8e8ed93d424276f1fc06f31770302538e5122a5b09da36734d86dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 05:43:48.722254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 05:43:48.730728 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3849141372/tls.crt::/tmp/serving-cert-3849141372/tls.key\\\\\\\"\\\\nI1206 05:43:54.083506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 05:43:54.085960 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 05:43:54.085979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 05:43:54.086001 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 05:43:54.086006 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 05:43:54.089093 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 05:43:54.089162 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089190 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089211 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 05:43:54.089229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 05:43:54.089245 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 05:43:54.089261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 05:43:54.089103 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 05:43:54.090706 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8edc1fd8220a58b6a3f6d08d6d003c6d350fa69588866d84de63f95ecd4367f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:03Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:03 crc kubenswrapper[4733]: I1206 05:44:03.804887 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:03 crc kubenswrapper[4733]: I1206 05:44:03.804919 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:03 crc kubenswrapper[4733]: I1206 05:44:03.804932 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:03 crc kubenswrapper[4733]: I1206 05:44:03.804949 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:03 crc kubenswrapper[4733]: I1206 05:44:03.804961 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:03Z","lastTransitionTime":"2025-12-06T05:44:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:03 crc kubenswrapper[4733]: I1206 05:44:03.812736 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:03Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:03 crc kubenswrapper[4733]: I1206 05:44:03.907122 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:03 crc kubenswrapper[4733]: I1206 05:44:03.907159 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:03 crc kubenswrapper[4733]: I1206 05:44:03.907171 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:03 crc kubenswrapper[4733]: I1206 05:44:03.907186 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:03 crc kubenswrapper[4733]: I1206 05:44:03.907195 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:03Z","lastTransitionTime":"2025-12-06T05:44:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.009405 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.009456 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.009468 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.009484 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.009494 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:04Z","lastTransitionTime":"2025-12-06T05:44:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.112108 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.112148 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.112158 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.112174 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.112189 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:04Z","lastTransitionTime":"2025-12-06T05:44:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.214065 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.214106 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.214115 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.214132 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.214142 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:04Z","lastTransitionTime":"2025-12-06T05:44:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.315956 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.316229 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.316239 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.316257 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.316268 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:04Z","lastTransitionTime":"2025-12-06T05:44:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.418147 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.418178 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.418186 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.418198 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.418208 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:04Z","lastTransitionTime":"2025-12-06T05:44:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.484345 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:44:04 crc kubenswrapper[4733]: E1206 05:44:04.484458 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.484499 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.484531 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 05:44:04 crc kubenswrapper[4733]: E1206 05:44:04.484653 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 05:44:04 crc kubenswrapper[4733]: E1206 05:44:04.484765 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.520145 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.520181 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.520191 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.520205 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.520217 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:04Z","lastTransitionTime":"2025-12-06T05:44:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.622068 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.622094 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.622104 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.622114 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.622123 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:04Z","lastTransitionTime":"2025-12-06T05:44:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.673662 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2gb79_171aa174-9338-4421-8393-9e23fbab7f1e/ovnkube-controller/1.log" Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.674121 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2gb79_171aa174-9338-4421-8393-9e23fbab7f1e/ovnkube-controller/0.log" Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.676152 4733 generic.go:334] "Generic (PLEG): container finished" podID="171aa174-9338-4421-8393-9e23fbab7f1e" containerID="15c143f8a71272e7b8696f98c23e481c8827d6ece0e03e51d87aacd14c888dd3" exitCode=1 Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.676186 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" event={"ID":"171aa174-9338-4421-8393-9e23fbab7f1e","Type":"ContainerDied","Data":"15c143f8a71272e7b8696f98c23e481c8827d6ece0e03e51d87aacd14c888dd3"} Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.676237 4733 scope.go:117] "RemoveContainer" containerID="345d07bba8fa5983992d4b09291a48c8b22337706229e2c7a3f4c15cad5d4591" Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.677481 4733 scope.go:117] "RemoveContainer" containerID="15c143f8a71272e7b8696f98c23e481c8827d6ece0e03e51d87aacd14c888dd3" Dec 06 05:44:04 crc kubenswrapper[4733]: E1206 05:44:04.680940 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-2gb79_openshift-ovn-kubernetes(171aa174-9338-4421-8393-9e23fbab7f1e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" podUID="171aa174-9338-4421-8393-9e23fbab7f1e" Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.691292 4733 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c145932d-56db-49da-ab40-1f9faeaa004e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a89503b511d9f2da9fb5e41e1adb5f5c60e14909aebd4495baafc709177fa56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd2bcad3ce23a8998a578ecc373a4e8028eefab1e056cf1081eb2406ff9398f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1
220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382d71a067b68d67891c063f0a4c833b7433e15db0e05b36e46f24bbbb1626ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b838411bb65919138a421cd17775561b7764a006894daa8f2bed711287c1914\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controlle
r\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:04Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.700769 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:04Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.711165 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5mf9m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d7ccbf-e88d-4045-8d89-633470de7aca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dffbae27a10ae2e00933637da0e30fc5b8574f2ee8edb5b4b09c37a2d05e980a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2609f7ad60b4f90d844d4f4d8573587826cbdf4c0b76f6b8a1b5cddec86ad7d0\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2609f7ad60b4f90d844d4f4d8573587826cbdf4c0b76f6b8a1b5cddec86ad7d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ef7c618da4d94a4956f082f96b9be994042458ff524e9e1172f526a4135e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93ef7c618da4d94a4956f082f96b9be994042458ff524e9e1172f526a4135e1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c91a8199b1f8ede480f2bd92335fe3c8dc0d0e11caa2cf3bd213c234d0779f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c91a8199b1f8ede480f2bd92335fe3c8dc0d0e11caa2cf3bd213c234d0779f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://047dc
4e7f8f30d1f9cf824ee4059c99c07cd9f29bd985e0e00ac22febb297f1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://047dc4e7f8f30d1f9cf824ee4059c99c07cd9f29bd985e0e00ac22febb297f1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cadff80e27f4e0103110c153c52936b931bfd70ca4363a3caa44ec4f746d01dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadff80e27f4e0103110c153c52936b931bfd70ca4363a3caa44ec4f746d01dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:59Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e4034c91b0b19898468eccdc22e059ad7e830ef9e4ff0bea88d447f6a09c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16e4034c91b0b19898468eccdc22e059ad7e830ef9e4ff0bea88d447f6a09c64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5mf9m\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:04Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.718448 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnxdh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d5c4ca7-33ee-4858-948f-631753eb056e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f4a50e7cb4197e088c193a3bedc8acb2720a885e588e56051fbfa1e102099e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrbr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnxdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:04Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.724030 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.724059 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.724069 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.724080 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.724100 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:04Z","lastTransitionTime":"2025-12-06T05:44:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.727588 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8e8d7d591deb47598776511be462724fabc5543e82b6a74edfc29fb01ccb977\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:04Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.736362 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77d63bf154094eece4d04d42186bc7f957f0b1ab0315c496bb8a785269184ed0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdccf2a58baf2a39276908ed60c86219657d8780a50630c20be6f8bc4c256fbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:04Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.744692 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faa17b3f3dd91488b73e0e7f3101c5e9932dd0c1573946bbd91819f1ec51202e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T05:44:04Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.751636 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqsfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25abcf60-fe34-446b-9df8-1ed8e5102975\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://163c90ba7e6470fb31049cd650d1384d35d87b94a9193184bfe3ea16feddf307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb5ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqsfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:04Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.760584 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0700e329-54b6-4cfe-b2de-5cee58cf1aa5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32c4d87738481c8df3d76e820a98f3dacfbc11edc26fab1dfe51b56d207168d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57cbb938bc4ae9b8a71a1e2369a50a243964fc8c683d2d1840f1f3e199f1b923\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7eeebbb46cf11d2306ad457106c3b2179039986bfdd412c4bb64791d86edb4e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://801ea1b9ed221d20f0d729436b8f5f1946df6e66f06aa86db5764f18da3f0b1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe65f4b55b8e8ed93d424276f1fc06f31770302538e5122a5b09da36734d86dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 05:43:48.722254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 05:43:48.730728 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3849141372/tls.crt::/tmp/serving-cert-3849141372/tls.key\\\\\\\"\\\\nI1206 05:43:54.083506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 05:43:54.085960 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 05:43:54.085979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 05:43:54.086001 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 05:43:54.086006 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 05:43:54.089093 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 05:43:54.089162 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089190 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089211 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 05:43:54.089229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 05:43:54.089245 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 05:43:54.089261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 05:43:54.089103 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 05:43:54.090706 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8edc1fd8220a58b6a3f6d08d6d003c6d350fa69588866d84de63f95ecd4367f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:04Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.769079 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:04Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.778486 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:04Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.787612 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-684r5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc59542d-ee4a-414d-b096-86716cb56db5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7128ab1b2f48b8ce3ecf3a2154cb1b1dc93a58cdfed2c11e7724201a5675ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbfjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-684r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:04Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.795217 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77ebef5bd728c37a6b74ab523c480048959280fdfc9afd8c60b2aca9cd05336d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61a23652af66
be599ba9357cb31709e7b4a3f0e4767c758617e6cc5cd9b43941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g7qjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:04Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.808107 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"171aa174-9338-4421-8393-9e23fbab7f1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a697c5a28f2c415b6f133c1c3bdaff0915418e3fcf0c889af0a822e1bdcbcc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://532faf6ec4021a35746a236a1ded78eccc9d71728c149f73c4263068b6951490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://456b5bd863b30c044246c6c8fe15ee7344ad053861724b5c42b88479578b9adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77216800c2b9bc04724591a5d5c5d4c9ddb9a75fcbc198c60800199a92db6f45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d985f342be7dff38ee8a2264a8dae534857e6cb0e7d0cf79b137d2ed6289bf80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88a99335c4d7fca93428173f7e0e096e418e0599ab030dfda10d8da0a5dc17a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15c143f8a71272e7b8696f98c23e481c8827d6ece0e03e51d87aacd14c888dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://345d07bba8fa5983992d4b09291a48c8b22337706229e2c7a3f4c15cad5d4591\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T05:44:03Z\\\",\\\"message\\\":\\\"s/externalversions/factory.go:140\\\\nI1206 05:44:03.304430 6054 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1206 05:44:03.304508 6054 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1206 05:44:03.304756 6054 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1206 05:44:03.304782 6054 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1206 05:44:03.304790 6054 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1206 05:44:03.304824 6054 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1206 05:44:03.304828 6054 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1206 05:44:03.304849 6054 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1206 05:44:03.304850 6054 factory.go:656] Stopping watch factory\\\\nI1206 05:44:03.304860 6054 handler.go:208] Removed *v1.Node event handler 7\\\\nI1206 05:44:03.304865 6054 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1206 05:44:03.304870 6054 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:44:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15c143f8a71272e7b8696f98c23e481c8827d6ece0e03e51d87aacd14c888dd3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T05:44:04Z\\\",\\\"message\\\":\\\"88018 6181 services_controller.go:453] Built service openshift-config-operator/metrics template LB for network=default: []services.LB{}\\\\nI1206 05:44:04.288204 6181 services_controller.go:443] Built service openshift-operator-lifecycle-manager/packageserver-service LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.153\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:5443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, 
nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1206 05:44:04.288216 6181 services_controller.go:454] Service openshift-config-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF1206 05:44:04.288255 6181 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network contro\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:44:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-net
d\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9980ec9b2b1a751a691d1f657a2176d49a7583906d741adbe3754ec4c73b152c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2gb79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:04Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.825412 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.825443 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.825453 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.825468 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.825480 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:04Z","lastTransitionTime":"2025-12-06T05:44:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.927536 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.927566 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.927575 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.927588 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:04 crc kubenswrapper[4733]: I1206 05:44:04.927597 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:04Z","lastTransitionTime":"2025-12-06T05:44:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:05 crc kubenswrapper[4733]: I1206 05:44:05.029461 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:05 crc kubenswrapper[4733]: I1206 05:44:05.029486 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:05 crc kubenswrapper[4733]: I1206 05:44:05.029495 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:05 crc kubenswrapper[4733]: I1206 05:44:05.029506 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:05 crc kubenswrapper[4733]: I1206 05:44:05.029523 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:05Z","lastTransitionTime":"2025-12-06T05:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:05 crc kubenswrapper[4733]: I1206 05:44:05.131648 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:05 crc kubenswrapper[4733]: I1206 05:44:05.131680 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:05 crc kubenswrapper[4733]: I1206 05:44:05.131689 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:05 crc kubenswrapper[4733]: I1206 05:44:05.131700 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:05 crc kubenswrapper[4733]: I1206 05:44:05.131710 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:05Z","lastTransitionTime":"2025-12-06T05:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:05 crc kubenswrapper[4733]: I1206 05:44:05.233556 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:05 crc kubenswrapper[4733]: I1206 05:44:05.233585 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:05 crc kubenswrapper[4733]: I1206 05:44:05.233594 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:05 crc kubenswrapper[4733]: I1206 05:44:05.233604 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:05 crc kubenswrapper[4733]: I1206 05:44:05.233611 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:05Z","lastTransitionTime":"2025-12-06T05:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:05 crc kubenswrapper[4733]: I1206 05:44:05.335746 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:05 crc kubenswrapper[4733]: I1206 05:44:05.335773 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:05 crc kubenswrapper[4733]: I1206 05:44:05.335782 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:05 crc kubenswrapper[4733]: I1206 05:44:05.335792 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:05 crc kubenswrapper[4733]: I1206 05:44:05.335801 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:05Z","lastTransitionTime":"2025-12-06T05:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:05 crc kubenswrapper[4733]: I1206 05:44:05.437240 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:05 crc kubenswrapper[4733]: I1206 05:44:05.437274 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:05 crc kubenswrapper[4733]: I1206 05:44:05.437285 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:05 crc kubenswrapper[4733]: I1206 05:44:05.437320 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:05 crc kubenswrapper[4733]: I1206 05:44:05.437331 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:05Z","lastTransitionTime":"2025-12-06T05:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:05 crc kubenswrapper[4733]: I1206 05:44:05.538574 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:05 crc kubenswrapper[4733]: I1206 05:44:05.538605 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:05 crc kubenswrapper[4733]: I1206 05:44:05.538622 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:05 crc kubenswrapper[4733]: I1206 05:44:05.538636 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:05 crc kubenswrapper[4733]: I1206 05:44:05.538646 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:05Z","lastTransitionTime":"2025-12-06T05:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:05 crc kubenswrapper[4733]: I1206 05:44:05.640236 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:05 crc kubenswrapper[4733]: I1206 05:44:05.640267 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:05 crc kubenswrapper[4733]: I1206 05:44:05.640276 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:05 crc kubenswrapper[4733]: I1206 05:44:05.640289 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:05 crc kubenswrapper[4733]: I1206 05:44:05.640297 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:05Z","lastTransitionTime":"2025-12-06T05:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:05 crc kubenswrapper[4733]: I1206 05:44:05.680031 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2gb79_171aa174-9338-4421-8393-9e23fbab7f1e/ovnkube-controller/1.log" Dec 06 05:44:05 crc kubenswrapper[4733]: I1206 05:44:05.682680 4733 scope.go:117] "RemoveContainer" containerID="15c143f8a71272e7b8696f98c23e481c8827d6ece0e03e51d87aacd14c888dd3" Dec 06 05:44:05 crc kubenswrapper[4733]: E1206 05:44:05.682821 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-2gb79_openshift-ovn-kubernetes(171aa174-9338-4421-8393-9e23fbab7f1e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" podUID="171aa174-9338-4421-8393-9e23fbab7f1e" Dec 06 05:44:05 crc kubenswrapper[4733]: I1206 05:44:05.692945 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:05Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:05 crc kubenswrapper[4733]: I1206 05:44:05.703041 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-684r5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc59542d-ee4a-414d-b096-86716cb56db5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7128ab1b2f48b8ce3ecf3a2154cb1b1dc93a58cdfed2c11e7724201a5675ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbfjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-684r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:05Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:05 crc kubenswrapper[4733]: I1206 05:44:05.710549 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77ebef5bd728c37a6b74ab523c480048959280fdfc9afd8c60b2aca9cd05336d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61a23652af66
be599ba9357cb31709e7b4a3f0e4767c758617e6cc5cd9b43941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g7qjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:05Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:05 crc kubenswrapper[4733]: I1206 05:44:05.723764 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"171aa174-9338-4421-8393-9e23fbab7f1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a697c5a28f2c415b6f133c1c3bdaff0915418e3fcf0c889af0a822e1bdcbcc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://532faf6ec4021a35746a236a1ded78eccc9d71728c149f73c4263068b6951490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://456b5bd863b30c044246c6c8fe15ee7344ad053861724b5c42b88479578b9adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77216800c2b9bc04724591a5d5c5d4c9ddb9a75fcbc198c60800199a92db6f45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d985f342be7dff38ee8a2264a8dae534857e6cb0e7d0cf79b137d2ed6289bf80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88a99335c4d7fca93428173f7e0e096e418e0599ab030dfda10d8da0a5dc17a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15c143f8a71272e7b8696f98c23e481c8827d6ece0e03e51d87aacd14c888dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15c143f8a71272e7b8696f98c23e481c8827d6ece0e03e51d87aacd14c888dd3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T05:44:04Z\\\",\\\"message\\\":\\\"88018 6181 services_controller.go:453] Built service openshift-config-operator/metrics template LB for network=default: []services.LB{}\\\\nI1206 05:44:04.288204 6181 services_controller.go:443] Built service openshift-operator-lifecycle-manager/packageserver-service LB cluster-wide configs for network=default: 
[]services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.153\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:5443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1206 05:44:04.288216 6181 services_controller.go:454] Service openshift-config-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF1206 05:44:04.288255 6181 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network contro\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:44:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2gb79_openshift-ovn-kubernetes(171aa174-9338-4421-8393-9e23fbab7f1e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9980ec9b2b1a751a691d1f657a2176d49a7583906d741adbe3754ec4c73b152c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://667a77abe2226e4438
47d05fc2475438095a30648777a424239d3f35c199b236\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2gb79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:05Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:05 crc kubenswrapper[4733]: I1206 05:44:05.732775 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c145932d-56db-49da-ab40-1f9faeaa004e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a89503b511d9f2da9fb5e41e1adb5f5c60e14909aebd4495baafc709177fa56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd2bcad3ce23a8998a578ecc373a4e8028eefab1e056cf1081eb2406ff9398f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382d71a067b68d67891c063f0a4c833b7433e15db0e05b36e46f24bbbb1626ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b838411bb65919138a421cd17775561b7764a006894daa8f2bed711287c1914\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:05Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:05 crc kubenswrapper[4733]: I1206 05:44:05.741928 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:05 crc kubenswrapper[4733]: I1206 05:44:05.741965 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:05 crc kubenswrapper[4733]: I1206 05:44:05.741977 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:05 crc kubenswrapper[4733]: I1206 05:44:05.741992 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:05 crc kubenswrapper[4733]: I1206 05:44:05.742004 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:05Z","lastTransitionTime":"2025-12-06T05:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:05 crc kubenswrapper[4733]: I1206 05:44:05.742462 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:05Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:05 crc kubenswrapper[4733]: I1206 05:44:05.752839 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5mf9m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d7ccbf-e88d-4045-8d89-633470de7aca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dffbae27a10ae2e00933637da0e30fc5b8574f2ee8edb5b4b09c37a2d05e980a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2609f7ad60b4f90d844d4f4d8573587826cbdf4c0b76f6b8a1b5cddec86ad7d0\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2609f7ad60b4f90d844d4f4d8573587826cbdf4c0b76f6b8a1b5cddec86ad7d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ef7c618da4d94a4956f082f96b9be994042458ff524e9e1172f526a4135e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93ef7c618da4d94a4956f082f96b9be994042458ff524e9e1172f526a4135e1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c91a8199b1f8ede480f2bd92335fe3c8dc0d0e11caa2cf3bd213c234d0779f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c91a8199b1f8ede480f2bd92335fe3c8dc0d0e11caa2cf3bd213c234d0779f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://047dc
4e7f8f30d1f9cf824ee4059c99c07cd9f29bd985e0e00ac22febb297f1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://047dc4e7f8f30d1f9cf824ee4059c99c07cd9f29bd985e0e00ac22febb297f1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cadff80e27f4e0103110c153c52936b931bfd70ca4363a3caa44ec4f746d01dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadff80e27f4e0103110c153c52936b931bfd70ca4363a3caa44ec4f746d01dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:59Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e4034c91b0b19898468eccdc22e059ad7e830ef9e4ff0bea88d447f6a09c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16e4034c91b0b19898468eccdc22e059ad7e830ef9e4ff0bea88d447f6a09c64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5mf9m\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:05Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:05 crc kubenswrapper[4733]: I1206 05:44:05.760185 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnxdh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d5c4ca7-33ee-4858-948f-631753eb056e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f4a50e7cb4197e088c193a3bedc8acb2720a885e588e56051fbfa1e102099e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrbr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnxdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:05Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:05 crc kubenswrapper[4733]: I1206 05:44:05.772202 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8e8d7d591deb47598776511be462724fabc5543e82b6a74edfc29fb01ccb977\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:05Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:05 crc kubenswrapper[4733]: I1206 05:44:05.780964 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77d63bf154094eece4d04d42186bc7f957f0b1ab0315c496bb8a785269184ed0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://cdccf2a58baf2a39276908ed60c86219657d8780a50630c20be6f8bc4c256fbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:05Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:05 crc kubenswrapper[4733]: I1206 05:44:05.788758 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faa17b3f3dd91488b73e0e7f3101c5e9932dd0c1573946bbd91819f1ec51202e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T05:44:05Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:05 crc kubenswrapper[4733]: I1206 05:44:05.795336 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqsfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25abcf60-fe34-446b-9df8-1ed8e5102975\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://163c90ba7e6470fb31049cd650d1384d35d87b94a9193184bfe3ea16feddf307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb5ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqsfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:05Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:05 crc kubenswrapper[4733]: I1206 05:44:05.805060 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0700e329-54b6-4cfe-b2de-5cee58cf1aa5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32c4d87738481c8df3d76e820a98f3dacfbc11edc26fab1dfe51b56d207168d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57cbb938bc4ae9b8a71a1e2369a50a243964fc8c683d2d1840f1f3e199f1b923\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7eeebbb46cf11d2306ad457106c3b2179039986bfdd412c4bb64791d86edb4e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://801ea1b9ed221d20f0d729436b8f5f1946df6e66f06aa86db5764f18da3f0b1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe65f4b55b8e8ed93d424276f1fc06f31770302538e5122a5b09da36734d86dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 05:43:48.722254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 05:43:48.730728 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3849141372/tls.crt::/tmp/serving-cert-3849141372/tls.key\\\\\\\"\\\\nI1206 05:43:54.083506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 05:43:54.085960 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 05:43:54.085979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 05:43:54.086001 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 05:43:54.086006 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 05:43:54.089093 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 05:43:54.089162 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089190 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089211 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 05:43:54.089229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 05:43:54.089245 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 05:43:54.089261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 05:43:54.089103 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 05:43:54.090706 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8edc1fd8220a58b6a3f6d08d6d003c6d350fa69588866d84de63f95ecd4367f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:05Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:05 crc kubenswrapper[4733]: I1206 05:44:05.813461 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:05Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:05 crc kubenswrapper[4733]: I1206 05:44:05.844049 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:05 crc kubenswrapper[4733]: I1206 05:44:05.844086 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:05 crc kubenswrapper[4733]: I1206 05:44:05.844095 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:05 crc 
kubenswrapper[4733]: I1206 05:44:05.844113 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:05 crc kubenswrapper[4733]: I1206 05:44:05.844125 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:05Z","lastTransitionTime":"2025-12-06T05:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:05 crc kubenswrapper[4733]: I1206 05:44:05.946413 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:05 crc kubenswrapper[4733]: I1206 05:44:05.946449 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:05 crc kubenswrapper[4733]: I1206 05:44:05.946465 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:05 crc kubenswrapper[4733]: I1206 05:44:05.946482 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:05 crc kubenswrapper[4733]: I1206 05:44:05.946496 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:05Z","lastTransitionTime":"2025-12-06T05:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:06 crc kubenswrapper[4733]: I1206 05:44:06.048651 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:06 crc kubenswrapper[4733]: I1206 05:44:06.048682 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:06 crc kubenswrapper[4733]: I1206 05:44:06.048696 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:06 crc kubenswrapper[4733]: I1206 05:44:06.048714 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:06 crc kubenswrapper[4733]: I1206 05:44:06.048729 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:06Z","lastTransitionTime":"2025-12-06T05:44:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:06 crc kubenswrapper[4733]: I1206 05:44:06.151020 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:06 crc kubenswrapper[4733]: I1206 05:44:06.151057 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:06 crc kubenswrapper[4733]: I1206 05:44:06.151068 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:06 crc kubenswrapper[4733]: I1206 05:44:06.151084 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:06 crc kubenswrapper[4733]: I1206 05:44:06.151095 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:06Z","lastTransitionTime":"2025-12-06T05:44:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:06 crc kubenswrapper[4733]: I1206 05:44:06.252614 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:06 crc kubenswrapper[4733]: I1206 05:44:06.252648 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:06 crc kubenswrapper[4733]: I1206 05:44:06.252658 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:06 crc kubenswrapper[4733]: I1206 05:44:06.252681 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:06 crc kubenswrapper[4733]: I1206 05:44:06.252692 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:06Z","lastTransitionTime":"2025-12-06T05:44:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:06 crc kubenswrapper[4733]: I1206 05:44:06.354912 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:06 crc kubenswrapper[4733]: I1206 05:44:06.354947 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:06 crc kubenswrapper[4733]: I1206 05:44:06.354957 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:06 crc kubenswrapper[4733]: I1206 05:44:06.354970 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:06 crc kubenswrapper[4733]: I1206 05:44:06.354981 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:06Z","lastTransitionTime":"2025-12-06T05:44:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:06 crc kubenswrapper[4733]: I1206 05:44:06.456913 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:06 crc kubenswrapper[4733]: I1206 05:44:06.456946 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:06 crc kubenswrapper[4733]: I1206 05:44:06.456957 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:06 crc kubenswrapper[4733]: I1206 05:44:06.456969 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:06 crc kubenswrapper[4733]: I1206 05:44:06.456979 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:06Z","lastTransitionTime":"2025-12-06T05:44:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:06 crc kubenswrapper[4733]: I1206 05:44:06.484555 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 05:44:06 crc kubenswrapper[4733]: I1206 05:44:06.484580 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:44:06 crc kubenswrapper[4733]: E1206 05:44:06.484647 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 05:44:06 crc kubenswrapper[4733]: E1206 05:44:06.484780 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 05:44:06 crc kubenswrapper[4733]: I1206 05:44:06.484816 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:44:06 crc kubenswrapper[4733]: E1206 05:44:06.484883 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 05:44:06 crc kubenswrapper[4733]: I1206 05:44:06.495234 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0700e329-54b6-4cfe-b2de-5cee58cf1aa5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32c4d87738481c8df3d76e820a98f3dacfbc11edc26fab1dfe51b56d207168d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57cbb938bc4ae9b8a71a1e2369a50a243964fc8c683d2d1840f1f3e199f1b923\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7eeebbb46cf11d2306ad457106c3b2179039986bfdd412c4bb64791d86edb4e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://801ea1b9ed221d20f0d729436b8f5f1946df6e66f06aa86db5764f18da3f0b1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe65f4b55b8e8ed93d424276f1fc06f31770302538e5122a5b09da36734d86dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 05:43:48.722254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 05:43:48.730728 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3849141372/tls.crt::/tmp/serving-cert-3849141372/tls.key\\\\\\\"\\\\nI1206 05:43:54.083506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 05:43:54.085960 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 05:43:54.085979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 05:43:54.086001 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 05:43:54.086006 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 05:43:54.089093 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 05:43:54.089162 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089190 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089211 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 05:43:54.089229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 05:43:54.089245 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 05:43:54.089261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 05:43:54.089103 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 05:43:54.090706 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8edc1fd8220a58b6a3f6d08d6d003c6d350fa69588866d84de63f95ecd4367f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:06Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:06 crc kubenswrapper[4733]: I1206 05:44:06.503540 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:06Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:06 crc kubenswrapper[4733]: I1206 05:44:06.512056 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:06Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:06 crc kubenswrapper[4733]: I1206 05:44:06.520282 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-684r5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc59542d-ee4a-414d-b096-86716cb56db5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7128ab1b2f48b8ce3ecf3a2154cb1b1dc93a58cdfed2c11e7724201a5675ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbfjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-684r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:06Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:06 crc kubenswrapper[4733]: I1206 05:44:06.527569 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77ebef5bd728c37a6b74ab523c480048959280fdfc9afd8c60b2aca9cd05336d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61a23652af66
be599ba9357cb31709e7b4a3f0e4767c758617e6cc5cd9b43941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g7qjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:06Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:06 crc kubenswrapper[4733]: I1206 05:44:06.539652 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"171aa174-9338-4421-8393-9e23fbab7f1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a697c5a28f2c415b6f133c1c3bdaff0915418e3fcf0c889af0a822e1bdcbcc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://532faf6ec4021a35746a236a1ded78eccc9d71728c149f73c4263068b6951490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://456b5bd863b30c044246c6c8fe15ee7344ad053861724b5c42b88479578b9adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77216800c2b9bc04724591a5d5c5d4c9ddb9a75fcbc198c60800199a92db6f45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d985f342be7dff38ee8a2264a8dae534857e6cb0e7d0cf79b137d2ed6289bf80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88a99335c4d7fca93428173f7e0e096e418e0599ab030dfda10d8da0a5dc17a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15c143f8a71272e7b8696f98c23e481c8827d6ece0e03e51d87aacd14c888dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15c143f8a71272e7b8696f98c23e481c8827d6ece0e03e51d87aacd14c888dd3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T05:44:04Z\\\",\\\"message\\\":\\\"88018 6181 services_controller.go:453] Built service openshift-config-operator/metrics template LB for network=default: []services.LB{}\\\\nI1206 05:44:04.288204 6181 services_controller.go:443] Built service openshift-operator-lifecycle-manager/packageserver-service LB cluster-wide configs for network=default: 
[]services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.153\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:5443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1206 05:44:04.288216 6181 services_controller.go:454] Service openshift-config-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF1206 05:44:04.288255 6181 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network contro\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:44:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2gb79_openshift-ovn-kubernetes(171aa174-9338-4421-8393-9e23fbab7f1e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9980ec9b2b1a751a691d1f657a2176d49a7583906d741adbe3754ec4c73b152c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://667a77abe2226e4438
47d05fc2475438095a30648777a424239d3f35c199b236\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2gb79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:06Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:06 crc kubenswrapper[4733]: I1206 05:44:06.551834 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c145932d-56db-49da-ab40-1f9faeaa004e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a89503b511d9f2da9fb5e41e1adb5f5c60e14909aebd4495baafc709177fa56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd2bcad3ce23a8998a578ecc373a4e8028eefab1e056cf1081eb2406ff9398f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382d71a067b68d67891c063f0a4c833b7433e15db0e05b36e46f24bbbb1626ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b838411bb65919138a421cd17775561b7764a006894daa8f2bed711287c1914\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:06Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:06 crc kubenswrapper[4733]: I1206 05:44:06.558709 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:06 crc kubenswrapper[4733]: I1206 05:44:06.558734 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:06 crc kubenswrapper[4733]: I1206 05:44:06.558746 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:06 crc kubenswrapper[4733]: I1206 05:44:06.558761 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:06 crc kubenswrapper[4733]: I1206 05:44:06.558776 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:06Z","lastTransitionTime":"2025-12-06T05:44:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:06 crc kubenswrapper[4733]: I1206 05:44:06.560238 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:06Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:06 crc kubenswrapper[4733]: I1206 05:44:06.569364 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5mf9m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d7ccbf-e88d-4045-8d89-633470de7aca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dffbae27a10ae2e00933637da0e30fc5b8574f2ee8edb5b4b09c37a2d05e980a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2609f7ad60b4f90d844d4f4d8573587826cbdf4c0b76f6b8a1b5cddec86ad7d0\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2609f7ad60b4f90d844d4f4d8573587826cbdf4c0b76f6b8a1b5cddec86ad7d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ef7c618da4d94a4956f082f96b9be994042458ff524e9e1172f526a4135e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93ef7c618da4d94a4956f082f96b9be994042458ff524e9e1172f526a4135e1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c91a8199b1f8ede480f2bd92335fe3c8dc0d0e11caa2cf3bd213c234d0779f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c91a8199b1f8ede480f2bd92335fe3c8dc0d0e11caa2cf3bd213c234d0779f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://047dc
4e7f8f30d1f9cf824ee4059c99c07cd9f29bd985e0e00ac22febb297f1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://047dc4e7f8f30d1f9cf824ee4059c99c07cd9f29bd985e0e00ac22febb297f1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cadff80e27f4e0103110c153c52936b931bfd70ca4363a3caa44ec4f746d01dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadff80e27f4e0103110c153c52936b931bfd70ca4363a3caa44ec4f746d01dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:59Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e4034c91b0b19898468eccdc22e059ad7e830ef9e4ff0bea88d447f6a09c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16e4034c91b0b19898468eccdc22e059ad7e830ef9e4ff0bea88d447f6a09c64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5mf9m\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:06Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:06 crc kubenswrapper[4733]: I1206 05:44:06.575886 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnxdh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d5c4ca7-33ee-4858-948f-631753eb056e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f4a50e7cb4197e088c193a3bedc8acb2720a885e588e56051fbfa1e102099e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrbr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnxdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:06Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:06 crc kubenswrapper[4733]: I1206 05:44:06.584076 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8e8d7d591deb47598776511be462724fabc5543e82b6a74edfc29fb01ccb977\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:06Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:06 crc kubenswrapper[4733]: I1206 05:44:06.593928 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77d63bf154094eece4d04d42186bc7f957f0b1ab0315c496bb8a785269184ed0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://cdccf2a58baf2a39276908ed60c86219657d8780a50630c20be6f8bc4c256fbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:06Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:06 crc kubenswrapper[4733]: I1206 05:44:06.602369 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faa17b3f3dd91488b73e0e7f3101c5e9932dd0c1573946bbd91819f1ec51202e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T05:44:06Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:06 crc kubenswrapper[4733]: I1206 05:44:06.609266 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqsfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25abcf60-fe34-446b-9df8-1ed8e5102975\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://163c90ba7e6470fb31049cd650d1384d35d87b94a9193184bfe3ea16feddf307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb5ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqsfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:06Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:06 crc kubenswrapper[4733]: I1206 05:44:06.661482 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:06 crc kubenswrapper[4733]: I1206 05:44:06.661526 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:06 crc kubenswrapper[4733]: I1206 05:44:06.661535 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:06 crc kubenswrapper[4733]: I1206 05:44:06.661546 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:06 crc kubenswrapper[4733]: I1206 05:44:06.661556 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:06Z","lastTransitionTime":"2025-12-06T05:44:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:06 crc kubenswrapper[4733]: I1206 05:44:06.763669 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:06 crc kubenswrapper[4733]: I1206 05:44:06.763705 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:06 crc kubenswrapper[4733]: I1206 05:44:06.763715 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:06 crc kubenswrapper[4733]: I1206 05:44:06.763731 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:06 crc kubenswrapper[4733]: I1206 05:44:06.763740 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:06Z","lastTransitionTime":"2025-12-06T05:44:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:06 crc kubenswrapper[4733]: I1206 05:44:06.865385 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:06 crc kubenswrapper[4733]: I1206 05:44:06.865413 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:06 crc kubenswrapper[4733]: I1206 05:44:06.865421 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:06 crc kubenswrapper[4733]: I1206 05:44:06.865431 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:06 crc kubenswrapper[4733]: I1206 05:44:06.865440 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:06Z","lastTransitionTime":"2025-12-06T05:44:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:06 crc kubenswrapper[4733]: I1206 05:44:06.967566 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:06 crc kubenswrapper[4733]: I1206 05:44:06.967607 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:06 crc kubenswrapper[4733]: I1206 05:44:06.967616 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:06 crc kubenswrapper[4733]: I1206 05:44:06.967630 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:06 crc kubenswrapper[4733]: I1206 05:44:06.967640 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:06Z","lastTransitionTime":"2025-12-06T05:44:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.069562 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.069681 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.069694 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.069705 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.069714 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:07Z","lastTransitionTime":"2025-12-06T05:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.171398 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.171426 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.171435 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.171447 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.171455 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:07Z","lastTransitionTime":"2025-12-06T05:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.273836 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.273869 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.273878 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.273910 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.273922 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:07Z","lastTransitionTime":"2025-12-06T05:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.376237 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.376273 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.376282 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.376297 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.376325 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:07Z","lastTransitionTime":"2025-12-06T05:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.458075 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.458105 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.458114 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.458128 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.458138 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:07Z","lastTransitionTime":"2025-12-06T05:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:07 crc kubenswrapper[4733]: E1206 05:44:07.467592 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6951a1f4-5aff-463d-98ee-6da28494341b\\\",\\\"systemUUID\\\":\\\"4b0d62b0-e895-479e-b261-2bd12b349187\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:07Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.470378 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.470421 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.470432 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.470450 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.470459 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:07Z","lastTransitionTime":"2025-12-06T05:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:07 crc kubenswrapper[4733]: E1206 05:44:07.479161 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6951a1f4-5aff-463d-98ee-6da28494341b\\\",\\\"systemUUID\\\":\\\"4b0d62b0-e895-479e-b261-2bd12b349187\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:07Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.481164 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.481190 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.481199 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.481210 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.481218 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:07Z","lastTransitionTime":"2025-12-06T05:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:07 crc kubenswrapper[4733]: E1206 05:44:07.490632 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6951a1f4-5aff-463d-98ee-6da28494341b\\\",\\\"systemUUID\\\":\\\"4b0d62b0-e895-479e-b261-2bd12b349187\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:07Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.493489 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.493577 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.493603 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.493614 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.493620 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:07Z","lastTransitionTime":"2025-12-06T05:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.504733 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.504764 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.504774 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.504788 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.504797 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:07Z","lastTransitionTime":"2025-12-06T05:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:07 crc kubenswrapper[4733]: E1206 05:44:07.513622 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6951a1f4-5aff-463d-98ee-6da28494341b\\\",\\\"systemUUID\\\":\\\"4b0d62b0-e895-479e-b261-2bd12b349187\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:07Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:07 crc kubenswrapper[4733]: E1206 05:44:07.513723 4733 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.514792 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.514813 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.514822 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.514833 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.514845 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:07Z","lastTransitionTime":"2025-12-06T05:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.531381 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q2ktk"] Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.531777 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q2ktk" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.532995 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.533113 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.541082 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:07Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.549995 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-684r5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc59542d-ee4a-414d-b096-86716cb56db5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7128ab1b2f48b8ce3ecf3a2154cb1b1dc93a58cdfed2c11e7724201a5675ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbfjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-684r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:07Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.557969 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77ebef5bd728c37a6b74ab523c480048959280fdfc9afd8c60b2aca9cd05336d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61a23652af66
be599ba9357cb31709e7b4a3f0e4767c758617e6cc5cd9b43941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g7qjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:07Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.571578 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"171aa174-9338-4421-8393-9e23fbab7f1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a697c5a28f2c415b6f133c1c3bdaff0915418e3fcf0c889af0a822e1bdcbcc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://532faf6ec4021a35746a236a1ded78eccc9d71728c149f73c4263068b6951490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://456b5bd863b30c044246c6c8fe15ee7344ad053861724b5c42b88479578b9adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77216800c2b9bc04724591a5d5c5d4c9ddb9a75fcbc198c60800199a92db6f45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d985f342be7dff38ee8a2264a8dae534857e6cb0e7d0cf79b137d2ed6289bf80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88a99335c4d7fca93428173f7e0e096e418e0599ab030dfda10d8da0a5dc17a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15c143f8a71272e7b8696f98c23e481c8827d6ece0e03e51d87aacd14c888dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15c143f8a71272e7b8696f98c23e481c8827d6ece0e03e51d87aacd14c888dd3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T05:44:04Z\\\",\\\"message\\\":\\\"88018 6181 services_controller.go:453] Built service openshift-config-operator/metrics template LB for network=default: []services.LB{}\\\\nI1206 05:44:04.288204 6181 services_controller.go:443] Built service openshift-operator-lifecycle-manager/packageserver-service LB cluster-wide configs for network=default: 
[]services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.153\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:5443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1206 05:44:04.288216 6181 services_controller.go:454] Service openshift-config-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF1206 05:44:04.288255 6181 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network contro\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:44:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2gb79_openshift-ovn-kubernetes(171aa174-9338-4421-8393-9e23fbab7f1e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9980ec9b2b1a751a691d1f657a2176d49a7583906d741adbe3754ec4c73b152c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://667a77abe2226e4438
47d05fc2475438095a30648777a424239d3f35c199b236\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2gb79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:07Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.579978 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c145932d-56db-49da-ab40-1f9faeaa004e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a89503b511d9f2da9fb5e41e1adb5f5c60e14909aebd4495baafc709177fa56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd2bcad3ce23a8998a578ecc373a4e8028eefab1e056cf1081eb2406ff9398f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382d71a067b68d67891c063f0a4c833b7433e15db0e05b36e46f24bbbb1626ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b838411bb65919138a421cd17775561b7764a006894daa8f2bed711287c1914\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:07Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.588224 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:07Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.598147 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5mf9m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d7ccbf-e88d-4045-8d89-633470de7aca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dffbae27a10ae2e00933637da0e30fc5b8574f2ee8edb5b4b09c37a2d05e980a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2609f7ad60b4f90d844d4f4d8573587826cbdf4c0b76f6b8a1b5cddec86ad7d0\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2609f7ad60b4f90d844d4f4d8573587826cbdf4c0b76f6b8a1b5cddec86ad7d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ef7c618da4d94a4956f082f96b9be994042458ff524e9e1172f526a4135e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93ef7c618da4d94a4956f082f96b9be994042458ff524e9e1172f526a4135e1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c91a8199b1f8ede480f2bd92335fe3c8dc0d0e11caa2cf3bd213c234d0779f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c91a8199b1f8ede480f2bd92335fe3c8dc0d0e11caa2cf3bd213c234d0779f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://047dc
4e7f8f30d1f9cf824ee4059c99c07cd9f29bd985e0e00ac22febb297f1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://047dc4e7f8f30d1f9cf824ee4059c99c07cd9f29bd985e0e00ac22febb297f1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cadff80e27f4e0103110c153c52936b931bfd70ca4363a3caa44ec4f746d01dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadff80e27f4e0103110c153c52936b931bfd70ca4363a3caa44ec4f746d01dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:59Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e4034c91b0b19898468eccdc22e059ad7e830ef9e4ff0bea88d447f6a09c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16e4034c91b0b19898468eccdc22e059ad7e830ef9e4ff0bea88d447f6a09c64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5mf9m\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:07Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.609706 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q2ktk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e24a9e84-0151-4204-9391-510da9049b58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh6vd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh6vd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:44:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q2ktk\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:07Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.613520 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e24a9e84-0151-4204-9391-510da9049b58-env-overrides\") pod \"ovnkube-control-plane-749d76644c-q2ktk\" (UID: \"e24a9e84-0151-4204-9391-510da9049b58\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q2ktk" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.613563 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sh6vd\" (UniqueName: \"kubernetes.io/projected/e24a9e84-0151-4204-9391-510da9049b58-kube-api-access-sh6vd\") pod \"ovnkube-control-plane-749d76644c-q2ktk\" (UID: \"e24a9e84-0151-4204-9391-510da9049b58\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q2ktk" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.613595 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e24a9e84-0151-4204-9391-510da9049b58-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-q2ktk\" (UID: \"e24a9e84-0151-4204-9391-510da9049b58\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q2ktk" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.613773 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e24a9e84-0151-4204-9391-510da9049b58-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-q2ktk\" (UID: 
\"e24a9e84-0151-4204-9391-510da9049b58\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q2ktk" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.617011 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.617051 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.617061 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.617078 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.617088 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:07Z","lastTransitionTime":"2025-12-06T05:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.619074 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77d63bf154094eece4d04d42186bc7f957f0b1ab0315c496bb8a785269184ed0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdccf2a58baf2a39276908ed60c86219657d8780a50630c20be6f8bc4c256fbb\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:07Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.628649 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faa17b3f3dd91488b73e0e7f3101c5e9932dd0c1573946bbd91819f1ec51202e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T05:44:07Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.635906 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqsfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25abcf60-fe34-446b-9df8-1ed8e5102975\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://163c90ba7e6470fb31049cd650d1384d35d87b94a9193184bfe3ea16feddf307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb5ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqsfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:07Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.643231 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnxdh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d5c4ca7-33ee-4858-948f-631753eb056e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f4a50e7cb4197e088c193a3bedc8acb2720a885e588e56051fbfa1e102099e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrbr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnxdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:07Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.652935 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8e8d7d591deb47598776511be462724fabc5543e82b6a74edfc29fb01ccb977\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:07Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.662151 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0700e329-54b6-4cfe-b2de-5cee58cf1aa5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32c4d87738481c8df3d76e820a98f3dacfbc11edc26fab1dfe51b56d207168d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57cbb938bc4ae9b8a71a1e2369a50a243964fc8c683d2d1840f1f3e199f1b923\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7eeebbb46cf11d2306ad457106c3b2179039986bfdd412c4bb64791d86edb4e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://801ea1b9ed221d20f0d729436b8f5f1946df6e66f06aa86db5764f18da3f0b1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe65f4b55b8e8ed93d424276f1fc06f31770302538e5122a5b09da36734d86dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 05:43:48.722254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 05:43:48.730728 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3849141372/tls.crt::/tmp/serving-cert-3849141372/tls.key\\\\\\\"\\\\nI1206 05:43:54.083506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 05:43:54.085960 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 05:43:54.085979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 05:43:54.086001 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 05:43:54.086006 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 05:43:54.089093 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 05:43:54.089162 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089190 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089211 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 05:43:54.089229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 05:43:54.089245 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 05:43:54.089261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 05:43:54.089103 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 05:43:54.090706 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8edc1fd8220a58b6a3f6d08d6d003c6d350fa69588866d84de63f95ecd4367f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:07Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.671026 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:07Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.714758 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e24a9e84-0151-4204-9391-510da9049b58-env-overrides\") pod \"ovnkube-control-plane-749d76644c-q2ktk\" (UID: \"e24a9e84-0151-4204-9391-510da9049b58\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q2ktk" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.714814 4733 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-sh6vd\" (UniqueName: \"kubernetes.io/projected/e24a9e84-0151-4204-9391-510da9049b58-kube-api-access-sh6vd\") pod \"ovnkube-control-plane-749d76644c-q2ktk\" (UID: \"e24a9e84-0151-4204-9391-510da9049b58\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q2ktk" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.714843 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e24a9e84-0151-4204-9391-510da9049b58-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-q2ktk\" (UID: \"e24a9e84-0151-4204-9391-510da9049b58\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q2ktk" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.714884 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e24a9e84-0151-4204-9391-510da9049b58-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-q2ktk\" (UID: \"e24a9e84-0151-4204-9391-510da9049b58\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q2ktk" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.715641 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e24a9e84-0151-4204-9391-510da9049b58-env-overrides\") pod \"ovnkube-control-plane-749d76644c-q2ktk\" (UID: \"e24a9e84-0151-4204-9391-510da9049b58\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q2ktk" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.715739 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e24a9e84-0151-4204-9391-510da9049b58-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-q2ktk\" (UID: \"e24a9e84-0151-4204-9391-510da9049b58\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q2ktk" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.718993 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.719037 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.719048 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.719064 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.719074 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:07Z","lastTransitionTime":"2025-12-06T05:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.719337 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e24a9e84-0151-4204-9391-510da9049b58-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-q2ktk\" (UID: \"e24a9e84-0151-4204-9391-510da9049b58\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q2ktk" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.727518 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sh6vd\" (UniqueName: \"kubernetes.io/projected/e24a9e84-0151-4204-9391-510da9049b58-kube-api-access-sh6vd\") pod \"ovnkube-control-plane-749d76644c-q2ktk\" (UID: \"e24a9e84-0151-4204-9391-510da9049b58\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q2ktk" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.821086 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.821117 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.821128 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.821142 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.821152 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:07Z","lastTransitionTime":"2025-12-06T05:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.842409 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q2ktk" Dec 06 05:44:07 crc kubenswrapper[4733]: W1206 05:44:07.852553 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode24a9e84_0151_4204_9391_510da9049b58.slice/crio-579e5cfaaf23aebfd8438b72b056f7c60b427ac2ecee1a5dbc52467500e15340 WatchSource:0}: Error finding container 579e5cfaaf23aebfd8438b72b056f7c60b427ac2ecee1a5dbc52467500e15340: Status 404 returned error can't find the container with id 579e5cfaaf23aebfd8438b72b056f7c60b427ac2ecee1a5dbc52467500e15340 Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.922695 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.922821 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.922886 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.922968 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:07 crc kubenswrapper[4733]: I1206 05:44:07.923037 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:07Z","lastTransitionTime":"2025-12-06T05:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.025252 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.025282 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.025291 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.025320 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.025339 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:08Z","lastTransitionTime":"2025-12-06T05:44:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.128013 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.128048 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.128057 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.128069 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.128081 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:08Z","lastTransitionTime":"2025-12-06T05:44:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.230665 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.230698 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.230708 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.230723 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.230733 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:08Z","lastTransitionTime":"2025-12-06T05:44:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.332638 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.332677 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.332687 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.332705 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.332717 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:08Z","lastTransitionTime":"2025-12-06T05:44:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.434438 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.434479 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.434496 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.434511 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.434521 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:08Z","lastTransitionTime":"2025-12-06T05:44:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.461167 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.471405 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:08Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.480672 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-684r5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc59542d-ee4a-414d-b096-86716cb56db5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7128ab1b2f48b8ce3ecf3a2154cb1b1dc93a58cdfed2c11e7724201a5675ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbfjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-684r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:08Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.484348 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.484404 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 05:44:08 crc kubenswrapper[4733]: E1206 05:44:08.484432 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.484401 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:44:08 crc kubenswrapper[4733]: E1206 05:44:08.484527 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 05:44:08 crc kubenswrapper[4733]: E1206 05:44:08.484576 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.489909 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77ebef5bd728c37a6b74ab523c480048959280fdfc9afd8c60b2aca9cd05336d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mo
untPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61a23652af66be599ba9357cb31709e7b4a3f0e4767c758617e6cc5cd9b43941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g7qjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:08Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.502468 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"171aa174-9338-4421-8393-9e23fbab7f1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a697c5a28f2c415b6f133c1c3bdaff0915418e3fcf0c889af0a822e1bdcbcc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://532faf6ec4021a35746a236a1ded78eccc9d71728c149f73c4263068b6951490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://456b5bd863b30c044246c6c8fe15ee7344ad053861724b5c42b88479578b9adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77216800c2b9bc04724591a5d5c5d4c9ddb9a75fcbc198c60800199a92db6f45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d985f342be7dff38ee8a2264a8dae534857e6cb0e7d0cf79b137d2ed6289bf80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88a99335c4d7fca93428173f7e0e096e418e0599ab030dfda10d8da0a5dc17a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15c143f8a71272e7b8696f98c23e481c8827d6ece0e03e51d87aacd14c888dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15c143f8a71272e7b8696f98c23e481c8827d6ece0e03e51d87aacd14c888dd3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T05:44:04Z\\\",\\\"message\\\":\\\"88018 6181 services_controller.go:453] Built service openshift-config-operator/metrics template LB for network=default: []services.LB{}\\\\nI1206 05:44:04.288204 6181 services_controller.go:443] Built service openshift-operator-lifecycle-manager/packageserver-service LB cluster-wide configs for network=default: 
[]services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.153\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:5443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1206 05:44:04.288216 6181 services_controller.go:454] Service openshift-config-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF1206 05:44:04.288255 6181 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network contro\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:44:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2gb79_openshift-ovn-kubernetes(171aa174-9338-4421-8393-9e23fbab7f1e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9980ec9b2b1a751a691d1f657a2176d49a7583906d741adbe3754ec4c73b152c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://667a77abe2226e4438
47d05fc2475438095a30648777a424239d3f35c199b236\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2gb79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:08Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.511120 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c145932d-56db-49da-ab40-1f9faeaa004e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a89503b511d9f2da9fb5e41e1adb5f5c60e14909aebd4495baafc709177fa56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd2bcad3ce23a8998a578ecc373a4e8028eefab1e056cf1081eb2406ff9398f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382d71a067b68d67891c063f0a4c833b7433e15db0e05b36e46f24bbbb1626ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b838411bb65919138a421cd17775561b7764a006894daa8f2bed711287c1914\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:08Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.519098 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:08Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.528768 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5mf9m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d7ccbf-e88d-4045-8d89-633470de7aca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dffbae27a10ae2e00933637da0e30fc5b8574f2ee8edb5b4b09c37a2d05e980a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2609f7ad60b4f90d844d4f4d8573587826cbdf4c0b76f6b8a1b5cddec86ad7d0\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2609f7ad60b4f90d844d4f4d8573587826cbdf4c0b76f6b8a1b5cddec86ad7d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ef7c618da4d94a4956f082f96b9be994042458ff524e9e1172f526a4135e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93ef7c618da4d94a4956f082f96b9be994042458ff524e9e1172f526a4135e1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c91a8199b1f8ede480f2bd92335fe3c8dc0d0e11caa2cf3bd213c234d0779f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c91a8199b1f8ede480f2bd92335fe3c8dc0d0e11caa2cf3bd213c234d0779f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://047dc
4e7f8f30d1f9cf824ee4059c99c07cd9f29bd985e0e00ac22febb297f1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://047dc4e7f8f30d1f9cf824ee4059c99c07cd9f29bd985e0e00ac22febb297f1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cadff80e27f4e0103110c153c52936b931bfd70ca4363a3caa44ec4f746d01dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadff80e27f4e0103110c153c52936b931bfd70ca4363a3caa44ec4f746d01dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:59Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e4034c91b0b19898468eccdc22e059ad7e830ef9e4ff0bea88d447f6a09c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16e4034c91b0b19898468eccdc22e059ad7e830ef9e4ff0bea88d447f6a09c64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5mf9m\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:08Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.536257 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q2ktk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e24a9e84-0151-4204-9391-510da9049b58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh6vd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh6vd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:44:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q2ktk\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:08Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.536364 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.536389 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.536398 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.536413 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.536422 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:08Z","lastTransitionTime":"2025-12-06T05:44:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.545384 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8e8d7d591deb47598776511be462724fabc5543e82b6a74edfc29fb01ccb977\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:08Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.554175 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77d63bf154094eece4d04d42186bc7f957f0b1ab0315c496bb8a785269184ed0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdccf2a58baf2a39276908ed60c86219657d8780a50630c20be6f8bc4c256fbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:08Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.562531 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faa17b3f3dd91488b73e0e7f3101c5e9932dd0c1573946bbd91819f1ec51202e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T05:44:08Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.572773 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqsfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25abcf60-fe34-446b-9df8-1ed8e5102975\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://163c90ba7e6470fb31049cd650d1384d35d87b94a9193184bfe3ea16feddf307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb5ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqsfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:08Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.579458 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnxdh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d5c4ca7-33ee-4858-948f-631753eb056e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f4a50e7cb4197e088c193a3bedc8acb2720a885e588e56051fbfa1e102099e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrbr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnxdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:08Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.583867 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-8fw28"] Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.584261 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8fw28" Dec 06 05:44:08 crc kubenswrapper[4733]: E1206 05:44:08.584338 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8fw28" podUID="7e8909c1-5ab7-4c3f-aba1-436c64849e8a" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.588284 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0700e329-54b6-4cfe-b2de-5cee58cf1aa5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32c4d87738481c8df3d76e820a98f3dacfbc11edc26fab1dfe51b56d207168d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath
\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57cbb938bc4ae9b8a71a1e2369a50a243964fc8c683d2d1840f1f3e199f1b923\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eeebbb46cf11d2306ad457106c3b2179039986bfdd412c4bb64791d86edb4e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://801ea1b9ed221d20f0d729436b8f5f1946df6e66f06aa86db5764f18da3f0b1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-ap
iserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe65f4b55b8e8ed93d424276f1fc06f31770302538e5122a5b09da36734d86dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 05:43:48.722254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 05:43:48.730728 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3849141372/tls.crt::/tmp/serving-cert-3849141372/tls.key\\\\\\\"\\\\nI1206 05:43:54.083506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 05:43:54.085960 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 05:43:54.085979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 05:43:54.086001 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 05:43:54.086006 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 05:43:54.089093 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 05:43:54.089162 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089190 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089211 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 05:43:54.089229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 
05:43:54.089245 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 05:43:54.089261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 05:43:54.089103 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 05:43:54.090706 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8edc1fd8220a58b6a3f6d08d6d003c6d350fa69588866d84de63f95ecd4367f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:08Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.596412 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:08Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.604722 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77d63bf154094eece4d04d42186bc7f957f0b1ab0315c496bb8a785269184ed0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdccf2a58baf2a39276908ed60c86219657d8780a50630c20be6f8bc4c256fbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:08Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.613382 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faa17b3f3dd91488b73e0e7f3101c5e9932dd0c1573946bbd91819f1ec51202e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T05:44:08Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.620629 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqsfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25abcf60-fe34-446b-9df8-1ed8e5102975\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://163c90ba7e6470fb31049cd650d1384d35d87b94a9193184bfe3ea16feddf307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb5ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqsfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:08Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.627632 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnxdh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d5c4ca7-33ee-4858-948f-631753eb056e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f4a50e7cb4197e088c193a3bedc8acb2720a885e588e56051fbfa1e102099e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrbr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnxdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:08Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.636190 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8e8d7d591deb47598776511be462724fabc5543e82b6a74edfc29fb01ccb977\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:08Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.638495 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.638524 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.638534 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.638547 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.638558 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:08Z","lastTransitionTime":"2025-12-06T05:44:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.643188 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8fw28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e8909c1-5ab7-4c3f-aba1-436c64849e8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47p7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47p7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:44:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8fw28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:08Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:08 crc 
kubenswrapper[4733]: I1206 05:44:08.651952 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0700e329-54b6-4cfe-b2de-5cee58cf1aa5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32c4d87738481c8df3d76e820a98f3dacfbc11edc26fab1dfe51b56d207168d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57cbb938bc4ae9
b8a71a1e2369a50a243964fc8c683d2d1840f1f3e199f1b923\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eeebbb46cf11d2306ad457106c3b2179039986bfdd412c4bb64791d86edb4e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://801ea1b9ed221d20f0d729436b8f5f1946df6e66f06aa86db5764f18da3f0b1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://fe65f4b55b8e8ed93d424276f1fc06f31770302538e5122a5b09da36734d86dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 05:43:48.722254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 05:43:48.730728 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3849141372/tls.crt::/tmp/serving-cert-3849141372/tls.key\\\\\\\"\\\\nI1206 05:43:54.083506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 05:43:54.085960 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 05:43:54.085979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 05:43:54.086001 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 05:43:54.086006 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 05:43:54.089093 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 05:43:54.089162 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089190 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089211 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 05:43:54.089229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 05:43:54.089245 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 
05:43:54.089261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 05:43:54.089103 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 05:43:54.090706 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8edc1fd8220a58b6a3f6d08d6d003c6d350fa69588866d84de63f95ecd4367f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:08Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.659582 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:08Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.667883 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:08Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.675953 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-684r5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc59542d-ee4a-414d-b096-86716cb56db5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7128ab1b2f48b8ce3ecf3a2154cb1b1dc93a58cdfed2c11e7724201a5675ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbfjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-684r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:08Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.683701 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77ebef5bd728c37a6b74ab523c480048959280fdfc9afd8c60b2aca9cd05336d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61a23652af66
be599ba9357cb31709e7b4a3f0e4767c758617e6cc5cd9b43941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g7qjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:08Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.691949 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q2ktk" event={"ID":"e24a9e84-0151-4204-9391-510da9049b58","Type":"ContainerStarted","Data":"1a1c3268a5ca5c4c35865c8ff8f700686db8f5c2889152aabe27a36b1ccd9082"} Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.691978 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q2ktk" event={"ID":"e24a9e84-0151-4204-9391-510da9049b58","Type":"ContainerStarted","Data":"aae69842996fcf4d62a14e1cc73b68f2326287d0fa75d4587acb47862b1d40bd"} Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.691990 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q2ktk" event={"ID":"e24a9e84-0151-4204-9391-510da9049b58","Type":"ContainerStarted","Data":"579e5cfaaf23aebfd8438b72b056f7c60b427ac2ecee1a5dbc52467500e15340"} Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.698144 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"171aa174-9338-4421-8393-9e23fbab7f1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a697c5a28f2c415b6f133c1c3bdaff0915418e3fcf0c889af0a822e1bdcbcc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://532faf6ec4021a35746a236a1ded78eccc9d71728c149f73c4263068b6951490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://456b5bd863b30c044246c6c8fe15ee7344ad053861724b5c42b88479578b9adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77216800c2b9bc04724591a5d5c5d4c9ddb9a75fcbc198c60800199a92db6f45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d985f342be7dff38ee8a2264a8dae534857e6cb0e7d0cf79b137d2ed6289bf80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88a99335c4d7fca93428173f7e0e096e418e0599ab030dfda10d8da0a5dc17a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15c143f8a71272e7b8696f98c23e481c8827d6ece0e03e51d87aacd14c888dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15c143f8a71272e7b8696f98c23e481c8827d6ece0e03e51d87aacd14c888dd3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T05:44:04Z\\\",\\\"message\\\":\\\"88018 6181 services_controller.go:453] Built service openshift-config-operator/metrics template LB for network=default: []services.LB{}\\\\nI1206 05:44:04.288204 6181 services_controller.go:443] Built service openshift-operator-lifecycle-manager/packageserver-service LB cluster-wide configs for network=default: 
[]services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.153\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:5443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1206 05:44:04.288216 6181 services_controller.go:454] Service openshift-config-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF1206 05:44:04.288255 6181 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network contro\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:44:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2gb79_openshift-ovn-kubernetes(171aa174-9338-4421-8393-9e23fbab7f1e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9980ec9b2b1a751a691d1f657a2176d49a7583906d741adbe3754ec4c73b152c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://667a77abe2226e4438
47d05fc2475438095a30648777a424239d3f35c199b236\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2gb79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:08Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.706692 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c145932d-56db-49da-ab40-1f9faeaa004e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a89503b511d9f2da9fb5e41e1adb5f5c60e14909aebd4495baafc709177fa56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd2bcad3ce23a8998a578ecc373a4e8028eefab1e056cf1081eb2406ff9398f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382d71a067b68d67891c063f0a4c833b7433e15db0e05b36e46f24bbbb1626ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b838411bb65919138a421cd17775561b7764a006894daa8f2bed711287c1914\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:08Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.714806 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:08Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.723608 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47p7c\" (UniqueName: \"kubernetes.io/projected/7e8909c1-5ab7-4c3f-aba1-436c64849e8a-kube-api-access-47p7c\") pod \"network-metrics-daemon-8fw28\" (UID: \"7e8909c1-5ab7-4c3f-aba1-436c64849e8a\") " pod="openshift-multus/network-metrics-daemon-8fw28" 
Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.723650 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7e8909c1-5ab7-4c3f-aba1-436c64849e8a-metrics-certs\") pod \"network-metrics-daemon-8fw28\" (UID: \"7e8909c1-5ab7-4c3f-aba1-436c64849e8a\") " pod="openshift-multus/network-metrics-daemon-8fw28" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.725135 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5mf9m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d7ccbf-e88d-4045-8d89-633470de7aca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dffbae27a10ae2e00933637da0e30fc5b8574f2ee8edb5b4b09c37a2d05e980a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"na
me\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2609f7ad60b4f90d844d4f4d8573587826cbdf4c0b76f6b8a1b5cddec86ad7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2609f7ad60b4f90d844d4f4d8573587826cbdf4c0b76f6b8a1b5cddec86ad7d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ef7c618da4d94a4956f082f96b9be994042458ff524e9e1172f526a4135e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93ef7c618da4d94a4956f082f96b9be994042458ff524e9e1172f526a4135e1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c91a8199b1f8ede480f2bd92335fe3c8dc0d0e11caa2cf3bd213c234d0779f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c91a8199b1f8ede480f2bd92335fe3c8dc0d0e11caa2cf3bd213c234d0779f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:57Z\\\",\
\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://047dc4e7f8f30d1f9cf824ee4059c99c07cd9f29bd985e0e00ac22febb297f1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://047dc4e7f8f30d1f9cf824ee4059c99c07cd9f29bd985e0e00ac22febb297f1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cadff80e27f4e0103110c153c52936b931bfd70ca4363a3caa44ec4f746d01dc\\\",\\\"image\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadff80e27f4e0103110c153c52936b931bfd70ca4363a3caa44ec4f746d01dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e4034c91b0b19898468eccdc22e059ad7e830ef9e4ff0bea88d447f6a09c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16e4034c91b0b19898468eccdc22e059ad7e830ef9e4ff0bea88d447f6a09c64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:44:00Z\\\"}},\\\"volumeMounts\\\":
[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5mf9m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:08Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.732862 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q2ktk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e24a9e84-0151-4204-9391-510da9049b58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh6vd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh6vd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:44:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q2ktk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:08Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.740287 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77ebef5bd728c37a6b74ab523c480048959280fdfc9afd8c60b2aca9cd05336d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61a23652af66be599ba9357cb31709e7b4a3f0e4767c758617e6cc5cd9b43941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g7qjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:08Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.740338 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.740436 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.740447 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.740462 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.740472 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:08Z","lastTransitionTime":"2025-12-06T05:44:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.752984 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"171aa174-9338-4421-8393-9e23fbab7f1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a697c5a28f2c415b6f133c1c3bdaff0915418e3fcf0c889af0a822e1bdcbcc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://532faf6ec4021a35746a236a1ded78eccc9d71728c149f73c4263068b6951490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://456b5bd863b30c044246c6c8fe15ee7344ad053861724b5c42b88479578b9adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77216800c2b9bc04724591a5d5c5d4c9ddb9a75fcbc198c60800199a92db6f45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d985f342be7dff38ee8a2264a8dae534857e6cb0e7d0cf79b137d2ed6289bf80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88a99335c4d7fca93428173f7e0e096e418e0599ab030dfda10d8da0a5dc17a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15c143f8a71272e7b8696f98c23e481c8827d6ece0e03e51d87aacd14c888dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15c143f8a71272e7b8696f98c23e481c8827d6ece0e03e51d87aacd14c888dd3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T05:44:04Z\\\",\\\"message\\\":\\\"88018 6181 services_controller.go:453] Built service openshift-config-operator/metrics template LB for network=default: []services.LB{}\\\\nI1206 05:44:04.288204 6181 services_controller.go:443] Built service openshift-operator-lifecycle-manager/packageserver-service LB cluster-wide configs for network=default: 
[]services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.153\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:5443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1206 05:44:04.288216 6181 services_controller.go:454] Service openshift-config-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF1206 05:44:04.288255 6181 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network contro\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:44:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2gb79_openshift-ovn-kubernetes(171aa174-9338-4421-8393-9e23fbab7f1e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9980ec9b2b1a751a691d1f657a2176d49a7583906d741adbe3754ec4c73b152c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://667a77abe2226e4438
47d05fc2475438095a30648777a424239d3f35c199b236\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2gb79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:08Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.760976 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:08Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.769512 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-684r5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc59542d-ee4a-414d-b096-86716cb56db5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7128ab1b2f48b8ce3ecf3a2154cb1b1dc93a58cdfed2c11e7724201a5675ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbfjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-684r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:08Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.778715 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5mf9m" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d7ccbf-e88d-4045-8d89-633470de7aca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dffbae27a10ae2e00933637da0e30fc5b8574f2ee8edb5b4b09c37a2d05e980a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2609f7ad60b4f90d844d4f4d8573587826cbdf4c0b76f6b8a
1b5cddec86ad7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2609f7ad60b4f90d844d4f4d8573587826cbdf4c0b76f6b8a1b5cddec86ad7d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ef7c618da4d94a4956f082f96b9be994042458ff524e9e1172f526a4135e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93ef7c618da4d94a4956f082f96b9be994042458ff524e9e1172f526a4135e1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:
43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c91a8199b1f8ede480f2bd92335fe3c8dc0d0e11caa2cf3bd213c234d0779f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c91a8199b1f8ede480f2bd92335fe3c8dc0d0e11caa2cf3bd213c234d0779f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://047dc4e7f8f30d1f9cf824ee4059c99c07cd9f29bd985e0e00ac22febb297f1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://047dc4e7f8f30d1f9cf824ee4059c99c07cd9f29bd985e0e00ac22febb297f1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cadff80e27f4e0103110c153c52936b931bfd70ca4363a3caa44ec4f746d01dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadff80e27f4e0103110c153c52936b931bfd70ca4363a3caa44ec4f746d01dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:59Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e4034c91b0b19898468eccdc22e059ad7e830ef9e4ff0bea88d447f6a09c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16e4034c91b0b19898468eccdc22e059ad7e830ef9e4ff0bea88d447f6a09c64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5mf9m\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:08Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.786761 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q2ktk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e24a9e84-0151-4204-9391-510da9049b58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aae69842996fcf4d62a14e1cc73b68f2326287d0fa75d4587acb47862b1d40bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh6vd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a1c3268a5ca5c4c35865c8ff8f700686db8f5c2889152aabe27a36b1ccd9082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh6vd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:44:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q2ktk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-12-06T05:44:08Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.795046 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c145932d-56db-49da-ab40-1f9faeaa004e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a89503b511d9f2da9fb5e41e1adb5f5c60e14909aebd4495baafc709177fa56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"cont
ainerID\\\":\\\"cri-o://6cd2bcad3ce23a8998a578ecc373a4e8028eefab1e056cf1081eb2406ff9398f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382d71a067b68d67891c063f0a4c833b7433e15db0e05b36e46f24bbbb1626ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b838411bb65919138a421cd17775561b7764a006894daa8f2bed711287c1914\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-con
troller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:08Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.803017 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:08Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.811478 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8e8d7d591deb47598776511be462724fabc5543e82b6a74edfc29fb01ccb977\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:08Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.819455 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77d63bf154094eece4d04d42186bc7f957f0b1ab0315c496bb8a785269184ed0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://cdccf2a58baf2a39276908ed60c86219657d8780a50630c20be6f8bc4c256fbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:08Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.824537 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47p7c\" (UniqueName: \"kubernetes.io/projected/7e8909c1-5ab7-4c3f-aba1-436c64849e8a-kube-api-access-47p7c\") pod \"network-metrics-daemon-8fw28\" (UID: \"7e8909c1-5ab7-4c3f-aba1-436c64849e8a\") " pod="openshift-multus/network-metrics-daemon-8fw28" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.824597 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/7e8909c1-5ab7-4c3f-aba1-436c64849e8a-metrics-certs\") pod \"network-metrics-daemon-8fw28\" (UID: \"7e8909c1-5ab7-4c3f-aba1-436c64849e8a\") " pod="openshift-multus/network-metrics-daemon-8fw28" Dec 06 05:44:08 crc kubenswrapper[4733]: E1206 05:44:08.824675 4733 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 05:44:08 crc kubenswrapper[4733]: E1206 05:44:08.824721 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e8909c1-5ab7-4c3f-aba1-436c64849e8a-metrics-certs podName:7e8909c1-5ab7-4c3f-aba1-436c64849e8a nodeName:}" failed. No retries permitted until 2025-12-06 05:44:09.324710941 +0000 UTC m=+33.189922053 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7e8909c1-5ab7-4c3f-aba1-436c64849e8a-metrics-certs") pod "network-metrics-daemon-8fw28" (UID: "7e8909c1-5ab7-4c3f-aba1-436c64849e8a") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.827456 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faa17b3f3dd91488b73e0e7f3101c5e9932dd0c1573946bbd91819f1ec51202e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T05:44:08Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.838915 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47p7c\" (UniqueName: \"kubernetes.io/projected/7e8909c1-5ab7-4c3f-aba1-436c64849e8a-kube-api-access-47p7c\") pod \"network-metrics-daemon-8fw28\" (UID: \"7e8909c1-5ab7-4c3f-aba1-436c64849e8a\") " pod="openshift-multus/network-metrics-daemon-8fw28" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.840222 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqsfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25abcf60-fe34-446b-9df8-1ed8e5102975\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://163c90ba7e6470fb31049cd650d1384d35d87b94a9193184bfe3ea16feddf307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9
c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb5ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqsfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:08Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.842447 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.842554 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.842612 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.842677 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.842735 4733 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:08Z","lastTransitionTime":"2025-12-06T05:44:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.867705 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnxdh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d5c4ca7-33ee-4858-948f-631753eb056e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f4a50e7cb4197e088c193a3bedc8acb2720a885e588e56051fbfa1e102099e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-
node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrbr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnxdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:08Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.883028 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8fw28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e8909c1-5ab7-4c3f-aba1-436c64849e8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47p7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47p7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:44:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8fw28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:08Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:08 crc 
kubenswrapper[4733]: I1206 05:44:08.894455 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0700e329-54b6-4cfe-b2de-5cee58cf1aa5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32c4d87738481c8df3d76e820a98f3dacfbc11edc26fab1dfe51b56d207168d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57cbb938bc4ae9
b8a71a1e2369a50a243964fc8c683d2d1840f1f3e199f1b923\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eeebbb46cf11d2306ad457106c3b2179039986bfdd412c4bb64791d86edb4e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://801ea1b9ed221d20f0d729436b8f5f1946df6e66f06aa86db5764f18da3f0b1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://fe65f4b55b8e8ed93d424276f1fc06f31770302538e5122a5b09da36734d86dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 05:43:48.722254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 05:43:48.730728 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3849141372/tls.crt::/tmp/serving-cert-3849141372/tls.key\\\\\\\"\\\\nI1206 05:43:54.083506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 05:43:54.085960 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 05:43:54.085979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 05:43:54.086001 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 05:43:54.086006 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 05:43:54.089093 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 05:43:54.089162 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089190 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089211 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 05:43:54.089229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 05:43:54.089245 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 
05:43:54.089261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 05:43:54.089103 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 05:43:54.090706 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8edc1fd8220a58b6a3f6d08d6d003c6d350fa69588866d84de63f95ecd4367f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:08Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.904734 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:08Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.945163 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.945278 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.945288 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:08 crc 
kubenswrapper[4733]: I1206 05:44:08.945321 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:08 crc kubenswrapper[4733]: I1206 05:44:08.945332 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:08Z","lastTransitionTime":"2025-12-06T05:44:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:09 crc kubenswrapper[4733]: I1206 05:44:09.047634 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:09 crc kubenswrapper[4733]: I1206 05:44:09.047672 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:09 crc kubenswrapper[4733]: I1206 05:44:09.047681 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:09 crc kubenswrapper[4733]: I1206 05:44:09.047695 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:09 crc kubenswrapper[4733]: I1206 05:44:09.047705 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:09Z","lastTransitionTime":"2025-12-06T05:44:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:09 crc kubenswrapper[4733]: I1206 05:44:09.150140 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:09 crc kubenswrapper[4733]: I1206 05:44:09.150172 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:09 crc kubenswrapper[4733]: I1206 05:44:09.150184 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:09 crc kubenswrapper[4733]: I1206 05:44:09.150196 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:09 crc kubenswrapper[4733]: I1206 05:44:09.150205 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:09Z","lastTransitionTime":"2025-12-06T05:44:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:09 crc kubenswrapper[4733]: I1206 05:44:09.252269 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:09 crc kubenswrapper[4733]: I1206 05:44:09.252298 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:09 crc kubenswrapper[4733]: I1206 05:44:09.252329 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:09 crc kubenswrapper[4733]: I1206 05:44:09.252343 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:09 crc kubenswrapper[4733]: I1206 05:44:09.252355 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:09Z","lastTransitionTime":"2025-12-06T05:44:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:09 crc kubenswrapper[4733]: I1206 05:44:09.329469 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7e8909c1-5ab7-4c3f-aba1-436c64849e8a-metrics-certs\") pod \"network-metrics-daemon-8fw28\" (UID: \"7e8909c1-5ab7-4c3f-aba1-436c64849e8a\") " pod="openshift-multus/network-metrics-daemon-8fw28" Dec 06 05:44:09 crc kubenswrapper[4733]: E1206 05:44:09.329587 4733 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 05:44:09 crc kubenswrapper[4733]: E1206 05:44:09.329645 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e8909c1-5ab7-4c3f-aba1-436c64849e8a-metrics-certs podName:7e8909c1-5ab7-4c3f-aba1-436c64849e8a nodeName:}" failed. No retries permitted until 2025-12-06 05:44:10.329632413 +0000 UTC m=+34.194843524 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7e8909c1-5ab7-4c3f-aba1-436c64849e8a-metrics-certs") pod "network-metrics-daemon-8fw28" (UID: "7e8909c1-5ab7-4c3f-aba1-436c64849e8a") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 05:44:09 crc kubenswrapper[4733]: I1206 05:44:09.353931 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:09 crc kubenswrapper[4733]: I1206 05:44:09.353960 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:09 crc kubenswrapper[4733]: I1206 05:44:09.353969 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:09 crc kubenswrapper[4733]: I1206 05:44:09.353984 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:09 crc kubenswrapper[4733]: I1206 05:44:09.353995 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:09Z","lastTransitionTime":"2025-12-06T05:44:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:09 crc kubenswrapper[4733]: I1206 05:44:09.456595 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:09 crc kubenswrapper[4733]: I1206 05:44:09.456626 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:09 crc kubenswrapper[4733]: I1206 05:44:09.456638 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:09 crc kubenswrapper[4733]: I1206 05:44:09.456656 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:09 crc kubenswrapper[4733]: I1206 05:44:09.456665 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:09Z","lastTransitionTime":"2025-12-06T05:44:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:09 crc kubenswrapper[4733]: I1206 05:44:09.558654 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:09 crc kubenswrapper[4733]: I1206 05:44:09.558689 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:09 crc kubenswrapper[4733]: I1206 05:44:09.558700 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:09 crc kubenswrapper[4733]: I1206 05:44:09.558712 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:09 crc kubenswrapper[4733]: I1206 05:44:09.558721 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:09Z","lastTransitionTime":"2025-12-06T05:44:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:09 crc kubenswrapper[4733]: I1206 05:44:09.660540 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:09 crc kubenswrapper[4733]: I1206 05:44:09.660582 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:09 crc kubenswrapper[4733]: I1206 05:44:09.660592 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:09 crc kubenswrapper[4733]: I1206 05:44:09.660607 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:09 crc kubenswrapper[4733]: I1206 05:44:09.660620 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:09Z","lastTransitionTime":"2025-12-06T05:44:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:09 crc kubenswrapper[4733]: I1206 05:44:09.762469 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:09 crc kubenswrapper[4733]: I1206 05:44:09.762512 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:09 crc kubenswrapper[4733]: I1206 05:44:09.762521 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:09 crc kubenswrapper[4733]: I1206 05:44:09.762535 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:09 crc kubenswrapper[4733]: I1206 05:44:09.762544 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:09Z","lastTransitionTime":"2025-12-06T05:44:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:09 crc kubenswrapper[4733]: I1206 05:44:09.864272 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:09 crc kubenswrapper[4733]: I1206 05:44:09.864359 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:09 crc kubenswrapper[4733]: I1206 05:44:09.864369 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:09 crc kubenswrapper[4733]: I1206 05:44:09.864379 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:09 crc kubenswrapper[4733]: I1206 05:44:09.864387 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:09Z","lastTransitionTime":"2025-12-06T05:44:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:09 crc kubenswrapper[4733]: I1206 05:44:09.965721 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:09 crc kubenswrapper[4733]: I1206 05:44:09.965750 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:09 crc kubenswrapper[4733]: I1206 05:44:09.965797 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:09 crc kubenswrapper[4733]: I1206 05:44:09.965811 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:09 crc kubenswrapper[4733]: I1206 05:44:09.965820 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:09Z","lastTransitionTime":"2025-12-06T05:44:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:10 crc kubenswrapper[4733]: I1206 05:44:10.067246 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:10 crc kubenswrapper[4733]: I1206 05:44:10.067282 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:10 crc kubenswrapper[4733]: I1206 05:44:10.067294 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:10 crc kubenswrapper[4733]: I1206 05:44:10.067340 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:10 crc kubenswrapper[4733]: I1206 05:44:10.067354 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:10Z","lastTransitionTime":"2025-12-06T05:44:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:10 crc kubenswrapper[4733]: I1206 05:44:10.168879 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:10 crc kubenswrapper[4733]: I1206 05:44:10.168908 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:10 crc kubenswrapper[4733]: I1206 05:44:10.168921 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:10 crc kubenswrapper[4733]: I1206 05:44:10.168934 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:10 crc kubenswrapper[4733]: I1206 05:44:10.168940 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:10Z","lastTransitionTime":"2025-12-06T05:44:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:10 crc kubenswrapper[4733]: I1206 05:44:10.237247 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 05:44:10 crc kubenswrapper[4733]: I1206 05:44:10.237351 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:44:10 crc kubenswrapper[4733]: I1206 05:44:10.237383 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:44:10 crc kubenswrapper[4733]: E1206 05:44:10.237447 4733 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 05:44:10 crc kubenswrapper[4733]: E1206 05:44:10.237496 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 05:44:26.237483228 +0000 UTC m=+50.102694338 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 05:44:10 crc kubenswrapper[4733]: E1206 05:44:10.237699 4733 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 05:44:10 crc kubenswrapper[4733]: E1206 05:44:10.237774 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 05:44:26.237751029 +0000 UTC m=+50.102962150 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:44:10 crc kubenswrapper[4733]: E1206 05:44:10.237881 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 05:44:26.237859953 +0000 UTC m=+50.103071064 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 05:44:10 crc kubenswrapper[4733]: I1206 05:44:10.270759 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:10 crc kubenswrapper[4733]: I1206 05:44:10.270786 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:10 crc kubenswrapper[4733]: I1206 05:44:10.270795 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:10 crc kubenswrapper[4733]: I1206 05:44:10.270806 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:10 crc kubenswrapper[4733]: I1206 05:44:10.270813 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:10Z","lastTransitionTime":"2025-12-06T05:44:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:10 crc kubenswrapper[4733]: I1206 05:44:10.338286 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7e8909c1-5ab7-4c3f-aba1-436c64849e8a-metrics-certs\") pod \"network-metrics-daemon-8fw28\" (UID: \"7e8909c1-5ab7-4c3f-aba1-436c64849e8a\") " pod="openshift-multus/network-metrics-daemon-8fw28" Dec 06 05:44:10 crc kubenswrapper[4733]: I1206 05:44:10.338415 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:44:10 crc kubenswrapper[4733]: I1206 05:44:10.338516 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 05:44:10 crc kubenswrapper[4733]: E1206 05:44:10.338572 4733 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 05:44:10 crc kubenswrapper[4733]: E1206 05:44:10.338627 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e8909c1-5ab7-4c3f-aba1-436c64849e8a-metrics-certs podName:7e8909c1-5ab7-4c3f-aba1-436c64849e8a nodeName:}" failed. No retries permitted until 2025-12-06 05:44:12.338614666 +0000 UTC m=+36.203825776 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7e8909c1-5ab7-4c3f-aba1-436c64849e8a-metrics-certs") pod "network-metrics-daemon-8fw28" (UID: "7e8909c1-5ab7-4c3f-aba1-436c64849e8a") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 05:44:10 crc kubenswrapper[4733]: E1206 05:44:10.338628 4733 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 05:44:10 crc kubenswrapper[4733]: E1206 05:44:10.338649 4733 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 05:44:10 crc kubenswrapper[4733]: E1206 05:44:10.338659 4733 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 05:44:10 crc kubenswrapper[4733]: E1206 05:44:10.338692 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-06 05:44:26.338680309 +0000 UTC m=+50.203891419 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 05:44:10 crc kubenswrapper[4733]: E1206 05:44:10.338842 4733 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 05:44:10 crc kubenswrapper[4733]: E1206 05:44:10.338905 4733 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 05:44:10 crc kubenswrapper[4733]: E1206 05:44:10.338954 4733 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 05:44:10 crc kubenswrapper[4733]: E1206 05:44:10.339036 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-06 05:44:26.339027709 +0000 UTC m=+50.204238820 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 05:44:10 crc kubenswrapper[4733]: I1206 05:44:10.373048 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:10 crc kubenswrapper[4733]: I1206 05:44:10.373142 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:10 crc kubenswrapper[4733]: I1206 05:44:10.373221 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:10 crc kubenswrapper[4733]: I1206 05:44:10.373277 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:10 crc kubenswrapper[4733]: I1206 05:44:10.373357 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:10Z","lastTransitionTime":"2025-12-06T05:44:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:10 crc kubenswrapper[4733]: I1206 05:44:10.475729 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:10 crc kubenswrapper[4733]: I1206 05:44:10.475763 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:10 crc kubenswrapper[4733]: I1206 05:44:10.475773 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:10 crc kubenswrapper[4733]: I1206 05:44:10.475788 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:10 crc kubenswrapper[4733]: I1206 05:44:10.475797 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:10Z","lastTransitionTime":"2025-12-06T05:44:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:10 crc kubenswrapper[4733]: I1206 05:44:10.484067 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 05:44:10 crc kubenswrapper[4733]: I1206 05:44:10.484081 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:44:10 crc kubenswrapper[4733]: I1206 05:44:10.484088 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8fw28" Dec 06 05:44:10 crc kubenswrapper[4733]: E1206 05:44:10.484172 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 05:44:10 crc kubenswrapper[4733]: E1206 05:44:10.484297 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 05:44:10 crc kubenswrapper[4733]: E1206 05:44:10.484402 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8fw28" podUID="7e8909c1-5ab7-4c3f-aba1-436c64849e8a" Dec 06 05:44:10 crc kubenswrapper[4733]: I1206 05:44:10.484530 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:44:10 crc kubenswrapper[4733]: E1206 05:44:10.484622 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 05:44:10 crc kubenswrapper[4733]: I1206 05:44:10.577316 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:10 crc kubenswrapper[4733]: I1206 05:44:10.577339 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:10 crc kubenswrapper[4733]: I1206 05:44:10.577348 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:10 crc kubenswrapper[4733]: I1206 05:44:10.577359 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:10 crc kubenswrapper[4733]: I1206 05:44:10.577367 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:10Z","lastTransitionTime":"2025-12-06T05:44:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:10 crc kubenswrapper[4733]: I1206 05:44:10.679682 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:10 crc kubenswrapper[4733]: I1206 05:44:10.679711 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:10 crc kubenswrapper[4733]: I1206 05:44:10.679722 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:10 crc kubenswrapper[4733]: I1206 05:44:10.679733 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:10 crc kubenswrapper[4733]: I1206 05:44:10.679742 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:10Z","lastTransitionTime":"2025-12-06T05:44:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:10 crc kubenswrapper[4733]: I1206 05:44:10.781598 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:10 crc kubenswrapper[4733]: I1206 05:44:10.781635 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:10 crc kubenswrapper[4733]: I1206 05:44:10.781644 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:10 crc kubenswrapper[4733]: I1206 05:44:10.781657 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:10 crc kubenswrapper[4733]: I1206 05:44:10.781668 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:10Z","lastTransitionTime":"2025-12-06T05:44:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:10 crc kubenswrapper[4733]: I1206 05:44:10.883706 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:10 crc kubenswrapper[4733]: I1206 05:44:10.883768 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:10 crc kubenswrapper[4733]: I1206 05:44:10.883780 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:10 crc kubenswrapper[4733]: I1206 05:44:10.883793 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:10 crc kubenswrapper[4733]: I1206 05:44:10.883805 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:10Z","lastTransitionTime":"2025-12-06T05:44:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:10 crc kubenswrapper[4733]: I1206 05:44:10.985493 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:10 crc kubenswrapper[4733]: I1206 05:44:10.985515 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:10 crc kubenswrapper[4733]: I1206 05:44:10.985524 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:10 crc kubenswrapper[4733]: I1206 05:44:10.985538 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:10 crc kubenswrapper[4733]: I1206 05:44:10.985547 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:10Z","lastTransitionTime":"2025-12-06T05:44:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:11 crc kubenswrapper[4733]: I1206 05:44:11.087213 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:11 crc kubenswrapper[4733]: I1206 05:44:11.087238 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:11 crc kubenswrapper[4733]: I1206 05:44:11.087248 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:11 crc kubenswrapper[4733]: I1206 05:44:11.087260 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:11 crc kubenswrapper[4733]: I1206 05:44:11.087268 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:11Z","lastTransitionTime":"2025-12-06T05:44:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:11 crc kubenswrapper[4733]: I1206 05:44:11.189416 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:11 crc kubenswrapper[4733]: I1206 05:44:11.189439 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:11 crc kubenswrapper[4733]: I1206 05:44:11.189449 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:11 crc kubenswrapper[4733]: I1206 05:44:11.189459 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:11 crc kubenswrapper[4733]: I1206 05:44:11.189479 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:11Z","lastTransitionTime":"2025-12-06T05:44:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:11 crc kubenswrapper[4733]: I1206 05:44:11.291598 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:11 crc kubenswrapper[4733]: I1206 05:44:11.291635 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:11 crc kubenswrapper[4733]: I1206 05:44:11.291647 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:11 crc kubenswrapper[4733]: I1206 05:44:11.291662 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:11 crc kubenswrapper[4733]: I1206 05:44:11.291672 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:11Z","lastTransitionTime":"2025-12-06T05:44:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:11 crc kubenswrapper[4733]: I1206 05:44:11.393835 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:11 crc kubenswrapper[4733]: I1206 05:44:11.393861 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:11 crc kubenswrapper[4733]: I1206 05:44:11.393869 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:11 crc kubenswrapper[4733]: I1206 05:44:11.393881 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:11 crc kubenswrapper[4733]: I1206 05:44:11.393888 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:11Z","lastTransitionTime":"2025-12-06T05:44:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:11 crc kubenswrapper[4733]: I1206 05:44:11.495290 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:11 crc kubenswrapper[4733]: I1206 05:44:11.495345 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:11 crc kubenswrapper[4733]: I1206 05:44:11.495355 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:11 crc kubenswrapper[4733]: I1206 05:44:11.495364 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:11 crc kubenswrapper[4733]: I1206 05:44:11.495373 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:11Z","lastTransitionTime":"2025-12-06T05:44:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:11 crc kubenswrapper[4733]: I1206 05:44:11.597343 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:11 crc kubenswrapper[4733]: I1206 05:44:11.597383 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:11 crc kubenswrapper[4733]: I1206 05:44:11.597394 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:11 crc kubenswrapper[4733]: I1206 05:44:11.597407 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:11 crc kubenswrapper[4733]: I1206 05:44:11.597418 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:11Z","lastTransitionTime":"2025-12-06T05:44:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:11 crc kubenswrapper[4733]: I1206 05:44:11.698534 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:11 crc kubenswrapper[4733]: I1206 05:44:11.698566 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:11 crc kubenswrapper[4733]: I1206 05:44:11.698577 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:11 crc kubenswrapper[4733]: I1206 05:44:11.698587 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:11 crc kubenswrapper[4733]: I1206 05:44:11.698595 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:11Z","lastTransitionTime":"2025-12-06T05:44:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:11 crc kubenswrapper[4733]: I1206 05:44:11.800478 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:11 crc kubenswrapper[4733]: I1206 05:44:11.800516 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:11 crc kubenswrapper[4733]: I1206 05:44:11.800526 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:11 crc kubenswrapper[4733]: I1206 05:44:11.800540 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:11 crc kubenswrapper[4733]: I1206 05:44:11.800549 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:11Z","lastTransitionTime":"2025-12-06T05:44:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:11 crc kubenswrapper[4733]: I1206 05:44:11.902945 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:11 crc kubenswrapper[4733]: I1206 05:44:11.902980 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:11 crc kubenswrapper[4733]: I1206 05:44:11.902990 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:11 crc kubenswrapper[4733]: I1206 05:44:11.903002 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:11 crc kubenswrapper[4733]: I1206 05:44:11.903010 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:11Z","lastTransitionTime":"2025-12-06T05:44:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:12 crc kubenswrapper[4733]: I1206 05:44:12.005678 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:12 crc kubenswrapper[4733]: I1206 05:44:12.005712 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:12 crc kubenswrapper[4733]: I1206 05:44:12.005721 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:12 crc kubenswrapper[4733]: I1206 05:44:12.005735 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:12 crc kubenswrapper[4733]: I1206 05:44:12.005745 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:12Z","lastTransitionTime":"2025-12-06T05:44:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:12 crc kubenswrapper[4733]: I1206 05:44:12.107566 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:12 crc kubenswrapper[4733]: I1206 05:44:12.107592 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:12 crc kubenswrapper[4733]: I1206 05:44:12.107600 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:12 crc kubenswrapper[4733]: I1206 05:44:12.107612 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:12 crc kubenswrapper[4733]: I1206 05:44:12.107620 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:12Z","lastTransitionTime":"2025-12-06T05:44:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:12 crc kubenswrapper[4733]: I1206 05:44:12.209297 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:12 crc kubenswrapper[4733]: I1206 05:44:12.209339 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:12 crc kubenswrapper[4733]: I1206 05:44:12.209348 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:12 crc kubenswrapper[4733]: I1206 05:44:12.209362 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:12 crc kubenswrapper[4733]: I1206 05:44:12.209371 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:12Z","lastTransitionTime":"2025-12-06T05:44:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:12 crc kubenswrapper[4733]: I1206 05:44:12.310912 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:12 crc kubenswrapper[4733]: I1206 05:44:12.310948 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:12 crc kubenswrapper[4733]: I1206 05:44:12.310958 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:12 crc kubenswrapper[4733]: I1206 05:44:12.310972 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:12 crc kubenswrapper[4733]: I1206 05:44:12.310982 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:12Z","lastTransitionTime":"2025-12-06T05:44:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:12 crc kubenswrapper[4733]: I1206 05:44:12.355532 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7e8909c1-5ab7-4c3f-aba1-436c64849e8a-metrics-certs\") pod \"network-metrics-daemon-8fw28\" (UID: \"7e8909c1-5ab7-4c3f-aba1-436c64849e8a\") " pod="openshift-multus/network-metrics-daemon-8fw28" Dec 06 05:44:12 crc kubenswrapper[4733]: E1206 05:44:12.355627 4733 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 05:44:12 crc kubenswrapper[4733]: E1206 05:44:12.355670 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e8909c1-5ab7-4c3f-aba1-436c64849e8a-metrics-certs podName:7e8909c1-5ab7-4c3f-aba1-436c64849e8a nodeName:}" failed. No retries permitted until 2025-12-06 05:44:16.355658204 +0000 UTC m=+40.220869315 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7e8909c1-5ab7-4c3f-aba1-436c64849e8a-metrics-certs") pod "network-metrics-daemon-8fw28" (UID: "7e8909c1-5ab7-4c3f-aba1-436c64849e8a") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 05:44:12 crc kubenswrapper[4733]: I1206 05:44:12.412424 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:12 crc kubenswrapper[4733]: I1206 05:44:12.412466 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:12 crc kubenswrapper[4733]: I1206 05:44:12.412490 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:12 crc kubenswrapper[4733]: I1206 05:44:12.412504 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:12 crc kubenswrapper[4733]: I1206 05:44:12.412512 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:12Z","lastTransitionTime":"2025-12-06T05:44:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:12 crc kubenswrapper[4733]: I1206 05:44:12.484247 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 05:44:12 crc kubenswrapper[4733]: I1206 05:44:12.484298 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8fw28" Dec 06 05:44:12 crc kubenswrapper[4733]: I1206 05:44:12.484349 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:44:12 crc kubenswrapper[4733]: E1206 05:44:12.484486 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 05:44:12 crc kubenswrapper[4733]: I1206 05:44:12.484498 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:44:12 crc kubenswrapper[4733]: E1206 05:44:12.484558 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8fw28" podUID="7e8909c1-5ab7-4c3f-aba1-436c64849e8a" Dec 06 05:44:12 crc kubenswrapper[4733]: E1206 05:44:12.484589 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 05:44:12 crc kubenswrapper[4733]: E1206 05:44:12.484630 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 05:44:12 crc kubenswrapper[4733]: I1206 05:44:12.514701 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:12 crc kubenswrapper[4733]: I1206 05:44:12.514727 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:12 crc kubenswrapper[4733]: I1206 05:44:12.514736 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:12 crc kubenswrapper[4733]: I1206 05:44:12.514746 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:12 crc kubenswrapper[4733]: I1206 05:44:12.514754 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:12Z","lastTransitionTime":"2025-12-06T05:44:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:12 crc kubenswrapper[4733]: I1206 05:44:12.616830 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:12 crc kubenswrapper[4733]: I1206 05:44:12.616869 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:12 crc kubenswrapper[4733]: I1206 05:44:12.616881 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:12 crc kubenswrapper[4733]: I1206 05:44:12.616906 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:12 crc kubenswrapper[4733]: I1206 05:44:12.616918 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:12Z","lastTransitionTime":"2025-12-06T05:44:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:12 crc kubenswrapper[4733]: I1206 05:44:12.719050 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:12 crc kubenswrapper[4733]: I1206 05:44:12.719081 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:12 crc kubenswrapper[4733]: I1206 05:44:12.719091 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:12 crc kubenswrapper[4733]: I1206 05:44:12.719102 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:12 crc kubenswrapper[4733]: I1206 05:44:12.719112 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:12Z","lastTransitionTime":"2025-12-06T05:44:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:12 crc kubenswrapper[4733]: I1206 05:44:12.820551 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:12 crc kubenswrapper[4733]: I1206 05:44:12.820580 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:12 crc kubenswrapper[4733]: I1206 05:44:12.820591 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:12 crc kubenswrapper[4733]: I1206 05:44:12.820603 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:12 crc kubenswrapper[4733]: I1206 05:44:12.820612 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:12Z","lastTransitionTime":"2025-12-06T05:44:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:12 crc kubenswrapper[4733]: I1206 05:44:12.922057 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:12 crc kubenswrapper[4733]: I1206 05:44:12.922084 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:12 crc kubenswrapper[4733]: I1206 05:44:12.922095 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:12 crc kubenswrapper[4733]: I1206 05:44:12.922105 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:12 crc kubenswrapper[4733]: I1206 05:44:12.922112 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:12Z","lastTransitionTime":"2025-12-06T05:44:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:13 crc kubenswrapper[4733]: I1206 05:44:13.023884 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:13 crc kubenswrapper[4733]: I1206 05:44:13.023916 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:13 crc kubenswrapper[4733]: I1206 05:44:13.023930 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:13 crc kubenswrapper[4733]: I1206 05:44:13.023940 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:13 crc kubenswrapper[4733]: I1206 05:44:13.023950 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:13Z","lastTransitionTime":"2025-12-06T05:44:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:13 crc kubenswrapper[4733]: I1206 05:44:13.125533 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:13 crc kubenswrapper[4733]: I1206 05:44:13.125553 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:13 crc kubenswrapper[4733]: I1206 05:44:13.125560 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:13 crc kubenswrapper[4733]: I1206 05:44:13.125568 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:13 crc kubenswrapper[4733]: I1206 05:44:13.125574 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:13Z","lastTransitionTime":"2025-12-06T05:44:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:13 crc kubenswrapper[4733]: I1206 05:44:13.226947 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:13 crc kubenswrapper[4733]: I1206 05:44:13.227035 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:13 crc kubenswrapper[4733]: I1206 05:44:13.227090 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:13 crc kubenswrapper[4733]: I1206 05:44:13.227155 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:13 crc kubenswrapper[4733]: I1206 05:44:13.227216 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:13Z","lastTransitionTime":"2025-12-06T05:44:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:13 crc kubenswrapper[4733]: I1206 05:44:13.329343 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:13 crc kubenswrapper[4733]: I1206 05:44:13.329471 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:13 crc kubenswrapper[4733]: I1206 05:44:13.329524 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:13 crc kubenswrapper[4733]: I1206 05:44:13.329571 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:13 crc kubenswrapper[4733]: I1206 05:44:13.329630 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:13Z","lastTransitionTime":"2025-12-06T05:44:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:13 crc kubenswrapper[4733]: I1206 05:44:13.431548 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:13 crc kubenswrapper[4733]: I1206 05:44:13.431584 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:13 crc kubenswrapper[4733]: I1206 05:44:13.431595 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:13 crc kubenswrapper[4733]: I1206 05:44:13.431611 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:13 crc kubenswrapper[4733]: I1206 05:44:13.431622 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:13Z","lastTransitionTime":"2025-12-06T05:44:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:13 crc kubenswrapper[4733]: I1206 05:44:13.533378 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:13 crc kubenswrapper[4733]: I1206 05:44:13.533412 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:13 crc kubenswrapper[4733]: I1206 05:44:13.533421 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:13 crc kubenswrapper[4733]: I1206 05:44:13.533432 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:13 crc kubenswrapper[4733]: I1206 05:44:13.533450 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:13Z","lastTransitionTime":"2025-12-06T05:44:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:13 crc kubenswrapper[4733]: I1206 05:44:13.635298 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:13 crc kubenswrapper[4733]: I1206 05:44:13.635359 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:13 crc kubenswrapper[4733]: I1206 05:44:13.635369 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:13 crc kubenswrapper[4733]: I1206 05:44:13.635385 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:13 crc kubenswrapper[4733]: I1206 05:44:13.635396 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:13Z","lastTransitionTime":"2025-12-06T05:44:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:13 crc kubenswrapper[4733]: I1206 05:44:13.737467 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:13 crc kubenswrapper[4733]: I1206 05:44:13.738539 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:13 crc kubenswrapper[4733]: I1206 05:44:13.738549 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:13 crc kubenswrapper[4733]: I1206 05:44:13.738561 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:13 crc kubenswrapper[4733]: I1206 05:44:13.738570 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:13Z","lastTransitionTime":"2025-12-06T05:44:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:13 crc kubenswrapper[4733]: I1206 05:44:13.840766 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:13 crc kubenswrapper[4733]: I1206 05:44:13.840882 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:13 crc kubenswrapper[4733]: I1206 05:44:13.840948 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:13 crc kubenswrapper[4733]: I1206 05:44:13.841018 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:13 crc kubenswrapper[4733]: I1206 05:44:13.841081 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:13Z","lastTransitionTime":"2025-12-06T05:44:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:13 crc kubenswrapper[4733]: I1206 05:44:13.943158 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:13 crc kubenswrapper[4733]: I1206 05:44:13.943196 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:13 crc kubenswrapper[4733]: I1206 05:44:13.943205 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:13 crc kubenswrapper[4733]: I1206 05:44:13.943220 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:13 crc kubenswrapper[4733]: I1206 05:44:13.943231 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:13Z","lastTransitionTime":"2025-12-06T05:44:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:14 crc kubenswrapper[4733]: I1206 05:44:14.046125 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:14 crc kubenswrapper[4733]: I1206 05:44:14.046150 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:14 crc kubenswrapper[4733]: I1206 05:44:14.046162 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:14 crc kubenswrapper[4733]: I1206 05:44:14.046173 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:14 crc kubenswrapper[4733]: I1206 05:44:14.046181 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:14Z","lastTransitionTime":"2025-12-06T05:44:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:14 crc kubenswrapper[4733]: I1206 05:44:14.148267 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:14 crc kubenswrapper[4733]: I1206 05:44:14.148295 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:14 crc kubenswrapper[4733]: I1206 05:44:14.148328 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:14 crc kubenswrapper[4733]: I1206 05:44:14.148340 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:14 crc kubenswrapper[4733]: I1206 05:44:14.148348 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:14Z","lastTransitionTime":"2025-12-06T05:44:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:14 crc kubenswrapper[4733]: I1206 05:44:14.250341 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:14 crc kubenswrapper[4733]: I1206 05:44:14.250383 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:14 crc kubenswrapper[4733]: I1206 05:44:14.250392 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:14 crc kubenswrapper[4733]: I1206 05:44:14.250407 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:14 crc kubenswrapper[4733]: I1206 05:44:14.250418 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:14Z","lastTransitionTime":"2025-12-06T05:44:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:14 crc kubenswrapper[4733]: I1206 05:44:14.352341 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:14 crc kubenswrapper[4733]: I1206 05:44:14.352397 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:14 crc kubenswrapper[4733]: I1206 05:44:14.352407 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:14 crc kubenswrapper[4733]: I1206 05:44:14.352422 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:14 crc kubenswrapper[4733]: I1206 05:44:14.352432 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:14Z","lastTransitionTime":"2025-12-06T05:44:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:14 crc kubenswrapper[4733]: I1206 05:44:14.454476 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:14 crc kubenswrapper[4733]: I1206 05:44:14.454514 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:14 crc kubenswrapper[4733]: I1206 05:44:14.454523 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:14 crc kubenswrapper[4733]: I1206 05:44:14.454537 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:14 crc kubenswrapper[4733]: I1206 05:44:14.454549 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:14Z","lastTransitionTime":"2025-12-06T05:44:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:14 crc kubenswrapper[4733]: I1206 05:44:14.484544 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8fw28" Dec 06 05:44:14 crc kubenswrapper[4733]: I1206 05:44:14.484590 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 05:44:14 crc kubenswrapper[4733]: I1206 05:44:14.484620 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:44:14 crc kubenswrapper[4733]: E1206 05:44:14.484684 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8fw28" podUID="7e8909c1-5ab7-4c3f-aba1-436c64849e8a" Dec 06 05:44:14 crc kubenswrapper[4733]: I1206 05:44:14.484693 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:44:14 crc kubenswrapper[4733]: E1206 05:44:14.484796 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 05:44:14 crc kubenswrapper[4733]: E1206 05:44:14.484878 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 05:44:14 crc kubenswrapper[4733]: E1206 05:44:14.484967 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 05:44:14 crc kubenswrapper[4733]: I1206 05:44:14.556810 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:14 crc kubenswrapper[4733]: I1206 05:44:14.556838 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:14 crc kubenswrapper[4733]: I1206 05:44:14.556849 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:14 crc kubenswrapper[4733]: I1206 05:44:14.556860 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:14 crc kubenswrapper[4733]: I1206 05:44:14.556868 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:14Z","lastTransitionTime":"2025-12-06T05:44:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:14 crc kubenswrapper[4733]: I1206 05:44:14.658397 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:14 crc kubenswrapper[4733]: I1206 05:44:14.658444 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:14 crc kubenswrapper[4733]: I1206 05:44:14.658456 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:14 crc kubenswrapper[4733]: I1206 05:44:14.658469 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:14 crc kubenswrapper[4733]: I1206 05:44:14.658479 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:14Z","lastTransitionTime":"2025-12-06T05:44:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:14 crc kubenswrapper[4733]: I1206 05:44:14.759915 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:14 crc kubenswrapper[4733]: I1206 05:44:14.760017 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:14 crc kubenswrapper[4733]: I1206 05:44:14.760032 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:14 crc kubenswrapper[4733]: I1206 05:44:14.760044 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:14 crc kubenswrapper[4733]: I1206 05:44:14.760054 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:14Z","lastTransitionTime":"2025-12-06T05:44:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:14 crc kubenswrapper[4733]: I1206 05:44:14.861750 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:14 crc kubenswrapper[4733]: I1206 05:44:14.861780 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:14 crc kubenswrapper[4733]: I1206 05:44:14.861805 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:14 crc kubenswrapper[4733]: I1206 05:44:14.861829 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:14 crc kubenswrapper[4733]: I1206 05:44:14.861841 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:14Z","lastTransitionTime":"2025-12-06T05:44:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:14 crc kubenswrapper[4733]: I1206 05:44:14.963621 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:14 crc kubenswrapper[4733]: I1206 05:44:14.963651 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:14 crc kubenswrapper[4733]: I1206 05:44:14.963660 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:14 crc kubenswrapper[4733]: I1206 05:44:14.963669 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:14 crc kubenswrapper[4733]: I1206 05:44:14.963678 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:14Z","lastTransitionTime":"2025-12-06T05:44:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:15 crc kubenswrapper[4733]: I1206 05:44:15.065603 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:15 crc kubenswrapper[4733]: I1206 05:44:15.065651 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:15 crc kubenswrapper[4733]: I1206 05:44:15.065660 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:15 crc kubenswrapper[4733]: I1206 05:44:15.065677 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:15 crc kubenswrapper[4733]: I1206 05:44:15.065686 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:15Z","lastTransitionTime":"2025-12-06T05:44:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:15 crc kubenswrapper[4733]: I1206 05:44:15.167557 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:15 crc kubenswrapper[4733]: I1206 05:44:15.167587 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:15 crc kubenswrapper[4733]: I1206 05:44:15.167596 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:15 crc kubenswrapper[4733]: I1206 05:44:15.167624 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:15 crc kubenswrapper[4733]: I1206 05:44:15.167634 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:15Z","lastTransitionTime":"2025-12-06T05:44:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:15 crc kubenswrapper[4733]: I1206 05:44:15.269184 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:15 crc kubenswrapper[4733]: I1206 05:44:15.269210 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:15 crc kubenswrapper[4733]: I1206 05:44:15.269219 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:15 crc kubenswrapper[4733]: I1206 05:44:15.269230 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:15 crc kubenswrapper[4733]: I1206 05:44:15.269239 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:15Z","lastTransitionTime":"2025-12-06T05:44:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:15 crc kubenswrapper[4733]: I1206 05:44:15.371581 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:15 crc kubenswrapper[4733]: I1206 05:44:15.371626 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:15 crc kubenswrapper[4733]: I1206 05:44:15.371636 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:15 crc kubenswrapper[4733]: I1206 05:44:15.371649 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:15 crc kubenswrapper[4733]: I1206 05:44:15.371658 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:15Z","lastTransitionTime":"2025-12-06T05:44:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:15 crc kubenswrapper[4733]: I1206 05:44:15.473078 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:15 crc kubenswrapper[4733]: I1206 05:44:15.473108 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:15 crc kubenswrapper[4733]: I1206 05:44:15.473117 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:15 crc kubenswrapper[4733]: I1206 05:44:15.473148 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:15 crc kubenswrapper[4733]: I1206 05:44:15.473155 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:15Z","lastTransitionTime":"2025-12-06T05:44:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:15 crc kubenswrapper[4733]: I1206 05:44:15.575410 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:15 crc kubenswrapper[4733]: I1206 05:44:15.575453 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:15 crc kubenswrapper[4733]: I1206 05:44:15.575464 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:15 crc kubenswrapper[4733]: I1206 05:44:15.575479 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:15 crc kubenswrapper[4733]: I1206 05:44:15.575488 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:15Z","lastTransitionTime":"2025-12-06T05:44:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:15 crc kubenswrapper[4733]: I1206 05:44:15.677405 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:15 crc kubenswrapper[4733]: I1206 05:44:15.677441 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:15 crc kubenswrapper[4733]: I1206 05:44:15.677449 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:15 crc kubenswrapper[4733]: I1206 05:44:15.677459 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:15 crc kubenswrapper[4733]: I1206 05:44:15.677469 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:15Z","lastTransitionTime":"2025-12-06T05:44:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:15 crc kubenswrapper[4733]: I1206 05:44:15.778848 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:15 crc kubenswrapper[4733]: I1206 05:44:15.778914 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:15 crc kubenswrapper[4733]: I1206 05:44:15.778929 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:15 crc kubenswrapper[4733]: I1206 05:44:15.778965 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:15 crc kubenswrapper[4733]: I1206 05:44:15.778978 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:15Z","lastTransitionTime":"2025-12-06T05:44:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:15 crc kubenswrapper[4733]: I1206 05:44:15.881598 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:15 crc kubenswrapper[4733]: I1206 05:44:15.881630 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:15 crc kubenswrapper[4733]: I1206 05:44:15.881659 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:15 crc kubenswrapper[4733]: I1206 05:44:15.881678 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:15 crc kubenswrapper[4733]: I1206 05:44:15.881688 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:15Z","lastTransitionTime":"2025-12-06T05:44:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:15 crc kubenswrapper[4733]: I1206 05:44:15.984271 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:15 crc kubenswrapper[4733]: I1206 05:44:15.984295 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:15 crc kubenswrapper[4733]: I1206 05:44:15.984324 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:15 crc kubenswrapper[4733]: I1206 05:44:15.984336 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:15 crc kubenswrapper[4733]: I1206 05:44:15.984344 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:15Z","lastTransitionTime":"2025-12-06T05:44:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.086082 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.086119 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.086128 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.086157 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.086166 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:16Z","lastTransitionTime":"2025-12-06T05:44:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.187615 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.187656 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.187668 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.187682 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.187693 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:16Z","lastTransitionTime":"2025-12-06T05:44:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.268249 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.268924 4733 scope.go:117] "RemoveContainer" containerID="15c143f8a71272e7b8696f98c23e481c8827d6ece0e03e51d87aacd14c888dd3" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.289140 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.289350 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.289359 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.289373 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.289382 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:16Z","lastTransitionTime":"2025-12-06T05:44:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.387883 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7e8909c1-5ab7-4c3f-aba1-436c64849e8a-metrics-certs\") pod \"network-metrics-daemon-8fw28\" (UID: \"7e8909c1-5ab7-4c3f-aba1-436c64849e8a\") " pod="openshift-multus/network-metrics-daemon-8fw28" Dec 06 05:44:16 crc kubenswrapper[4733]: E1206 05:44:16.388822 4733 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 05:44:16 crc kubenswrapper[4733]: E1206 05:44:16.388884 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e8909c1-5ab7-4c3f-aba1-436c64849e8a-metrics-certs podName:7e8909c1-5ab7-4c3f-aba1-436c64849e8a nodeName:}" failed. No retries permitted until 2025-12-06 05:44:24.388869781 +0000 UTC m=+48.254080892 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7e8909c1-5ab7-4c3f-aba1-436c64849e8a-metrics-certs") pod "network-metrics-daemon-8fw28" (UID: "7e8909c1-5ab7-4c3f-aba1-436c64849e8a") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.391463 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.391504 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.391530 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.391545 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.391556 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:16Z","lastTransitionTime":"2025-12-06T05:44:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.484284 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 05:44:16 crc kubenswrapper[4733]: E1206 05:44:16.484398 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.484556 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:44:16 crc kubenswrapper[4733]: E1206 05:44:16.484668 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.484738 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8fw28" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.484784 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:44:16 crc kubenswrapper[4733]: E1206 05:44:16.484809 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8fw28" podUID="7e8909c1-5ab7-4c3f-aba1-436c64849e8a" Dec 06 05:44:16 crc kubenswrapper[4733]: E1206 05:44:16.484857 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.493294 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.493350 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.493360 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.493374 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.493385 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:16Z","lastTransitionTime":"2025-12-06T05:44:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.497939 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8e8d7d591deb47598776511be462724fabc5543e82b6a74edfc29fb01ccb977\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:16Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.512146 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77d63bf154094eece4d04d42186bc7f957f0b1ab0315c496bb8a785269184ed0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdccf2a58baf2a39276908ed60c86219657d8780a50630c20be6f8bc4c256fbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:16Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.530363 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faa17b3f3dd91488b73e0e7f3101c5e9932dd0c1573946bbd91819f1ec51202e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T05:44:16Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.543500 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqsfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25abcf60-fe34-446b-9df8-1ed8e5102975\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://163c90ba7e6470fb31049cd650d1384d35d87b94a9193184bfe3ea16feddf307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb5ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqsfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:16Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.554856 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnxdh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d5c4ca7-33ee-4858-948f-631753eb056e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f4a50e7cb4197e088c193a3bedc8acb2720a885e588e56051fbfa1e102099e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrbr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnxdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:16Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.563211 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8fw28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e8909c1-5ab7-4c3f-aba1-436c64849e8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47p7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47p7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:44:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8fw28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:16Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:16 crc 
kubenswrapper[4733]: I1206 05:44:16.573390 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:16Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.583332 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0700e329-54b6-4cfe-b2de-5cee58cf1aa5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32c4d87738481c8df3d76e820a98f3dacfbc11edc26fab1dfe51b56d207168d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57cbb938bc4ae9b8a71a1e2369a50a243964fc8c683d2d1840f1f3e199f1b923\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eeebbb46cf11d2306ad457106c3b2179039986bfdd412c4bb64791d86edb4e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://801ea1b9ed221d20f0d729436b8f5f1946df6e66f06aa86db5764f18da3f0b1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe65f4b55b8e8ed93d424276f1fc06f31770302538e5122a5b09da36734d86dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T05:43:54Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 05:43:48.722254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 05:43:48.730728 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3849141372/tls.crt::/tmp/serving-cert-3849141372/tls.key\\\\\\\"\\\\nI1206 05:43:54.083506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 05:43:54.085960 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 05:43:54.085979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 05:43:54.086001 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 05:43:54.086006 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 05:43:54.089093 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 05:43:54.089162 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089190 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089211 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 05:43:54.089229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 05:43:54.089245 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 05:43:54.089261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 05:43:54.089103 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1206 05:43:54.090706 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8edc1fd8220a58b6a3f6d08d6d003c6d350fa69588866d84de63f95ecd4367f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f6
2dfbc60934130c77b42a3a442adc33f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:16Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.592605 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-684r5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc59542d-ee4a-414d-b096-86716cb56db5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7128ab1b2f48b8ce3ecf3a2154cb1b1dc93a58cdfed2c11e7724201a5675ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbfjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-684r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:16Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.594734 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:16 crc 
kubenswrapper[4733]: I1206 05:44:16.594758 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.594770 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.594783 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.594794 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:16Z","lastTransitionTime":"2025-12-06T05:44:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.601582 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77ebef5bd728c37a6b74ab523c480048959280fdfc9afd8c60b2aca9cd05336d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61a23652af66be599ba9357cb31709e7b4a3f0e4
767c758617e6cc5cd9b43941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g7qjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:16Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.614559 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"171aa174-9338-4421-8393-9e23fbab7f1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a697c5a28f2c415b6f133c1c3bdaff0915418e3fcf0c889af0a822e1bdcbcc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://532faf6ec4021a35746a236a1ded78eccc9d71728c149f73c4263068b6951490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://456b5bd863b30c044246c6c8fe15ee7344ad053861724b5c42b88479578b9adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77216800c2b9bc04724591a5d5c5d4c9ddb9a75fcbc198c60800199a92db6f45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d985f342be7dff38ee8a2264a8dae534857e6cb0e7d0cf79b137d2ed6289bf80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88a99335c4d7fca93428173f7e0e096e418e0599ab030dfda10d8da0a5dc17a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15c143f8a71272e7b8696f98c23e481c8827d6ece0e03e51d87aacd14c888dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15c143f8a71272e7b8696f98c23e481c8827d6ece0e03e51d87aacd14c888dd3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T05:44:04Z\\\",\\\"message\\\":\\\"88018 6181 services_controller.go:453] Built service openshift-config-operator/metrics template LB for network=default: []services.LB{}\\\\nI1206 05:44:04.288204 6181 services_controller.go:443] Built service openshift-operator-lifecycle-manager/packageserver-service LB cluster-wide configs for network=default: 
[]services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.153\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:5443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1206 05:44:04.288216 6181 services_controller.go:454] Service openshift-config-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF1206 05:44:04.288255 6181 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network contro\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:44:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2gb79_openshift-ovn-kubernetes(171aa174-9338-4421-8393-9e23fbab7f1e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9980ec9b2b1a751a691d1f657a2176d49a7583906d741adbe3754ec4c73b152c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://667a77abe2226e4438
47d05fc2475438095a30648777a424239d3f35c199b236\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2gb79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:16Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.623052 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:16Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.632264 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:16Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.642458 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5mf9m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d7ccbf-e88d-4045-8d89-633470de7aca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dffbae27a10ae2e00933637da0e30fc5b8574f2ee8edb5b4b09c37a2d05e980a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2609f7ad60b4f90d844d4f4d8573587826cbdf4c0b76f6b8a1b5cddec86ad7d0\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2609f7ad60b4f90d844d4f4d8573587826cbdf4c0b76f6b8a1b5cddec86ad7d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ef7c618da4d94a4956f082f96b9be994042458ff524e9e1172f526a4135e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93ef7c618da4d94a4956f082f96b9be994042458ff524e9e1172f526a4135e1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c91a8199b1f8ede480f2bd92335fe3c8dc0d0e11caa2cf3bd213c234d0779f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c91a8199b1f8ede480f2bd92335fe3c8dc0d0e11caa2cf3bd213c234d0779f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://047dc
4e7f8f30d1f9cf824ee4059c99c07cd9f29bd985e0e00ac22febb297f1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://047dc4e7f8f30d1f9cf824ee4059c99c07cd9f29bd985e0e00ac22febb297f1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cadff80e27f4e0103110c153c52936b931bfd70ca4363a3caa44ec4f746d01dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadff80e27f4e0103110c153c52936b931bfd70ca4363a3caa44ec4f746d01dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:59Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e4034c91b0b19898468eccdc22e059ad7e830ef9e4ff0bea88d447f6a09c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16e4034c91b0b19898468eccdc22e059ad7e830ef9e4ff0bea88d447f6a09c64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5mf9m\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:16Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.650840 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q2ktk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e24a9e84-0151-4204-9391-510da9049b58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aae69842996fcf4d62a14e1cc73b68f2326287d0fa75d4587acb47862b1d40bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh6vd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a1c3268a5ca5c4c35865c8ff8f700686db8f5c2889152aabe27a36b1ccd9082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh6vd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:44:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q2ktk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-06T05:44:16Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.660035 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c145932d-56db-49da-ab40-1f9faeaa004e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a89503b511d9f2da9fb5e41e1adb5f5c60e14909aebd4495baafc709177fa56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\
"cri-o://6cd2bcad3ce23a8998a578ecc373a4e8028eefab1e056cf1081eb2406ff9398f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382d71a067b68d67891c063f0a4c833b7433e15db0e05b36e46f24bbbb1626ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b838411bb65919138a421cd17775561b7764a006894daa8f2bed711287c1914\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager
-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:16Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.696837 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.696902 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.696914 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.696930 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.696940 4733 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:16Z","lastTransitionTime":"2025-12-06T05:44:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.714995 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2gb79_171aa174-9338-4421-8393-9e23fbab7f1e/ovnkube-controller/1.log" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.717274 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" event={"ID":"171aa174-9338-4421-8393-9e23fbab7f1e","Type":"ContainerStarted","Data":"19ed6e23b5df7eda4e1271f8c8ff0b9202270a73a4aa074b3625fcbc0114470c"} Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.717767 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.730513 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c145932d-56db-49da-ab40-1f9faeaa004e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a89503b511d9f2da9fb5e41e1adb5f5c60e14909aebd4495baafc709177fa56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd2bcad3ce23a8998a578ecc373a4e8028eefab1e056cf1081eb2406ff9398f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382d71a067b68d67891c063f0a4c833b7433e15db0e05b36e46f24bbbb1626ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b838411bb65919138a421cd17775561b7764a006894daa8f2bed711287c1914\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:16Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.739670 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:16Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.750990 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5mf9m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d7ccbf-e88d-4045-8d89-633470de7aca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dffbae27a10ae2e00933637da0e30fc5b8574f2ee8edb5b4b09c37a2d05e980a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2609f7ad60b4f90d844d4f4d8573587826cbdf4c0b76f6b8a1b5cddec86ad7d0\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2609f7ad60b4f90d844d4f4d8573587826cbdf4c0b76f6b8a1b5cddec86ad7d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ef7c618da4d94a4956f082f96b9be994042458ff524e9e1172f526a4135e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93ef7c618da4d94a4956f082f96b9be994042458ff524e9e1172f526a4135e1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c91a8199b1f8ede480f2bd92335fe3c8dc0d0e11caa2cf3bd213c234d0779f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c91a8199b1f8ede480f2bd92335fe3c8dc0d0e11caa2cf3bd213c234d0779f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://047dc
4e7f8f30d1f9cf824ee4059c99c07cd9f29bd985e0e00ac22febb297f1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://047dc4e7f8f30d1f9cf824ee4059c99c07cd9f29bd985e0e00ac22febb297f1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cadff80e27f4e0103110c153c52936b931bfd70ca4363a3caa44ec4f746d01dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadff80e27f4e0103110c153c52936b931bfd70ca4363a3caa44ec4f746d01dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:59Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e4034c91b0b19898468eccdc22e059ad7e830ef9e4ff0bea88d447f6a09c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16e4034c91b0b19898468eccdc22e059ad7e830ef9e4ff0bea88d447f6a09c64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5mf9m\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:16Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.759121 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q2ktk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e24a9e84-0151-4204-9391-510da9049b58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aae69842996fcf4d62a14e1cc73b68f2326287d0fa75d4587acb47862b1d40bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh6vd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a1c3268a5ca5c4c35865c8ff8f700686db8f5c2889152aabe27a36b1ccd9082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh6vd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:44:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q2ktk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-06T05:44:16Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.766663 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faa17b3f3dd91488b73e0e7f3101c5e9932dd0c1573946bbd91819f1ec51202e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:16Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.772998 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqsfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25abcf60-fe34-446b-9df8-1ed8e5102975\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://163c90ba7e6470fb31049cd650d1384d35d87b94a9193184bfe3ea16feddf307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb5ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqsfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:16Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.780804 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnxdh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d5c4ca7-33ee-4858-948f-631753eb056e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f4a50e7cb4197e088c193a3bedc8acb2720a885e588e56051fbfa1e102099e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrbr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnxdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:16Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.790872 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8e8d7d591deb47598776511be462724fabc5543e82b6a74edfc29fb01ccb977\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:16Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.799166 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.799194 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.799204 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.799220 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.799230 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:16Z","lastTransitionTime":"2025-12-06T05:44:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.800946 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77d63bf154094eece4d04d42186bc7f957f0b1ab0315c496bb8a785269184ed0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdccf2a58baf2a39276908ed60c86219657d8780a50630c20be6f8bc4c256fbb\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:16Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.809012 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8fw28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e8909c1-5ab7-4c3f-aba1-436c64849e8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47p7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47p7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:44:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8fw28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:16Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:16 crc 
kubenswrapper[4733]: I1206 05:44:16.819332 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0700e329-54b6-4cfe-b2de-5cee58cf1aa5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32c4d87738481c8df3d76e820a98f3dacfbc11edc26fab1dfe51b56d207168d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57cbb938bc4ae9
b8a71a1e2369a50a243964fc8c683d2d1840f1f3e199f1b923\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eeebbb46cf11d2306ad457106c3b2179039986bfdd412c4bb64791d86edb4e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://801ea1b9ed221d20f0d729436b8f5f1946df6e66f06aa86db5764f18da3f0b1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://fe65f4b55b8e8ed93d424276f1fc06f31770302538e5122a5b09da36734d86dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 05:43:48.722254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 05:43:48.730728 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3849141372/tls.crt::/tmp/serving-cert-3849141372/tls.key\\\\\\\"\\\\nI1206 05:43:54.083506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 05:43:54.085960 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 05:43:54.085979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 05:43:54.086001 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 05:43:54.086006 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 05:43:54.089093 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 05:43:54.089162 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089190 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089211 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 05:43:54.089229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 05:43:54.089245 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 
05:43:54.089261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 05:43:54.089103 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 05:43:54.090706 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8edc1fd8220a58b6a3f6d08d6d003c6d350fa69588866d84de63f95ecd4367f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:16Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.828895 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:16Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.840802 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:16Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.850254 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-684r5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc59542d-ee4a-414d-b096-86716cb56db5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7128ab1b2f48b8ce3ecf3a2154cb1b1dc93a58cdfed2c11e7724201a5675ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbfjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-684r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:16Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.858726 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77ebef5bd728c37a6b74ab523c480048959280fdfc9afd8c60b2aca9cd05336d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61a23652af66
be599ba9357cb31709e7b4a3f0e4767c758617e6cc5cd9b43941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g7qjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:16Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.872127 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"171aa174-9338-4421-8393-9e23fbab7f1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a697c5a28f2c415b6f133c1c3bdaff0915418e3fcf0c889af0a822e1bdcbcc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://532faf6ec4021a35746a236a1ded78eccc9d71728c149f73c4263068b6951490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://456b5bd863b30c044246c6c8fe15ee7344ad053861724b5c42b88479578b9adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77216800c2b9bc04724591a5d5c5d4c9ddb9a75fcbc198c60800199a92db6f45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d985f342be7dff38ee8a2264a8dae534857e6cb0e7d0cf79b137d2ed6289bf80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88a99335c4d7fca93428173f7e0e096e418e0599ab030dfda10d8da0a5dc17a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ed6e23b5df7eda4e1271f8c8ff0b9202270a73a4aa074b3625fcbc0114470c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15c143f8a71272e7b8696f98c23e481c8827d6ece0e03e51d87aacd14c888dd3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T05:44:04Z\\\",\\\"message\\\":\\\"88018 6181 services_controller.go:453] Built service openshift-config-operator/metrics template LB for network=default: []services.LB{}\\\\nI1206 05:44:04.288204 6181 services_controller.go:443] Built service openshift-operator-lifecycle-manager/packageserver-service LB cluster-wide configs for network=default: 
[]services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.153\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:5443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1206 05:44:04.288216 6181 services_controller.go:454] Service openshift-config-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF1206 05:44:04.288255 6181 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network 
contro\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:44:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9980ec9b2b1a751a691d1f657a2176d49a7583906d741adbe3754ec4c73b152c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2gb79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:16Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.901377 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.901406 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.901415 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.901438 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:16 crc kubenswrapper[4733]: I1206 05:44:16.901451 4733 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:16Z","lastTransitionTime":"2025-12-06T05:44:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.003676 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.003728 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.003738 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.003754 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.003780 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:17Z","lastTransitionTime":"2025-12-06T05:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.105777 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.105827 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.105836 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.105848 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.105857 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:17Z","lastTransitionTime":"2025-12-06T05:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.207886 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.207913 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.207922 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.207935 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.207944 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:17Z","lastTransitionTime":"2025-12-06T05:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.309989 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.310026 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.310035 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.310049 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.310059 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:17Z","lastTransitionTime":"2025-12-06T05:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.411286 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.411334 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.411345 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.411359 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.411369 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:17Z","lastTransitionTime":"2025-12-06T05:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.513277 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.513332 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.513341 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.513352 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.513361 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:17Z","lastTransitionTime":"2025-12-06T05:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.615323 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.615353 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.615364 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.615375 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.615384 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:17Z","lastTransitionTime":"2025-12-06T05:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.717287 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.717325 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.717334 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.717345 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.717353 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:17Z","lastTransitionTime":"2025-12-06T05:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.720267 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2gb79_171aa174-9338-4421-8393-9e23fbab7f1e/ovnkube-controller/2.log" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.720748 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2gb79_171aa174-9338-4421-8393-9e23fbab7f1e/ovnkube-controller/1.log" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.727283 4733 generic.go:334] "Generic (PLEG): container finished" podID="171aa174-9338-4421-8393-9e23fbab7f1e" containerID="19ed6e23b5df7eda4e1271f8c8ff0b9202270a73a4aa074b3625fcbc0114470c" exitCode=1 Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.727336 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" event={"ID":"171aa174-9338-4421-8393-9e23fbab7f1e","Type":"ContainerDied","Data":"19ed6e23b5df7eda4e1271f8c8ff0b9202270a73a4aa074b3625fcbc0114470c"} Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.727370 4733 scope.go:117] "RemoveContainer" containerID="15c143f8a71272e7b8696f98c23e481c8827d6ece0e03e51d87aacd14c888dd3" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.727839 4733 scope.go:117] "RemoveContainer" containerID="19ed6e23b5df7eda4e1271f8c8ff0b9202270a73a4aa074b3625fcbc0114470c" Dec 06 05:44:17 crc kubenswrapper[4733]: E1206 05:44:17.728270 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-2gb79_openshift-ovn-kubernetes(171aa174-9338-4421-8393-9e23fbab7f1e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" podUID="171aa174-9338-4421-8393-9e23fbab7f1e" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.740877 4733 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:17Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.750428 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-684r5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc59542d-ee4a-414d-b096-86716cb56db5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7128ab1b2f48b8ce3ecf3a2154cb1b1dc93a58cdfed2c11e7724201a5675ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbfjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-684r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:17Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.758572 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77ebef5bd728c37a6b74ab523c480048959280fdfc9afd8c60b2aca9cd05336d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61a23652af66
be599ba9357cb31709e7b4a3f0e4767c758617e6cc5cd9b43941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g7qjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:17Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.772482 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"171aa174-9338-4421-8393-9e23fbab7f1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a697c5a28f2c415b6f133c1c3bdaff0915418e3fcf0c889af0a822e1bdcbcc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://532faf6ec4021a35746a236a1ded78eccc9d71728c149f73c4263068b6951490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://456b5bd863b30c044246c6c8fe15ee7344ad053861724b5c42b88479578b9adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77216800c2b9bc04724591a5d5c5d4c9ddb9a75fcbc198c60800199a92db6f45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d985f342be7dff38ee8a2264a8dae534857e6cb0e7d0cf79b137d2ed6289bf80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88a99335c4d7fca93428173f7e0e096e418e0599ab030dfda10d8da0a5dc17a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ed6e23b5df7eda4e1271f8c8ff0b9202270a73a4aa074b3625fcbc0114470c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15c143f8a71272e7b8696f98c23e481c8827d6ece0e03e51d87aacd14c888dd3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T05:44:04Z\\\",\\\"message\\\":\\\"88018 6181 services_controller.go:453] Built service openshift-config-operator/metrics template LB for network=default: []services.LB{}\\\\nI1206 05:44:04.288204 6181 services_controller.go:443] Built service openshift-operator-lifecycle-manager/packageserver-service LB cluster-wide configs for network=default: 
[]services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.153\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:5443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1206 05:44:04.288216 6181 services_controller.go:454] Service openshift-config-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF1206 05:44:04.288255 6181 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network contro\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:44:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ed6e23b5df7eda4e1271f8c8ff0b9202270a73a4aa074b3625fcbc0114470c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T05:44:16Z\\\",\\\"message\\\":\\\"try/node-ca-pqsfd\\\\nI1206 05:44:16.869839 6419 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1206 05:44:16.869841 6419 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-pqsfd\\\\nI1206 05:44:16.869707 6419 services_controller.go:434] Service openshift-machine-config-operator/machine-config-controller retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{machine-config-controller 
openshift-machine-config-operator aa30290d-3a39-43ba-a212-6439bd680987 4486 0 2025-02-23 05:12:25 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:machine-config-controller] map[include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:mcc-proxy-tls service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0075d864b \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:metrics,Protocol:TCP,Port:9001,TargetPort:{0 9001 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{k8s-app: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:44:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPa
th\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9980ec9b2b1a751a691d1f657a2176d49a7583906d741adbe3754ec4c73b152c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2gb79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:17Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.781285 4733 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c145932d-56db-49da-ab40-1f9faeaa004e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a89503b511d9f2da9fb5e41e1adb5f5c60e14909aebd4495baafc709177fa56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd2bcad3ce23a8998a578ecc373a4e8028eefab1e056cf1081eb2406ff9398f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382d71a067b68d67891c063f0a4c833b7433e15db0e05b36e46f24bbbb1626ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b838411bb65919138a421cd17775561b7764a006894daa8f2bed711287c1914\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-
recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:17Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.789835 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:17Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.799610 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5mf9m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d7ccbf-e88d-4045-8d89-633470de7aca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dffbae27a10ae2e00933637da0e30fc5b8574f2ee8edb5b4b09c37a2d05e980a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2609f7ad60b4f90d844d4f4d8573587826cbdf4c0b76f6b8a1b5cddec86ad7d0\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2609f7ad60b4f90d844d4f4d8573587826cbdf4c0b76f6b8a1b5cddec86ad7d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ef7c618da4d94a4956f082f96b9be994042458ff524e9e1172f526a4135e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93ef7c618da4d94a4956f082f96b9be994042458ff524e9e1172f526a4135e1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c91a8199b1f8ede480f2bd92335fe3c8dc0d0e11caa2cf3bd213c234d0779f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c91a8199b1f8ede480f2bd92335fe3c8dc0d0e11caa2cf3bd213c234d0779f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://047dc
4e7f8f30d1f9cf824ee4059c99c07cd9f29bd985e0e00ac22febb297f1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://047dc4e7f8f30d1f9cf824ee4059c99c07cd9f29bd985e0e00ac22febb297f1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cadff80e27f4e0103110c153c52936b931bfd70ca4363a3caa44ec4f746d01dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadff80e27f4e0103110c153c52936b931bfd70ca4363a3caa44ec4f746d01dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:59Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e4034c91b0b19898468eccdc22e059ad7e830ef9e4ff0bea88d447f6a09c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16e4034c91b0b19898468eccdc22e059ad7e830ef9e4ff0bea88d447f6a09c64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5mf9m\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:17Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.807078 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q2ktk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e24a9e84-0151-4204-9391-510da9049b58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aae69842996fcf4d62a14e1cc73b68f2326287d0fa75d4587acb47862b1d40bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh6vd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a1c3268a5ca5c4c35865c8ff8f700686db8f5c2889152aabe27a36b1ccd9082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh6vd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:44:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q2ktk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-06T05:44:17Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.814106 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnxdh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d5c4ca7-33ee-4858-948f-631753eb056e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f4a50e7cb4197e088c193a3bedc8acb2720a885e588e56051fbfa1e102099e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-nrbr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnxdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:17Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.819566 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.819592 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.819603 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.819616 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.819625 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:17Z","lastTransitionTime":"2025-12-06T05:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.822652 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8e8d7d591deb47598776511be462724fabc5543e82b6a74edfc29fb01ccb977\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:17Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.830876 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77d63bf154094eece4d04d42186bc7f957f0b1ab0315c496bb8a785269184ed0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdccf2a58baf2a39276908ed60c86219657d8780a50630c20be6f8bc4c256fbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:17Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.838745 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faa17b3f3dd91488b73e0e7f3101c5e9932dd0c1573946bbd91819f1ec51202e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T05:44:17Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.845382 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqsfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25abcf60-fe34-446b-9df8-1ed8e5102975\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://163c90ba7e6470fb31049cd650d1384d35d87b94a9193184bfe3ea16feddf307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb5ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqsfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:17Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.854001 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.854409 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.854454 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.854472 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.854480 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:17Z","lastTransitionTime":"2025-12-06T05:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.854731 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8fw28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e8909c1-5ab7-4c3f-aba1-436c64849e8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47p7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47p7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:44:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8fw28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:17Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:17 crc 
kubenswrapper[4733]: E1206 05:44:17.863037 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6951a1f4-5aff-463d-98ee-6da28494341b\\\",\\\"systemUUID\\\":\\\"4b0d62b0-e895-479e-b261-2bd12b349187\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:17Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.864957 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0700e329-54b6-4cfe-b2de-5cee58cf1aa5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32c4d87738481c8df3d76e820a98f3dacfbc11edc26fab1dfe51b56d207168d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"
}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57cbb938bc4ae9b8a71a1e2369a50a243964fc8c683d2d1840f1f3e199f1b923\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eeebbb46cf11d2306ad457106c3b2179039986bfdd412c4bb64791d86edb4e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://801ea1b9ed221d20f0d729436b8f5f1946df6e66f06aa86db5764f18da3
f0b1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe65f4b55b8e8ed93d424276f1fc06f31770302538e5122a5b09da36734d86dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 05:43:48.722254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 05:43:48.730728 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3849141372/tls.crt::/tmp/serving-cert-3849141372/tls.key\\\\\\\"\\\\nI1206 05:43:54.083506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 05:43:54.085960 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 05:43:54.085979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 05:43:54.086001 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 05:43:54.086006 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 05:43:54.089093 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 05:43:54.089162 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089190 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 
05:43:54.089211 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 05:43:54.089229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 05:43:54.089245 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 05:43:54.089261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 05:43:54.089103 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 05:43:54.090706 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8edc1fd8220a58b6a3f6d08d6d003c6d350fa69588866d84de63f95ecd4367f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421
c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:17Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.865565 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.865593 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.865602 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.865619 4733 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.865628 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:17Z","lastTransitionTime":"2025-12-06T05:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.873818 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not 
be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:17Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:17 crc kubenswrapper[4733]: E1206 05:44:17.873991 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6951a1f4-5aff-463d-98ee-6da28494341b\\\",\\\"systemUUID\\\":\\\"4b0d62b0-e895-479e-b261-2bd12b349187\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:17Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.876502 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.876530 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.876538 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.876547 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.876554 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:17Z","lastTransitionTime":"2025-12-06T05:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:17 crc kubenswrapper[4733]: E1206 05:44:17.884971 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6951a1f4-5aff-463d-98ee-6da28494341b\\\",\\\"systemUUID\\\":\\\"4b0d62b0-e895-479e-b261-2bd12b349187\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:17Z is after 2025-08-24T17:21:41Z"
Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.887289 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.887393 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.887468 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.887544 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.887606 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:17Z","lastTransitionTime":"2025-12-06T05:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 06 05:44:17 crc kubenswrapper[4733]: E1206 05:44:17.895739 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6951a1f4-5aff-463d-98ee-6da28494341b\\\",\\\"systemUUID\\\":\\\"4b0d62b0-e895-479e-b261-2bd12b349187\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:17Z is after 2025-08-24T17:21:41Z"
Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.897885 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.897919 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.897929 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.897943 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.897953 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:17Z","lastTransitionTime":"2025-12-06T05:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 06 05:44:17 crc kubenswrapper[4733]: E1206 05:44:17.906690 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6951a1f4-5aff-463d-98ee-6da28494341b\\\",\\\"systemUUID\\\":\\\"4b0d62b0-e895-479e-b261-2bd12b349187\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:17Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:17 crc kubenswrapper[4733]: E1206 05:44:17.906803 4733 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.921152 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.921179 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.921187 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.921197 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:17 crc kubenswrapper[4733]: I1206 05:44:17.921206 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:17Z","lastTransitionTime":"2025-12-06T05:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.023127 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.023167 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.023177 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.023193 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.023203 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:18Z","lastTransitionTime":"2025-12-06T05:44:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.124886 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.124912 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.124921 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.124935 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.124944 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:18Z","lastTransitionTime":"2025-12-06T05:44:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.226945 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.226985 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.226996 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.227007 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.227014 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:18Z","lastTransitionTime":"2025-12-06T05:44:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.328991 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.329014 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.329024 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.329034 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.329041 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:18Z","lastTransitionTime":"2025-12-06T05:44:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.430730 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.430831 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.430901 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.430967 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.431023 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:18Z","lastTransitionTime":"2025-12-06T05:44:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.484510 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.484597 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.484622 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.484597 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8fw28" Dec 06 05:44:18 crc kubenswrapper[4733]: E1206 05:44:18.484798 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 05:44:18 crc kubenswrapper[4733]: E1206 05:44:18.484874 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 05:44:18 crc kubenswrapper[4733]: E1206 05:44:18.484988 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 05:44:18 crc kubenswrapper[4733]: E1206 05:44:18.485062 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8fw28" podUID="7e8909c1-5ab7-4c3f-aba1-436c64849e8a" Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.532413 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.532448 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.532466 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.532476 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.532484 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:18Z","lastTransitionTime":"2025-12-06T05:44:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.634035 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.634072 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.634085 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.634100 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.634108 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:18Z","lastTransitionTime":"2025-12-06T05:44:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.731266 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2gb79_171aa174-9338-4421-8393-9e23fbab7f1e/ovnkube-controller/2.log" Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.734117 4733 scope.go:117] "RemoveContainer" containerID="19ed6e23b5df7eda4e1271f8c8ff0b9202270a73a4aa074b3625fcbc0114470c" Dec 06 05:44:18 crc kubenswrapper[4733]: E1206 05:44:18.734239 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-2gb79_openshift-ovn-kubernetes(171aa174-9338-4421-8393-9e23fbab7f1e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" podUID="171aa174-9338-4421-8393-9e23fbab7f1e" Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.735014 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.735036 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.735044 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.735054 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.735061 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:18Z","lastTransitionTime":"2025-12-06T05:44:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.744322 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0700e329-54b6-4cfe-b2de-5cee58cf1aa5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32c4d87738481c8df3d76e820a98f3dacfbc11edc26fab1dfe51b56d207168d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57cbb938bc4ae9b8a71a1e2369a50a243964fc8c683d2d1840f1f3e199f1b923\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eeebbb46cf11d2306ad457106c3b2179039986bfdd412c4bb64791d86edb4e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://801ea1b9ed221d20f0d729436b8f5f1946df6e66f06aa86db5764f18da3f0b1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-ope
rator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe65f4b55b8e8ed93d424276f1fc06f31770302538e5122a5b09da36734d86dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 05:43:48.722254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 05:43:48.730728 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3849141372/tls.crt::/tmp/serving-cert-3849141372/tls.key\\\\\\\"\\\\nI1206 05:43:54.083506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 05:43:54.085960 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 05:43:54.085979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 05:43:54.086001 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 05:43:54.086006 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 05:43:54.089093 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 05:43:54.089162 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089190 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089211 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 05:43:54.089229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 
05:43:54.089245 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 05:43:54.089261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 05:43:54.089103 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 05:43:54.090706 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8edc1fd8220a58b6a3f6d08d6d003c6d350fa69588866d84de63f95ecd4367f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:18Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.754709 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:18Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.763637 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:18Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.773118 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-684r5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc59542d-ee4a-414d-b096-86716cb56db5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7128ab1b2f48b8ce3ecf3a2154cb1b1dc93a58cdfed2c11e7724201a5675ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbfjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-684r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:18Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.780848 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77ebef5bd728c37a6b74ab523c480048959280fdfc9afd8c60b2aca9cd05336d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61a23652af66
be599ba9357cb31709e7b4a3f0e4767c758617e6cc5cd9b43941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g7qjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:18Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.797123 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"171aa174-9338-4421-8393-9e23fbab7f1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a697c5a28f2c415b6f133c1c3bdaff0915418e3fcf0c889af0a822e1bdcbcc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://532faf6ec4021a35746a236a1ded78eccc9d71728c149f73c4263068b6951490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://456b5bd863b30c044246c6c8fe15ee7344ad053861724b5c42b88479578b9adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77216800c2b9bc04724591a5d5c5d4c9ddb9a75fcbc198c60800199a92db6f45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d985f342be7dff38ee8a2264a8dae534857e6cb0e7d0cf79b137d2ed6289bf80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88a99335c4d7fca93428173f7e0e096e418e0599ab030dfda10d8da0a5dc17a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ed6e23b5df7eda4e1271f8c8ff0b9202270a73a4aa074b3625fcbc0114470c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ed6e23b5df7eda4e1271f8c8ff0b9202270a73a4aa074b3625fcbc0114470c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T05:44:16Z\\\",\\\"message\\\":\\\"try/node-ca-pqsfd\\\\nI1206 05:44:16.869839 6419 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1206 05:44:16.869841 6419 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-pqsfd\\\\nI1206 05:44:16.869707 6419 services_controller.go:434] Service 
openshift-machine-config-operator/machine-config-controller retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{machine-config-controller openshift-machine-config-operator aa30290d-3a39-43ba-a212-6439bd680987 4486 0 2025-02-23 05:12:25 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:machine-config-controller] map[include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:mcc-proxy-tls service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0075d864b \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:metrics,Protocol:TCP,Port:9001,TargetPort:{0 9001 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{k8s-app: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:44:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2gb79_openshift-ovn-kubernetes(171aa174-9338-4421-8393-9e23fbab7f1e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9980ec9b2b1a751a691d1f657a2176d49a7583906d741adbe3754ec4c73b152c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://667a77abe2226e4438
47d05fc2475438095a30648777a424239d3f35c199b236\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2gb79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:18Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.805738 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c145932d-56db-49da-ab40-1f9faeaa004e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a89503b511d9f2da9fb5e41e1adb5f5c60e14909aebd4495baafc709177fa56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd2bcad3ce23a8998a578ecc373a4e8028eefab1e056cf1081eb2406ff9398f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382d71a067b68d67891c063f0a4c833b7433e15db0e05b36e46f24bbbb1626ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b838411bb65919138a421cd17775561b7764a006894daa8f2bed711287c1914\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:18Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.813649 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:18Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.822769 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5mf9m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d7ccbf-e88d-4045-8d89-633470de7aca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dffbae27a10ae2e00933637da0e30fc5b8574f2ee8edb5b4b09c37a2d05e980a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2609f7ad60b4f90d844d4f4d8573587826cbdf4c0b76f6b8a1b5cddec86ad7d0\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2609f7ad60b4f90d844d4f4d8573587826cbdf4c0b76f6b8a1b5cddec86ad7d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ef7c618da4d94a4956f082f96b9be994042458ff524e9e1172f526a4135e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93ef7c618da4d94a4956f082f96b9be994042458ff524e9e1172f526a4135e1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c91a8199b1f8ede480f2bd92335fe3c8dc0d0e11caa2cf3bd213c234d0779f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c91a8199b1f8ede480f2bd92335fe3c8dc0d0e11caa2cf3bd213c234d0779f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://047dc
4e7f8f30d1f9cf824ee4059c99c07cd9f29bd985e0e00ac22febb297f1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://047dc4e7f8f30d1f9cf824ee4059c99c07cd9f29bd985e0e00ac22febb297f1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cadff80e27f4e0103110c153c52936b931bfd70ca4363a3caa44ec4f746d01dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadff80e27f4e0103110c153c52936b931bfd70ca4363a3caa44ec4f746d01dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:59Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e4034c91b0b19898468eccdc22e059ad7e830ef9e4ff0bea88d447f6a09c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16e4034c91b0b19898468eccdc22e059ad7e830ef9e4ff0bea88d447f6a09c64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5mf9m\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:18Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.829703 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q2ktk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e24a9e84-0151-4204-9391-510da9049b58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aae69842996fcf4d62a14e1cc73b68f2326287d0fa75d4587acb47862b1d40bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh6vd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a1c3268a5ca5c4c35865c8ff8f700686db8f5c2889152aabe27a36b1ccd9082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh6vd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:44:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q2ktk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-06T05:44:18Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.836166 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqsfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25abcf60-fe34-446b-9df8-1ed8e5102975\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://163c90ba7e6470fb31049cd650d1384d35d87b94a9193184bfe3ea16feddf307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb5ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqsfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:18Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.837667 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.837700 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.837709 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.837722 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.837731 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:18Z","lastTransitionTime":"2025-12-06T05:44:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.843196 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnxdh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d5c4ca7-33ee-4858-948f-631753eb056e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f4a50e7cb4197e088c193a3bedc8acb2720a885e588e56051fbfa1e102099e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrbr6\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnxdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:18Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.851648 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8e8d7d591deb47598776511be462724fabc5543e82b6a74edfc29fb01ccb977\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:18Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.859572 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77d63bf154094eece4d04d42186bc7f957f0b1ab0315c496bb8a785269184ed0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdccf2a58baf2a39276908ed60c86219657d8780a50630c20be6f8bc4c256fbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:18Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.866566 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faa17b3f3dd91488b73e0e7f3101c5e9932dd0c1573946bbd91819f1ec51202e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T05:44:18Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.873273 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8fw28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e8909c1-5ab7-4c3f-aba1-436c64849e8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47p7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47p7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:44:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8fw28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:18Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:18 crc 
kubenswrapper[4733]: I1206 05:44:18.939324 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.939469 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.939480 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.939495 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:18 crc kubenswrapper[4733]: I1206 05:44:18.939506 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:18Z","lastTransitionTime":"2025-12-06T05:44:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:19 crc kubenswrapper[4733]: I1206 05:44:19.041391 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:19 crc kubenswrapper[4733]: I1206 05:44:19.041422 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:19 crc kubenswrapper[4733]: I1206 05:44:19.041431 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:19 crc kubenswrapper[4733]: I1206 05:44:19.041445 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:19 crc kubenswrapper[4733]: I1206 05:44:19.041453 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:19Z","lastTransitionTime":"2025-12-06T05:44:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:19 crc kubenswrapper[4733]: I1206 05:44:19.143513 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:19 crc kubenswrapper[4733]: I1206 05:44:19.143547 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:19 crc kubenswrapper[4733]: I1206 05:44:19.143556 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:19 crc kubenswrapper[4733]: I1206 05:44:19.143569 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:19 crc kubenswrapper[4733]: I1206 05:44:19.143578 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:19Z","lastTransitionTime":"2025-12-06T05:44:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:19 crc kubenswrapper[4733]: I1206 05:44:19.245323 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:19 crc kubenswrapper[4733]: I1206 05:44:19.245591 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:19 crc kubenswrapper[4733]: I1206 05:44:19.245668 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:19 crc kubenswrapper[4733]: I1206 05:44:19.245742 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:19 crc kubenswrapper[4733]: I1206 05:44:19.245802 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:19Z","lastTransitionTime":"2025-12-06T05:44:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:19 crc kubenswrapper[4733]: I1206 05:44:19.347538 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:19 crc kubenswrapper[4733]: I1206 05:44:19.347557 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:19 crc kubenswrapper[4733]: I1206 05:44:19.347567 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:19 crc kubenswrapper[4733]: I1206 05:44:19.347577 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:19 crc kubenswrapper[4733]: I1206 05:44:19.347585 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:19Z","lastTransitionTime":"2025-12-06T05:44:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:19 crc kubenswrapper[4733]: I1206 05:44:19.449388 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:19 crc kubenswrapper[4733]: I1206 05:44:19.449530 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:19 crc kubenswrapper[4733]: I1206 05:44:19.449591 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:19 crc kubenswrapper[4733]: I1206 05:44:19.449644 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:19 crc kubenswrapper[4733]: I1206 05:44:19.449702 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:19Z","lastTransitionTime":"2025-12-06T05:44:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:19 crc kubenswrapper[4733]: I1206 05:44:19.552035 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:19 crc kubenswrapper[4733]: I1206 05:44:19.552077 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:19 crc kubenswrapper[4733]: I1206 05:44:19.552089 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:19 crc kubenswrapper[4733]: I1206 05:44:19.552105 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:19 crc kubenswrapper[4733]: I1206 05:44:19.552115 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:19Z","lastTransitionTime":"2025-12-06T05:44:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:19 crc kubenswrapper[4733]: I1206 05:44:19.654257 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:19 crc kubenswrapper[4733]: I1206 05:44:19.654281 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:19 crc kubenswrapper[4733]: I1206 05:44:19.654289 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:19 crc kubenswrapper[4733]: I1206 05:44:19.654299 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:19 crc kubenswrapper[4733]: I1206 05:44:19.654318 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:19Z","lastTransitionTime":"2025-12-06T05:44:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:19 crc kubenswrapper[4733]: I1206 05:44:19.756118 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:19 crc kubenswrapper[4733]: I1206 05:44:19.756175 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:19 crc kubenswrapper[4733]: I1206 05:44:19.756184 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:19 crc kubenswrapper[4733]: I1206 05:44:19.756200 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:19 crc kubenswrapper[4733]: I1206 05:44:19.756209 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:19Z","lastTransitionTime":"2025-12-06T05:44:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:19 crc kubenswrapper[4733]: I1206 05:44:19.858155 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:19 crc kubenswrapper[4733]: I1206 05:44:19.858182 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:19 crc kubenswrapper[4733]: I1206 05:44:19.858191 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:19 crc kubenswrapper[4733]: I1206 05:44:19.858202 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:19 crc kubenswrapper[4733]: I1206 05:44:19.858211 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:19Z","lastTransitionTime":"2025-12-06T05:44:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:19 crc kubenswrapper[4733]: I1206 05:44:19.960272 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:19 crc kubenswrapper[4733]: I1206 05:44:19.960333 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:19 crc kubenswrapper[4733]: I1206 05:44:19.960345 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:19 crc kubenswrapper[4733]: I1206 05:44:19.960360 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:19 crc kubenswrapper[4733]: I1206 05:44:19.960369 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:19Z","lastTransitionTime":"2025-12-06T05:44:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:20 crc kubenswrapper[4733]: I1206 05:44:20.062347 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:20 crc kubenswrapper[4733]: I1206 05:44:20.062387 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:20 crc kubenswrapper[4733]: I1206 05:44:20.062396 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:20 crc kubenswrapper[4733]: I1206 05:44:20.062420 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:20 crc kubenswrapper[4733]: I1206 05:44:20.062432 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:20Z","lastTransitionTime":"2025-12-06T05:44:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:20 crc kubenswrapper[4733]: I1206 05:44:20.164204 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:20 crc kubenswrapper[4733]: I1206 05:44:20.164240 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:20 crc kubenswrapper[4733]: I1206 05:44:20.164249 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:20 crc kubenswrapper[4733]: I1206 05:44:20.164262 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:20 crc kubenswrapper[4733]: I1206 05:44:20.164271 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:20Z","lastTransitionTime":"2025-12-06T05:44:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:20 crc kubenswrapper[4733]: I1206 05:44:20.266510 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:20 crc kubenswrapper[4733]: I1206 05:44:20.266549 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:20 crc kubenswrapper[4733]: I1206 05:44:20.266558 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:20 crc kubenswrapper[4733]: I1206 05:44:20.266568 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:20 crc kubenswrapper[4733]: I1206 05:44:20.266575 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:20Z","lastTransitionTime":"2025-12-06T05:44:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:20 crc kubenswrapper[4733]: I1206 05:44:20.368241 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:20 crc kubenswrapper[4733]: I1206 05:44:20.368267 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:20 crc kubenswrapper[4733]: I1206 05:44:20.368275 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:20 crc kubenswrapper[4733]: I1206 05:44:20.368288 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:20 crc kubenswrapper[4733]: I1206 05:44:20.368296 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:20Z","lastTransitionTime":"2025-12-06T05:44:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:20 crc kubenswrapper[4733]: I1206 05:44:20.469878 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:20 crc kubenswrapper[4733]: I1206 05:44:20.469920 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:20 crc kubenswrapper[4733]: I1206 05:44:20.469930 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:20 crc kubenswrapper[4733]: I1206 05:44:20.469944 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:20 crc kubenswrapper[4733]: I1206 05:44:20.469952 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:20Z","lastTransitionTime":"2025-12-06T05:44:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:20 crc kubenswrapper[4733]: I1206 05:44:20.484244 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 05:44:20 crc kubenswrapper[4733]: I1206 05:44:20.484374 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8fw28" Dec 06 05:44:20 crc kubenswrapper[4733]: I1206 05:44:20.484453 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:44:20 crc kubenswrapper[4733]: E1206 05:44:20.484447 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 05:44:20 crc kubenswrapper[4733]: I1206 05:44:20.484468 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:44:20 crc kubenswrapper[4733]: E1206 05:44:20.484573 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8fw28" podUID="7e8909c1-5ab7-4c3f-aba1-436c64849e8a" Dec 06 05:44:20 crc kubenswrapper[4733]: E1206 05:44:20.484667 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 05:44:20 crc kubenswrapper[4733]: E1206 05:44:20.484821 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 05:44:20 crc kubenswrapper[4733]: I1206 05:44:20.572536 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:20 crc kubenswrapper[4733]: I1206 05:44:20.572671 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:20 crc kubenswrapper[4733]: I1206 05:44:20.572736 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:20 crc kubenswrapper[4733]: I1206 05:44:20.572798 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:20 crc kubenswrapper[4733]: I1206 05:44:20.572851 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:20Z","lastTransitionTime":"2025-12-06T05:44:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:20 crc kubenswrapper[4733]: I1206 05:44:20.674401 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:20 crc kubenswrapper[4733]: I1206 05:44:20.674441 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:20 crc kubenswrapper[4733]: I1206 05:44:20.674452 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:20 crc kubenswrapper[4733]: I1206 05:44:20.674465 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:20 crc kubenswrapper[4733]: I1206 05:44:20.674474 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:20Z","lastTransitionTime":"2025-12-06T05:44:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:20 crc kubenswrapper[4733]: I1206 05:44:20.775552 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:20 crc kubenswrapper[4733]: I1206 05:44:20.775572 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:20 crc kubenswrapper[4733]: I1206 05:44:20.775582 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:20 crc kubenswrapper[4733]: I1206 05:44:20.775591 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:20 crc kubenswrapper[4733]: I1206 05:44:20.775600 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:20Z","lastTransitionTime":"2025-12-06T05:44:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:20 crc kubenswrapper[4733]: I1206 05:44:20.877553 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:20 crc kubenswrapper[4733]: I1206 05:44:20.877598 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:20 crc kubenswrapper[4733]: I1206 05:44:20.877610 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:20 crc kubenswrapper[4733]: I1206 05:44:20.877622 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:20 crc kubenswrapper[4733]: I1206 05:44:20.877632 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:20Z","lastTransitionTime":"2025-12-06T05:44:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:20 crc kubenswrapper[4733]: I1206 05:44:20.979657 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:20 crc kubenswrapper[4733]: I1206 05:44:20.979695 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:20 crc kubenswrapper[4733]: I1206 05:44:20.979706 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:20 crc kubenswrapper[4733]: I1206 05:44:20.979720 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:20 crc kubenswrapper[4733]: I1206 05:44:20.979730 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:20Z","lastTransitionTime":"2025-12-06T05:44:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:21 crc kubenswrapper[4733]: I1206 05:44:21.081416 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:21 crc kubenswrapper[4733]: I1206 05:44:21.081445 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:21 crc kubenswrapper[4733]: I1206 05:44:21.081453 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:21 crc kubenswrapper[4733]: I1206 05:44:21.081464 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:21 crc kubenswrapper[4733]: I1206 05:44:21.081473 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:21Z","lastTransitionTime":"2025-12-06T05:44:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:21 crc kubenswrapper[4733]: I1206 05:44:21.183098 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:21 crc kubenswrapper[4733]: I1206 05:44:21.183134 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:21 crc kubenswrapper[4733]: I1206 05:44:21.183144 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:21 crc kubenswrapper[4733]: I1206 05:44:21.183158 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:21 crc kubenswrapper[4733]: I1206 05:44:21.183167 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:21Z","lastTransitionTime":"2025-12-06T05:44:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:21 crc kubenswrapper[4733]: I1206 05:44:21.285507 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:21 crc kubenswrapper[4733]: I1206 05:44:21.285535 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:21 crc kubenswrapper[4733]: I1206 05:44:21.285543 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:21 crc kubenswrapper[4733]: I1206 05:44:21.285555 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:21 crc kubenswrapper[4733]: I1206 05:44:21.285564 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:21Z","lastTransitionTime":"2025-12-06T05:44:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:21 crc kubenswrapper[4733]: I1206 05:44:21.387139 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:21 crc kubenswrapper[4733]: I1206 05:44:21.387169 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:21 crc kubenswrapper[4733]: I1206 05:44:21.387179 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:21 crc kubenswrapper[4733]: I1206 05:44:21.387189 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:21 crc kubenswrapper[4733]: I1206 05:44:21.387197 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:21Z","lastTransitionTime":"2025-12-06T05:44:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:21 crc kubenswrapper[4733]: I1206 05:44:21.488388 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:21 crc kubenswrapper[4733]: I1206 05:44:21.488436 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:21 crc kubenswrapper[4733]: I1206 05:44:21.488447 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:21 crc kubenswrapper[4733]: I1206 05:44:21.488458 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:21 crc kubenswrapper[4733]: I1206 05:44:21.488466 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:21Z","lastTransitionTime":"2025-12-06T05:44:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:21 crc kubenswrapper[4733]: I1206 05:44:21.590549 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:21 crc kubenswrapper[4733]: I1206 05:44:21.590598 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:21 crc kubenswrapper[4733]: I1206 05:44:21.590608 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:21 crc kubenswrapper[4733]: I1206 05:44:21.590621 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:21 crc kubenswrapper[4733]: I1206 05:44:21.590630 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:21Z","lastTransitionTime":"2025-12-06T05:44:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:21 crc kubenswrapper[4733]: I1206 05:44:21.692134 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:21 crc kubenswrapper[4733]: I1206 05:44:21.692512 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:21 crc kubenswrapper[4733]: I1206 05:44:21.692580 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:21 crc kubenswrapper[4733]: I1206 05:44:21.692656 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:21 crc kubenswrapper[4733]: I1206 05:44:21.692715 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:21Z","lastTransitionTime":"2025-12-06T05:44:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:21 crc kubenswrapper[4733]: I1206 05:44:21.794620 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:21 crc kubenswrapper[4733]: I1206 05:44:21.794646 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:21 crc kubenswrapper[4733]: I1206 05:44:21.794655 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:21 crc kubenswrapper[4733]: I1206 05:44:21.794665 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:21 crc kubenswrapper[4733]: I1206 05:44:21.794672 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:21Z","lastTransitionTime":"2025-12-06T05:44:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:21 crc kubenswrapper[4733]: I1206 05:44:21.896798 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:21 crc kubenswrapper[4733]: I1206 05:44:21.897053 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:21 crc kubenswrapper[4733]: I1206 05:44:21.897113 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:21 crc kubenswrapper[4733]: I1206 05:44:21.897189 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:21 crc kubenswrapper[4733]: I1206 05:44:21.897247 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:21Z","lastTransitionTime":"2025-12-06T05:44:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:21 crc kubenswrapper[4733]: I1206 05:44:21.998878 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:21 crc kubenswrapper[4733]: I1206 05:44:21.999027 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:21 crc kubenswrapper[4733]: I1206 05:44:21.999092 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:21 crc kubenswrapper[4733]: I1206 05:44:21.999153 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:21 crc kubenswrapper[4733]: I1206 05:44:21.999207 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:21Z","lastTransitionTime":"2025-12-06T05:44:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:22 crc kubenswrapper[4733]: I1206 05:44:22.100920 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:22 crc kubenswrapper[4733]: I1206 05:44:22.100963 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:22 crc kubenswrapper[4733]: I1206 05:44:22.100973 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:22 crc kubenswrapper[4733]: I1206 05:44:22.100983 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:22 crc kubenswrapper[4733]: I1206 05:44:22.100992 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:22Z","lastTransitionTime":"2025-12-06T05:44:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:22 crc kubenswrapper[4733]: I1206 05:44:22.203107 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:22 crc kubenswrapper[4733]: I1206 05:44:22.203139 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:22 crc kubenswrapper[4733]: I1206 05:44:22.203148 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:22 crc kubenswrapper[4733]: I1206 05:44:22.203160 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:22 crc kubenswrapper[4733]: I1206 05:44:22.203168 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:22Z","lastTransitionTime":"2025-12-06T05:44:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:22 crc kubenswrapper[4733]: I1206 05:44:22.305060 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:22 crc kubenswrapper[4733]: I1206 05:44:22.305091 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:22 crc kubenswrapper[4733]: I1206 05:44:22.305099 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:22 crc kubenswrapper[4733]: I1206 05:44:22.305109 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:22 crc kubenswrapper[4733]: I1206 05:44:22.305117 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:22Z","lastTransitionTime":"2025-12-06T05:44:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:22 crc kubenswrapper[4733]: I1206 05:44:22.407063 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:22 crc kubenswrapper[4733]: I1206 05:44:22.407094 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:22 crc kubenswrapper[4733]: I1206 05:44:22.407104 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:22 crc kubenswrapper[4733]: I1206 05:44:22.407114 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:22 crc kubenswrapper[4733]: I1206 05:44:22.407123 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:22Z","lastTransitionTime":"2025-12-06T05:44:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:22 crc kubenswrapper[4733]: I1206 05:44:22.484243 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:44:22 crc kubenswrapper[4733]: I1206 05:44:22.484294 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8fw28" Dec 06 05:44:22 crc kubenswrapper[4733]: I1206 05:44:22.484372 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:44:22 crc kubenswrapper[4733]: E1206 05:44:22.484363 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 05:44:22 crc kubenswrapper[4733]: I1206 05:44:22.484419 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 05:44:22 crc kubenswrapper[4733]: E1206 05:44:22.484497 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 05:44:22 crc kubenswrapper[4733]: E1206 05:44:22.484789 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8fw28" podUID="7e8909c1-5ab7-4c3f-aba1-436c64849e8a" Dec 06 05:44:22 crc kubenswrapper[4733]: E1206 05:44:22.484854 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 05:44:22 crc kubenswrapper[4733]: I1206 05:44:22.508686 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:22 crc kubenswrapper[4733]: I1206 05:44:22.508724 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:22 crc kubenswrapper[4733]: I1206 05:44:22.508733 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:22 crc kubenswrapper[4733]: I1206 05:44:22.508746 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:22 crc kubenswrapper[4733]: I1206 05:44:22.508756 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:22Z","lastTransitionTime":"2025-12-06T05:44:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:22 crc kubenswrapper[4733]: I1206 05:44:22.610854 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:22 crc kubenswrapper[4733]: I1206 05:44:22.610908 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:22 crc kubenswrapper[4733]: I1206 05:44:22.610920 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:22 crc kubenswrapper[4733]: I1206 05:44:22.610931 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:22 crc kubenswrapper[4733]: I1206 05:44:22.610938 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:22Z","lastTransitionTime":"2025-12-06T05:44:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:22 crc kubenswrapper[4733]: I1206 05:44:22.713125 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:22 crc kubenswrapper[4733]: I1206 05:44:22.713165 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:22 crc kubenswrapper[4733]: I1206 05:44:22.713174 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:22 crc kubenswrapper[4733]: I1206 05:44:22.713202 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:22 crc kubenswrapper[4733]: I1206 05:44:22.713210 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:22Z","lastTransitionTime":"2025-12-06T05:44:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:22 crc kubenswrapper[4733]: I1206 05:44:22.814690 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:22 crc kubenswrapper[4733]: I1206 05:44:22.814825 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:22 crc kubenswrapper[4733]: I1206 05:44:22.814891 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:22 crc kubenswrapper[4733]: I1206 05:44:22.814954 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:22 crc kubenswrapper[4733]: I1206 05:44:22.815015 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:22Z","lastTransitionTime":"2025-12-06T05:44:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:22 crc kubenswrapper[4733]: I1206 05:44:22.916809 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:22 crc kubenswrapper[4733]: I1206 05:44:22.916899 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:22 crc kubenswrapper[4733]: I1206 05:44:22.916962 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:22 crc kubenswrapper[4733]: I1206 05:44:22.917013 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:22 crc kubenswrapper[4733]: I1206 05:44:22.917076 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:22Z","lastTransitionTime":"2025-12-06T05:44:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:22 crc kubenswrapper[4733]: I1206 05:44:22.988982 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 06 05:44:22 crc kubenswrapper[4733]: I1206 05:44:22.995493 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 06 05:44:23 crc kubenswrapper[4733]: I1206 05:44:23.000609 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0700e329-54b6-4cfe-b2de-5cee58cf1aa5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32c4d87738481c8df3d76e820a98f3dacfbc11edc26fab1dfe51b56d207168d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57cbb938bc4ae9b8a71a1e2369a50a243964fc8c683d2d1840f1f3e199f1b923\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eeebbb46cf11d2306ad457106c3b2179039986bfdd412c4bb64791d86edb4e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"cont
ainerID\\\":\\\"cri-o://801ea1b9ed221d20f0d729436b8f5f1946df6e66f06aa86db5764f18da3f0b1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe65f4b55b8e8ed93d424276f1fc06f31770302538e5122a5b09da36734d86dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 05:43:48.722254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 05:43:48.730728 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3849141372/tls.crt::/tmp/serving-cert-3849141372/tls.key\\\\\\\"\\\\nI1206 05:43:54.083506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 05:43:54.085960 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 05:43:54.085979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 05:43:54.086001 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 05:43:54.086006 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 05:43:54.089093 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 05:43:54.089162 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089190 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089211 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 05:43:54.089229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 05:43:54.089245 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 05:43:54.089261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 05:43:54.089103 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 05:43:54.090706 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8edc1fd8220a58b6a3f6d08d6d003c6d350fa69588866d84de63f95ecd4367f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContaine
rStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:22Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:23 crc kubenswrapper[4733]: I1206 05:44:23.008896 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:23Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:23 crc kubenswrapper[4733]: I1206 05:44:23.018058 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:23Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:23 crc kubenswrapper[4733]: I1206 05:44:23.018293 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:23 crc kubenswrapper[4733]: I1206 05:44:23.018333 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:23 crc kubenswrapper[4733]: I1206 05:44:23.018342 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:23 crc kubenswrapper[4733]: I1206 
05:44:23.018352 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:23 crc kubenswrapper[4733]: I1206 05:44:23.018360 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:23Z","lastTransitionTime":"2025-12-06T05:44:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:23 crc kubenswrapper[4733]: I1206 05:44:23.026802 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-684r5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc59542d-ee4a-414d-b096-86716cb56db5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7128ab1b2f48b8ce3ecf3a2154cb1b1dc93a58cdfed2c11e7724201a5675ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afb
a93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbfjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\
":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-684r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:23Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:23 crc kubenswrapper[4733]: I1206 05:44:23.033931 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77ebef5bd728c37a6b74ab523c480048959280fdfc9afd8c60b2aca9cd05336d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bde
af3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61a23652af66be599ba9357cb31709e7b4a3f0e4767c758617e6cc5cd9b43941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g7qjx\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:23Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:23 crc kubenswrapper[4733]: I1206 05:44:23.045959 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"171aa174-9338-4421-8393-9e23fbab7f1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a697c5a28f2c415b6f133c1c3bdaff0915418e3fcf0c889af0a822e1bdcbcc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://532faf6ec4021a35746a236a1ded78eccc9d71728c149f73c4263068b6951490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://456b5bd863b30c044246c6c8fe15ee7344ad053861724b5c42b88479578b9adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77216800c2b9bc04724591a5d5c5d4c9ddb9a75fcbc198c60800199a92db6f45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d985f342be7dff38ee8a2264a8dae534857e6cb0e7d0cf79b137d2ed6289bf80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88a99335c4d7fca93428173f7e0e096e418e0599ab030dfda10d8da0a5dc17a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ed6e23b5df7eda4e1271f8c8ff0b9202270a73a4aa074b3625fcbc0114470c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ed6e23b5df7eda4e1271f8c8ff0b9202270a73a4aa074b3625fcbc0114470c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T05:44:16Z\\\",\\\"message\\\":\\\"try/node-ca-pqsfd\\\\nI1206 05:44:16.869839 6419 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1206 05:44:16.869841 6419 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-pqsfd\\\\nI1206 05:44:16.869707 6419 services_controller.go:434] Service 
openshift-machine-config-operator/machine-config-controller retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{machine-config-controller openshift-machine-config-operator aa30290d-3a39-43ba-a212-6439bd680987 4486 0 2025-02-23 05:12:25 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:machine-config-controller] map[include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:mcc-proxy-tls service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0075d864b \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:metrics,Protocol:TCP,Port:9001,TargetPort:{0 9001 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{k8s-app: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:44:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2gb79_openshift-ovn-kubernetes(171aa174-9338-4421-8393-9e23fbab7f1e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9980ec9b2b1a751a691d1f657a2176d49a7583906d741adbe3754ec4c73b152c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://667a77abe2226e4438
47d05fc2475438095a30648777a424239d3f35c199b236\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2gb79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:23Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:23 crc kubenswrapper[4733]: I1206 05:44:23.058421 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c145932d-56db-49da-ab40-1f9faeaa004e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a89503b511d9f2da9fb5e41e1adb5f5c60e14909aebd4495baafc709177fa56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd2bcad3ce23a8998a578ecc373a4e8028eefab1e056cf1081eb2406ff9398f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382d71a067b68d67891c063f0a4c833b7433e15db0e05b36e46f24bbbb1626ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b838411bb65919138a421cd17775561b7764a006894daa8f2bed711287c1914\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:23Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:23 crc kubenswrapper[4733]: I1206 05:44:23.069162 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:23Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:23 crc kubenswrapper[4733]: I1206 05:44:23.089585 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5mf9m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d7ccbf-e88d-4045-8d89-633470de7aca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dffbae27a10ae2e00933637da0e30fc5b8574f2ee8edb5b4b09c37a2d05e980a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2609f7ad60b4f90d844d4f4d8573587826cbdf4c0b76f6b8a1b5cddec86ad7d0\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2609f7ad60b4f90d844d4f4d8573587826cbdf4c0b76f6b8a1b5cddec86ad7d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ef7c618da4d94a4956f082f96b9be994042458ff524e9e1172f526a4135e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93ef7c618da4d94a4956f082f96b9be994042458ff524e9e1172f526a4135e1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c91a8199b1f8ede480f2bd92335fe3c8dc0d0e11caa2cf3bd213c234d0779f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c91a8199b1f8ede480f2bd92335fe3c8dc0d0e11caa2cf3bd213c234d0779f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://047dc
4e7f8f30d1f9cf824ee4059c99c07cd9f29bd985e0e00ac22febb297f1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://047dc4e7f8f30d1f9cf824ee4059c99c07cd9f29bd985e0e00ac22febb297f1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cadff80e27f4e0103110c153c52936b931bfd70ca4363a3caa44ec4f746d01dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadff80e27f4e0103110c153c52936b931bfd70ca4363a3caa44ec4f746d01dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:59Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e4034c91b0b19898468eccdc22e059ad7e830ef9e4ff0bea88d447f6a09c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16e4034c91b0b19898468eccdc22e059ad7e830ef9e4ff0bea88d447f6a09c64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5mf9m\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:23Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:23 crc kubenswrapper[4733]: I1206 05:44:23.101468 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q2ktk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e24a9e84-0151-4204-9391-510da9049b58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aae69842996fcf4d62a14e1cc73b68f2326287d0fa75d4587acb47862b1d40bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh6vd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a1c3268a5ca5c4c35865c8ff8f700686db8f5c2889152aabe27a36b1ccd9082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh6vd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:44:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q2ktk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-06T05:44:23Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:23 crc kubenswrapper[4733]: I1206 05:44:23.113389 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77d63bf154094eece4d04d42186bc7f957f0b1ab0315c496bb8a785269184ed0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdccf2a58baf2a39276908ed60c86219657d8780a50630c20be6
f8bc4c256fbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:23Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:23 crc kubenswrapper[4733]: I1206 05:44:23.120598 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:23 crc kubenswrapper[4733]: I1206 05:44:23.120630 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:23 crc kubenswrapper[4733]: I1206 05:44:23.120639 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:23 crc kubenswrapper[4733]: I1206 05:44:23.120653 4733 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Dec 06 05:44:23 crc kubenswrapper[4733]: I1206 05:44:23.120661 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:23Z","lastTransitionTime":"2025-12-06T05:44:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:23 crc kubenswrapper[4733]: I1206 05:44:23.123079 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faa17b3f3dd91488b73e0e7f3101c5e9932dd0c1573946bbd91819f1ec51202e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T
05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:23Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:23 crc kubenswrapper[4733]: I1206 05:44:23.131022 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqsfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25abcf60-fe34-446b-9df8-1ed8e5102975\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://163c90ba7e6470fb31049cd650d1384d35d87b94a9193184bfe3ea16feddf307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb5ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqsfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:23Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:23 crc kubenswrapper[4733]: I1206 05:44:23.138878 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnxdh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d5c4ca7-33ee-4858-948f-631753eb056e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f4a50e7cb4197e088c193a3bedc8acb2720a885e588e56051fbfa1e102099e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrbr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnxdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:23Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:23 crc kubenswrapper[4733]: I1206 05:44:23.149042 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8e8d7d591deb47598776511be462724fabc5543e82b6a74edfc29fb01ccb977\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:23Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:23 crc kubenswrapper[4733]: I1206 05:44:23.156732 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8fw28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e8909c1-5ab7-4c3f-aba1-436c64849e8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47p7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47p7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:44:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8fw28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:23Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:23 crc 
kubenswrapper[4733]: I1206 05:44:23.222974 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:23 crc kubenswrapper[4733]: I1206 05:44:23.223001 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:23 crc kubenswrapper[4733]: I1206 05:44:23.223010 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:23 crc kubenswrapper[4733]: I1206 05:44:23.223022 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:23 crc kubenswrapper[4733]: I1206 05:44:23.223030 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:23Z","lastTransitionTime":"2025-12-06T05:44:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:23 crc kubenswrapper[4733]: I1206 05:44:23.325166 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:23 crc kubenswrapper[4733]: I1206 05:44:23.325195 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:23 crc kubenswrapper[4733]: I1206 05:44:23.325202 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:23 crc kubenswrapper[4733]: I1206 05:44:23.325212 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:23 crc kubenswrapper[4733]: I1206 05:44:23.325220 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:23Z","lastTransitionTime":"2025-12-06T05:44:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:23 crc kubenswrapper[4733]: I1206 05:44:23.427212 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:23 crc kubenswrapper[4733]: I1206 05:44:23.427246 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:23 crc kubenswrapper[4733]: I1206 05:44:23.427261 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:23 crc kubenswrapper[4733]: I1206 05:44:23.427279 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:23 crc kubenswrapper[4733]: I1206 05:44:23.427292 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:23Z","lastTransitionTime":"2025-12-06T05:44:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:23 crc kubenswrapper[4733]: I1206 05:44:23.528577 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:23 crc kubenswrapper[4733]: I1206 05:44:23.528612 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:23 crc kubenswrapper[4733]: I1206 05:44:23.528621 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:23 crc kubenswrapper[4733]: I1206 05:44:23.528636 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:23 crc kubenswrapper[4733]: I1206 05:44:23.528647 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:23Z","lastTransitionTime":"2025-12-06T05:44:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:23 crc kubenswrapper[4733]: I1206 05:44:23.630925 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:23 crc kubenswrapper[4733]: I1206 05:44:23.630960 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:23 crc kubenswrapper[4733]: I1206 05:44:23.630969 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:23 crc kubenswrapper[4733]: I1206 05:44:23.630981 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:23 crc kubenswrapper[4733]: I1206 05:44:23.630990 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:23Z","lastTransitionTime":"2025-12-06T05:44:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:23 crc kubenswrapper[4733]: I1206 05:44:23.733072 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:23 crc kubenswrapper[4733]: I1206 05:44:23.733103 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:23 crc kubenswrapper[4733]: I1206 05:44:23.733111 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:23 crc kubenswrapper[4733]: I1206 05:44:23.733121 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:23 crc kubenswrapper[4733]: I1206 05:44:23.733129 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:23Z","lastTransitionTime":"2025-12-06T05:44:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:23 crc kubenswrapper[4733]: I1206 05:44:23.835422 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:23 crc kubenswrapper[4733]: I1206 05:44:23.835455 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:23 crc kubenswrapper[4733]: I1206 05:44:23.835463 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:23 crc kubenswrapper[4733]: I1206 05:44:23.835476 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:23 crc kubenswrapper[4733]: I1206 05:44:23.835484 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:23Z","lastTransitionTime":"2025-12-06T05:44:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:23 crc kubenswrapper[4733]: I1206 05:44:23.937523 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:23 crc kubenswrapper[4733]: I1206 05:44:23.937553 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:23 crc kubenswrapper[4733]: I1206 05:44:23.937562 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:23 crc kubenswrapper[4733]: I1206 05:44:23.937574 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:23 crc kubenswrapper[4733]: I1206 05:44:23.937583 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:23Z","lastTransitionTime":"2025-12-06T05:44:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:24 crc kubenswrapper[4733]: I1206 05:44:24.040066 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:24 crc kubenswrapper[4733]: I1206 05:44:24.040101 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:24 crc kubenswrapper[4733]: I1206 05:44:24.040111 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:24 crc kubenswrapper[4733]: I1206 05:44:24.040124 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:24 crc kubenswrapper[4733]: I1206 05:44:24.040133 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:24Z","lastTransitionTime":"2025-12-06T05:44:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:24 crc kubenswrapper[4733]: I1206 05:44:24.141579 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:24 crc kubenswrapper[4733]: I1206 05:44:24.141597 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:24 crc kubenswrapper[4733]: I1206 05:44:24.141604 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:24 crc kubenswrapper[4733]: I1206 05:44:24.141614 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:24 crc kubenswrapper[4733]: I1206 05:44:24.141622 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:24Z","lastTransitionTime":"2025-12-06T05:44:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:24 crc kubenswrapper[4733]: I1206 05:44:24.243377 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:24 crc kubenswrapper[4733]: I1206 05:44:24.243454 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:24 crc kubenswrapper[4733]: I1206 05:44:24.243463 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:24 crc kubenswrapper[4733]: I1206 05:44:24.243476 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:24 crc kubenswrapper[4733]: I1206 05:44:24.243486 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:24Z","lastTransitionTime":"2025-12-06T05:44:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:24 crc kubenswrapper[4733]: I1206 05:44:24.345716 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:24 crc kubenswrapper[4733]: I1206 05:44:24.345755 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:24 crc kubenswrapper[4733]: I1206 05:44:24.345764 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:24 crc kubenswrapper[4733]: I1206 05:44:24.345777 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:24 crc kubenswrapper[4733]: I1206 05:44:24.345786 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:24Z","lastTransitionTime":"2025-12-06T05:44:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:24 crc kubenswrapper[4733]: I1206 05:44:24.447231 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:24 crc kubenswrapper[4733]: I1206 05:44:24.447261 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:24 crc kubenswrapper[4733]: I1206 05:44:24.447270 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:24 crc kubenswrapper[4733]: I1206 05:44:24.447346 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:24 crc kubenswrapper[4733]: I1206 05:44:24.447359 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:24Z","lastTransitionTime":"2025-12-06T05:44:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:24 crc kubenswrapper[4733]: I1206 05:44:24.454798 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7e8909c1-5ab7-4c3f-aba1-436c64849e8a-metrics-certs\") pod \"network-metrics-daemon-8fw28\" (UID: \"7e8909c1-5ab7-4c3f-aba1-436c64849e8a\") " pod="openshift-multus/network-metrics-daemon-8fw28" Dec 06 05:44:24 crc kubenswrapper[4733]: E1206 05:44:24.454890 4733 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 05:44:24 crc kubenswrapper[4733]: E1206 05:44:24.454941 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e8909c1-5ab7-4c3f-aba1-436c64849e8a-metrics-certs podName:7e8909c1-5ab7-4c3f-aba1-436c64849e8a nodeName:}" failed. No retries permitted until 2025-12-06 05:44:40.454928762 +0000 UTC m=+64.320139873 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7e8909c1-5ab7-4c3f-aba1-436c64849e8a-metrics-certs") pod "network-metrics-daemon-8fw28" (UID: "7e8909c1-5ab7-4c3f-aba1-436c64849e8a") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 05:44:24 crc kubenswrapper[4733]: I1206 05:44:24.484495 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:44:24 crc kubenswrapper[4733]: E1206 05:44:24.484595 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 05:44:24 crc kubenswrapper[4733]: I1206 05:44:24.484756 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8fw28" Dec 06 05:44:24 crc kubenswrapper[4733]: E1206 05:44:24.484875 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8fw28" podUID="7e8909c1-5ab7-4c3f-aba1-436c64849e8a" Dec 06 05:44:24 crc kubenswrapper[4733]: I1206 05:44:24.484917 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:44:24 crc kubenswrapper[4733]: I1206 05:44:24.485089 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 05:44:24 crc kubenswrapper[4733]: E1206 05:44:24.485279 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 05:44:24 crc kubenswrapper[4733]: E1206 05:44:24.485394 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 05:44:24 crc kubenswrapper[4733]: I1206 05:44:24.549096 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:24 crc kubenswrapper[4733]: I1206 05:44:24.549146 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:24 crc kubenswrapper[4733]: I1206 05:44:24.549154 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:24 crc kubenswrapper[4733]: I1206 05:44:24.549165 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:24 crc kubenswrapper[4733]: I1206 05:44:24.549173 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:24Z","lastTransitionTime":"2025-12-06T05:44:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:24 crc kubenswrapper[4733]: I1206 05:44:24.650340 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:24 crc kubenswrapper[4733]: I1206 05:44:24.650378 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:24 crc kubenswrapper[4733]: I1206 05:44:24.650390 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:24 crc kubenswrapper[4733]: I1206 05:44:24.650415 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:24 crc kubenswrapper[4733]: I1206 05:44:24.650425 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:24Z","lastTransitionTime":"2025-12-06T05:44:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:24 crc kubenswrapper[4733]: I1206 05:44:24.751384 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:24 crc kubenswrapper[4733]: I1206 05:44:24.751426 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:24 crc kubenswrapper[4733]: I1206 05:44:24.751440 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:24 crc kubenswrapper[4733]: I1206 05:44:24.751452 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:24 crc kubenswrapper[4733]: I1206 05:44:24.751463 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:24Z","lastTransitionTime":"2025-12-06T05:44:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:24 crc kubenswrapper[4733]: I1206 05:44:24.853436 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:24 crc kubenswrapper[4733]: I1206 05:44:24.853568 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:24 crc kubenswrapper[4733]: I1206 05:44:24.853632 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:24 crc kubenswrapper[4733]: I1206 05:44:24.853715 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:24 crc kubenswrapper[4733]: I1206 05:44:24.853776 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:24Z","lastTransitionTime":"2025-12-06T05:44:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:24 crc kubenswrapper[4733]: I1206 05:44:24.955876 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:24 crc kubenswrapper[4733]: I1206 05:44:24.955907 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:24 crc kubenswrapper[4733]: I1206 05:44:24.955917 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:24 crc kubenswrapper[4733]: I1206 05:44:24.955942 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:24 crc kubenswrapper[4733]: I1206 05:44:24.955951 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:24Z","lastTransitionTime":"2025-12-06T05:44:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:25 crc kubenswrapper[4733]: I1206 05:44:25.058537 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:25 crc kubenswrapper[4733]: I1206 05:44:25.058567 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:25 crc kubenswrapper[4733]: I1206 05:44:25.058580 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:25 crc kubenswrapper[4733]: I1206 05:44:25.058592 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:25 crc kubenswrapper[4733]: I1206 05:44:25.058602 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:25Z","lastTransitionTime":"2025-12-06T05:44:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:25 crc kubenswrapper[4733]: I1206 05:44:25.160506 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:25 crc kubenswrapper[4733]: I1206 05:44:25.160538 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:25 crc kubenswrapper[4733]: I1206 05:44:25.160548 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:25 crc kubenswrapper[4733]: I1206 05:44:25.160562 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:25 crc kubenswrapper[4733]: I1206 05:44:25.160572 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:25Z","lastTransitionTime":"2025-12-06T05:44:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:25 crc kubenswrapper[4733]: I1206 05:44:25.262145 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:25 crc kubenswrapper[4733]: I1206 05:44:25.262174 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:25 crc kubenswrapper[4733]: I1206 05:44:25.262184 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:25 crc kubenswrapper[4733]: I1206 05:44:25.262196 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:25 crc kubenswrapper[4733]: I1206 05:44:25.262203 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:25Z","lastTransitionTime":"2025-12-06T05:44:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:25 crc kubenswrapper[4733]: I1206 05:44:25.364045 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:25 crc kubenswrapper[4733]: I1206 05:44:25.364071 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:25 crc kubenswrapper[4733]: I1206 05:44:25.364079 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:25 crc kubenswrapper[4733]: I1206 05:44:25.364090 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:25 crc kubenswrapper[4733]: I1206 05:44:25.364099 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:25Z","lastTransitionTime":"2025-12-06T05:44:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:25 crc kubenswrapper[4733]: I1206 05:44:25.466058 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:25 crc kubenswrapper[4733]: I1206 05:44:25.466087 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:25 crc kubenswrapper[4733]: I1206 05:44:25.466096 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:25 crc kubenswrapper[4733]: I1206 05:44:25.466105 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:25 crc kubenswrapper[4733]: I1206 05:44:25.466113 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:25Z","lastTransitionTime":"2025-12-06T05:44:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:25 crc kubenswrapper[4733]: I1206 05:44:25.568086 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:25 crc kubenswrapper[4733]: I1206 05:44:25.568117 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:25 crc kubenswrapper[4733]: I1206 05:44:25.568127 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:25 crc kubenswrapper[4733]: I1206 05:44:25.568157 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:25 crc kubenswrapper[4733]: I1206 05:44:25.568165 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:25Z","lastTransitionTime":"2025-12-06T05:44:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:25 crc kubenswrapper[4733]: I1206 05:44:25.669610 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:25 crc kubenswrapper[4733]: I1206 05:44:25.669640 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:25 crc kubenswrapper[4733]: I1206 05:44:25.669649 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:25 crc kubenswrapper[4733]: I1206 05:44:25.669662 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:25 crc kubenswrapper[4733]: I1206 05:44:25.669671 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:25Z","lastTransitionTime":"2025-12-06T05:44:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:25 crc kubenswrapper[4733]: I1206 05:44:25.771033 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:25 crc kubenswrapper[4733]: I1206 05:44:25.771062 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:25 crc kubenswrapper[4733]: I1206 05:44:25.771071 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:25 crc kubenswrapper[4733]: I1206 05:44:25.771081 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:25 crc kubenswrapper[4733]: I1206 05:44:25.771088 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:25Z","lastTransitionTime":"2025-12-06T05:44:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:25 crc kubenswrapper[4733]: I1206 05:44:25.872319 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:25 crc kubenswrapper[4733]: I1206 05:44:25.872354 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:25 crc kubenswrapper[4733]: I1206 05:44:25.872363 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:25 crc kubenswrapper[4733]: I1206 05:44:25.872376 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:25 crc kubenswrapper[4733]: I1206 05:44:25.872387 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:25Z","lastTransitionTime":"2025-12-06T05:44:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:25 crc kubenswrapper[4733]: I1206 05:44:25.974721 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:25 crc kubenswrapper[4733]: I1206 05:44:25.974753 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:25 crc kubenswrapper[4733]: I1206 05:44:25.974762 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:25 crc kubenswrapper[4733]: I1206 05:44:25.974783 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:25 crc kubenswrapper[4733]: I1206 05:44:25.974792 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:25Z","lastTransitionTime":"2025-12-06T05:44:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.076591 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.076616 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.076625 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.076636 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.076645 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:26Z","lastTransitionTime":"2025-12-06T05:44:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.179133 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.179159 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.179167 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.179179 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.179189 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:26Z","lastTransitionTime":"2025-12-06T05:44:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.268863 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.268939 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:44:26 crc kubenswrapper[4733]: E1206 05:44:26.269000 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 05:44:58.268981876 +0000 UTC m=+82.134192987 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:44:26 crc kubenswrapper[4733]: E1206 05:44:26.269048 4733 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 05:44:26 crc kubenswrapper[4733]: E1206 05:44:26.269091 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 05:44:58.269080391 +0000 UTC m=+82.134291492 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 05:44:26 crc kubenswrapper[4733]: E1206 05:44:26.269093 4733 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 05:44:26 crc kubenswrapper[4733]: E1206 05:44:26.269124 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-12-06 05:44:58.269116399 +0000 UTC m=+82.134327510 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.269047 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.281446 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.281476 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.281484 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.281495 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.281506 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:26Z","lastTransitionTime":"2025-12-06T05:44:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.370271 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.370324 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 05:44:26 crc kubenswrapper[4733]: E1206 05:44:26.370431 4733 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 05:44:26 crc kubenswrapper[4733]: E1206 05:44:26.370450 4733 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 05:44:26 crc kubenswrapper[4733]: E1206 05:44:26.370461 4733 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 05:44:26 crc kubenswrapper[4733]: E1206 05:44:26.370482 4733 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Dec 06 05:44:26 crc kubenswrapper[4733]: E1206 05:44:26.370517 4733 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 05:44:26 crc kubenswrapper[4733]: E1206 05:44:26.370535 4733 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 05:44:26 crc kubenswrapper[4733]: E1206 05:44:26.370498 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-06 05:44:58.370487903 +0000 UTC m=+82.235699013 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 05:44:26 crc kubenswrapper[4733]: E1206 05:44:26.370604 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-06 05:44:58.370587299 +0000 UTC m=+82.235798430 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.383582 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.383612 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.383620 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.383632 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.383640 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:26Z","lastTransitionTime":"2025-12-06T05:44:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.484152 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8fw28" Dec 06 05:44:26 crc kubenswrapper[4733]: E1206 05:44:26.484241 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8fw28" podUID="7e8909c1-5ab7-4c3f-aba1-436c64849e8a" Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.484370 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:44:26 crc kubenswrapper[4733]: E1206 05:44:26.484438 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.484487 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:44:26 crc kubenswrapper[4733]: E1206 05:44:26.484532 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.484612 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 05:44:26 crc kubenswrapper[4733]: E1206 05:44:26.484751 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.485694 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.485742 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.485752 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.486158 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.486194 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:26Z","lastTransitionTime":"2025-12-06T05:44:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.495910 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:26Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.505272 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-684r5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc59542d-ee4a-414d-b096-86716cb56db5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7128ab1b2f48b8ce3ecf3a2154cb1b1dc93a58cdfed2c11e7724201a5675ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbfjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-684r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:26Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.513914 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77ebef5bd728c37a6b74ab523c480048959280fdfc9afd8c60b2aca9cd05336d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61a23652af66
be599ba9357cb31709e7b4a3f0e4767c758617e6cc5cd9b43941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g7qjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:26Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.527821 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"171aa174-9338-4421-8393-9e23fbab7f1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a697c5a28f2c415b6f133c1c3bdaff0915418e3fcf0c889af0a822e1bdcbcc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://532faf6ec4021a35746a236a1ded78eccc9d71728c149f73c4263068b6951490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://456b5bd863b30c044246c6c8fe15ee7344ad053861724b5c42b88479578b9adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77216800c2b9bc04724591a5d5c5d4c9ddb9a75fcbc198c60800199a92db6f45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d985f342be7dff38ee8a2264a8dae534857e6cb0e7d0cf79b137d2ed6289bf80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88a99335c4d7fca93428173f7e0e096e418e0599ab030dfda10d8da0a5dc17a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ed6e23b5df7eda4e1271f8c8ff0b9202270a73a4aa074b3625fcbc0114470c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ed6e23b5df7eda4e1271f8c8ff0b9202270a73a4aa074b3625fcbc0114470c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T05:44:16Z\\\",\\\"message\\\":\\\"try/node-ca-pqsfd\\\\nI1206 05:44:16.869839 6419 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1206 05:44:16.869841 6419 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-pqsfd\\\\nI1206 05:44:16.869707 6419 services_controller.go:434] Service 
openshift-machine-config-operator/machine-config-controller retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{machine-config-controller openshift-machine-config-operator aa30290d-3a39-43ba-a212-6439bd680987 4486 0 2025-02-23 05:12:25 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:machine-config-controller] map[include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:mcc-proxy-tls service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0075d864b \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:metrics,Protocol:TCP,Port:9001,TargetPort:{0 9001 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{k8s-app: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:44:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2gb79_openshift-ovn-kubernetes(171aa174-9338-4421-8393-9e23fbab7f1e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9980ec9b2b1a751a691d1f657a2176d49a7583906d741adbe3754ec4c73b152c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://667a77abe2226e4438
47d05fc2475438095a30648777a424239d3f35c199b236\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2gb79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:26Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.537161 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c145932d-56db-49da-ab40-1f9faeaa004e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a89503b511d9f2da9fb5e41e1adb5f5c60e14909aebd4495baafc709177fa56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd2bcad3ce23a8998a578ecc373a4e8028eefab1e056cf1081eb2406ff9398f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382d71a067b68d67891c063f0a4c833b7433e15db0e05b36e46f24bbbb1626ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b838411bb65919138a421cd17775561b7764a006894daa8f2bed711287c1914\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:26Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.546576 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:26Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.558658 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5mf9m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d7ccbf-e88d-4045-8d89-633470de7aca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dffbae27a10ae2e00933637da0e30fc5b8574f2ee8edb5b4b09c37a2d05e980a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2609f7ad60b4f90d844d4f4d8573587826cbdf4c0b76f6b8a1b5cddec86ad7d0\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2609f7ad60b4f90d844d4f4d8573587826cbdf4c0b76f6b8a1b5cddec86ad7d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ef7c618da4d94a4956f082f96b9be994042458ff524e9e1172f526a4135e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93ef7c618da4d94a4956f082f96b9be994042458ff524e9e1172f526a4135e1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c91a8199b1f8ede480f2bd92335fe3c8dc0d0e11caa2cf3bd213c234d0779f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c91a8199b1f8ede480f2bd92335fe3c8dc0d0e11caa2cf3bd213c234d0779f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://047dc
4e7f8f30d1f9cf824ee4059c99c07cd9f29bd985e0e00ac22febb297f1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://047dc4e7f8f30d1f9cf824ee4059c99c07cd9f29bd985e0e00ac22febb297f1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cadff80e27f4e0103110c153c52936b931bfd70ca4363a3caa44ec4f746d01dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadff80e27f4e0103110c153c52936b931bfd70ca4363a3caa44ec4f746d01dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:59Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e4034c91b0b19898468eccdc22e059ad7e830ef9e4ff0bea88d447f6a09c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16e4034c91b0b19898468eccdc22e059ad7e830ef9e4ff0bea88d447f6a09c64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5mf9m\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:26Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.566246 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q2ktk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e24a9e84-0151-4204-9391-510da9049b58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aae69842996fcf4d62a14e1cc73b68f2326287d0fa75d4587acb47862b1d40bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh6vd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a1c3268a5ca5c4c35865c8ff8f700686db8f5c2889152aabe27a36b1ccd9082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh6vd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:44:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q2ktk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-06T05:44:26Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.574239 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fedfcd9f-fa28-4efb-9677-e24a6dae9c04\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7a98fd30a5052ebe2872dd5e1c7f44e9ed9019ad8662a687a9a9a39acce3627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e214c308f
89a818305483c9dc2980b09c41c963bd5df5c91d56a1f8e47dd8ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1a658a854294c1c7b43ab8c1bd56969065a6c630a68b2c39366fd243ebd7af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7acc4267cfa0a489d59bdc4c37f12356e6a053e6cd477a87a38816bf71539ce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7acc4267cfa0a489d59bdc4c37f12356e6a053e6cd477a87a38816bf71539ce1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:26Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.582601 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8e8d7d591deb47598776511be462724fabc5543e82b6a74edfc29fb01ccb977\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:26Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.588594 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.588633 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.588645 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.588657 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.588666 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:26Z","lastTransitionTime":"2025-12-06T05:44:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.594991 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77d63bf154094eece4d04d42186bc7f957f0b1ab0315c496bb8a785269184ed0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdccf2a58baf2a39276908ed60c86219657d8780a50630c20be6f8bc4c256fbb\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:26Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.602718 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faa17b3f3dd91488b73e0e7f3101c5e9932dd0c1573946bbd91819f1ec51202e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T05:44:26Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.609484 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqsfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25abcf60-fe34-446b-9df8-1ed8e5102975\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://163c90ba7e6470fb31049cd650d1384d35d87b94a9193184bfe3ea16feddf307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb5ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqsfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:26Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.616225 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnxdh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d5c4ca7-33ee-4858-948f-631753eb056e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f4a50e7cb4197e088c193a3bedc8acb2720a885e588e56051fbfa1e102099e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrbr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnxdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:26Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.622894 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8fw28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e8909c1-5ab7-4c3f-aba1-436c64849e8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47p7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47p7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:44:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8fw28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:26Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:26 crc 
kubenswrapper[4733]: I1206 05:44:26.631596 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0700e329-54b6-4cfe-b2de-5cee58cf1aa5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32c4d87738481c8df3d76e820a98f3dacfbc11edc26fab1dfe51b56d207168d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57cbb938bc4ae9
b8a71a1e2369a50a243964fc8c683d2d1840f1f3e199f1b923\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eeebbb46cf11d2306ad457106c3b2179039986bfdd412c4bb64791d86edb4e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://801ea1b9ed221d20f0d729436b8f5f1946df6e66f06aa86db5764f18da3f0b1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://fe65f4b55b8e8ed93d424276f1fc06f31770302538e5122a5b09da36734d86dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 05:43:48.722254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 05:43:48.730728 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3849141372/tls.crt::/tmp/serving-cert-3849141372/tls.key\\\\\\\"\\\\nI1206 05:43:54.083506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 05:43:54.085960 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 05:43:54.085979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 05:43:54.086001 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 05:43:54.086006 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 05:43:54.089093 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 05:43:54.089162 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089190 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089211 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 05:43:54.089229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 05:43:54.089245 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 
05:43:54.089261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 05:43:54.089103 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 05:43:54.090706 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8edc1fd8220a58b6a3f6d08d6d003c6d350fa69588866d84de63f95ecd4367f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:26Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.639707 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:26Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.689916 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.689943 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.689962 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:26 crc 
kubenswrapper[4733]: I1206 05:44:26.689983 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.690001 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:26Z","lastTransitionTime":"2025-12-06T05:44:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.792240 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.792297 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.792322 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.792335 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.792345 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:26Z","lastTransitionTime":"2025-12-06T05:44:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.893950 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.893983 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.893993 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.894005 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.894013 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:26Z","lastTransitionTime":"2025-12-06T05:44:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.996207 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.996245 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.996255 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.996270 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:26 crc kubenswrapper[4733]: I1206 05:44:26.996280 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:26Z","lastTransitionTime":"2025-12-06T05:44:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:27 crc kubenswrapper[4733]: I1206 05:44:27.098381 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:27 crc kubenswrapper[4733]: I1206 05:44:27.098423 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:27 crc kubenswrapper[4733]: I1206 05:44:27.098452 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:27 crc kubenswrapper[4733]: I1206 05:44:27.098466 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:27 crc kubenswrapper[4733]: I1206 05:44:27.098475 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:27Z","lastTransitionTime":"2025-12-06T05:44:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:27 crc kubenswrapper[4733]: I1206 05:44:27.200052 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:27 crc kubenswrapper[4733]: I1206 05:44:27.200091 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:27 crc kubenswrapper[4733]: I1206 05:44:27.200102 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:27 crc kubenswrapper[4733]: I1206 05:44:27.200116 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:27 crc kubenswrapper[4733]: I1206 05:44:27.200126 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:27Z","lastTransitionTime":"2025-12-06T05:44:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:27 crc kubenswrapper[4733]: I1206 05:44:27.301941 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:27 crc kubenswrapper[4733]: I1206 05:44:27.302000 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:27 crc kubenswrapper[4733]: I1206 05:44:27.302009 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:27 crc kubenswrapper[4733]: I1206 05:44:27.302031 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:27 crc kubenswrapper[4733]: I1206 05:44:27.302045 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:27Z","lastTransitionTime":"2025-12-06T05:44:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:27 crc kubenswrapper[4733]: I1206 05:44:27.404154 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:27 crc kubenswrapper[4733]: I1206 05:44:27.404189 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:27 crc kubenswrapper[4733]: I1206 05:44:27.404200 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:27 crc kubenswrapper[4733]: I1206 05:44:27.404214 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:27 crc kubenswrapper[4733]: I1206 05:44:27.404227 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:27Z","lastTransitionTime":"2025-12-06T05:44:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:27 crc kubenswrapper[4733]: I1206 05:44:27.506254 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:27 crc kubenswrapper[4733]: I1206 05:44:27.506323 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:27 crc kubenswrapper[4733]: I1206 05:44:27.506336 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:27 crc kubenswrapper[4733]: I1206 05:44:27.506351 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:27 crc kubenswrapper[4733]: I1206 05:44:27.506361 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:27Z","lastTransitionTime":"2025-12-06T05:44:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:27 crc kubenswrapper[4733]: I1206 05:44:27.607593 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:27 crc kubenswrapper[4733]: I1206 05:44:27.607639 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:27 crc kubenswrapper[4733]: I1206 05:44:27.607649 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:27 crc kubenswrapper[4733]: I1206 05:44:27.607660 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:27 crc kubenswrapper[4733]: I1206 05:44:27.607669 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:27Z","lastTransitionTime":"2025-12-06T05:44:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:27 crc kubenswrapper[4733]: I1206 05:44:27.709625 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:27 crc kubenswrapper[4733]: I1206 05:44:27.709662 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:27 crc kubenswrapper[4733]: I1206 05:44:27.709674 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:27 crc kubenswrapper[4733]: I1206 05:44:27.709687 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:27 crc kubenswrapper[4733]: I1206 05:44:27.709697 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:27Z","lastTransitionTime":"2025-12-06T05:44:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:27 crc kubenswrapper[4733]: I1206 05:44:27.811108 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:27 crc kubenswrapper[4733]: I1206 05:44:27.811150 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:27 crc kubenswrapper[4733]: I1206 05:44:27.811160 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:27 crc kubenswrapper[4733]: I1206 05:44:27.811174 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:27 crc kubenswrapper[4733]: I1206 05:44:27.811183 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:27Z","lastTransitionTime":"2025-12-06T05:44:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:27 crc kubenswrapper[4733]: I1206 05:44:27.913412 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:27 crc kubenswrapper[4733]: I1206 05:44:27.913435 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:27 crc kubenswrapper[4733]: I1206 05:44:27.913443 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:27 crc kubenswrapper[4733]: I1206 05:44:27.913471 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:27 crc kubenswrapper[4733]: I1206 05:44:27.913479 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:27Z","lastTransitionTime":"2025-12-06T05:44:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:27 crc kubenswrapper[4733]: I1206 05:44:27.953269 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:27 crc kubenswrapper[4733]: I1206 05:44:27.953357 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:27 crc kubenswrapper[4733]: I1206 05:44:27.953374 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:27 crc kubenswrapper[4733]: I1206 05:44:27.953404 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:27 crc kubenswrapper[4733]: I1206 05:44:27.953419 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:27Z","lastTransitionTime":"2025-12-06T05:44:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:27 crc kubenswrapper[4733]: E1206 05:44:27.963472 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6951a1f4-5aff-463d-98ee-6da28494341b\\\",\\\"systemUUID\\\":\\\"4b0d62b0-e895-479e-b261-2bd12b349187\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:27Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:27 crc kubenswrapper[4733]: I1206 05:44:27.967706 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:27 crc kubenswrapper[4733]: I1206 05:44:27.967746 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:27 crc kubenswrapper[4733]: I1206 05:44:27.967758 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:27 crc kubenswrapper[4733]: I1206 05:44:27.967774 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:27 crc kubenswrapper[4733]: I1206 05:44:27.967788 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:27Z","lastTransitionTime":"2025-12-06T05:44:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:27 crc kubenswrapper[4733]: E1206 05:44:27.977952 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6951a1f4-5aff-463d-98ee-6da28494341b\\\",\\\"systemUUID\\\":\\\"4b0d62b0-e895-479e-b261-2bd12b349187\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:27Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:27 crc kubenswrapper[4733]: I1206 05:44:27.980404 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:27 crc kubenswrapper[4733]: I1206 05:44:27.980487 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:27 crc kubenswrapper[4733]: I1206 05:44:27.980558 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:27 crc kubenswrapper[4733]: I1206 05:44:27.980614 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:27 crc kubenswrapper[4733]: I1206 05:44:27.980662 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:27Z","lastTransitionTime":"2025-12-06T05:44:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:27 crc kubenswrapper[4733]: E1206 05:44:27.989421 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6951a1f4-5aff-463d-98ee-6da28494341b\\\",\\\"systemUUID\\\":\\\"4b0d62b0-e895-479e-b261-2bd12b349187\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:27Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:27 crc kubenswrapper[4733]: I1206 05:44:27.992269 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:27 crc kubenswrapper[4733]: I1206 05:44:27.992373 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:27 crc kubenswrapper[4733]: I1206 05:44:27.992472 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:27 crc kubenswrapper[4733]: I1206 05:44:27.992532 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:27 crc kubenswrapper[4733]: I1206 05:44:27.992599 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:27Z","lastTransitionTime":"2025-12-06T05:44:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:28 crc kubenswrapper[4733]: E1206 05:44:28.001233 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6951a1f4-5aff-463d-98ee-6da28494341b\\\",\\\"systemUUID\\\":\\\"4b0d62b0-e895-479e-b261-2bd12b349187\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:27Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:28 crc kubenswrapper[4733]: I1206 05:44:28.003571 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:28 crc kubenswrapper[4733]: I1206 05:44:28.003647 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:28 crc kubenswrapper[4733]: I1206 05:44:28.003699 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:28 crc kubenswrapper[4733]: I1206 05:44:28.003761 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:28 crc kubenswrapper[4733]: I1206 05:44:28.003812 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:28Z","lastTransitionTime":"2025-12-06T05:44:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:28 crc kubenswrapper[4733]: E1206 05:44:28.011587 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6951a1f4-5aff-463d-98ee-6da28494341b\\\",\\\"systemUUID\\\":\\\"4b0d62b0-e895-479e-b261-2bd12b349187\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:28Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:28 crc kubenswrapper[4733]: E1206 05:44:28.011695 4733 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 06 05:44:28 crc kubenswrapper[4733]: I1206 05:44:28.015530 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:28 crc kubenswrapper[4733]: I1206 05:44:28.015555 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:28 crc kubenswrapper[4733]: I1206 05:44:28.015564 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:28 crc kubenswrapper[4733]: I1206 05:44:28.015576 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:28 crc kubenswrapper[4733]: I1206 05:44:28.015585 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:28Z","lastTransitionTime":"2025-12-06T05:44:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:28 crc kubenswrapper[4733]: I1206 05:44:28.117157 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:28 crc kubenswrapper[4733]: I1206 05:44:28.117276 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:28 crc kubenswrapper[4733]: I1206 05:44:28.117353 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:28 crc kubenswrapper[4733]: I1206 05:44:28.117432 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:28 crc kubenswrapper[4733]: I1206 05:44:28.117494 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:28Z","lastTransitionTime":"2025-12-06T05:44:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:28 crc kubenswrapper[4733]: I1206 05:44:28.220489 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:28 crc kubenswrapper[4733]: I1206 05:44:28.220542 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:28 crc kubenswrapper[4733]: I1206 05:44:28.220550 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:28 crc kubenswrapper[4733]: I1206 05:44:28.220560 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:28 crc kubenswrapper[4733]: I1206 05:44:28.220567 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:28Z","lastTransitionTime":"2025-12-06T05:44:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:28 crc kubenswrapper[4733]: I1206 05:44:28.323026 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:28 crc kubenswrapper[4733]: I1206 05:44:28.323061 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:28 crc kubenswrapper[4733]: I1206 05:44:28.323071 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:28 crc kubenswrapper[4733]: I1206 05:44:28.323088 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:28 crc kubenswrapper[4733]: I1206 05:44:28.323099 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:28Z","lastTransitionTime":"2025-12-06T05:44:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:28 crc kubenswrapper[4733]: I1206 05:44:28.425099 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:28 crc kubenswrapper[4733]: I1206 05:44:28.425151 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:28 crc kubenswrapper[4733]: I1206 05:44:28.425161 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:28 crc kubenswrapper[4733]: I1206 05:44:28.425178 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:28 crc kubenswrapper[4733]: I1206 05:44:28.425189 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:28Z","lastTransitionTime":"2025-12-06T05:44:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:28 crc kubenswrapper[4733]: I1206 05:44:28.483953 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:44:28 crc kubenswrapper[4733]: I1206 05:44:28.483990 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 05:44:28 crc kubenswrapper[4733]: I1206 05:44:28.484066 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:44:28 crc kubenswrapper[4733]: E1206 05:44:28.484064 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 05:44:28 crc kubenswrapper[4733]: I1206 05:44:28.484149 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8fw28" Dec 06 05:44:28 crc kubenswrapper[4733]: E1206 05:44:28.484296 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 05:44:28 crc kubenswrapper[4733]: E1206 05:44:28.484369 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 05:44:28 crc kubenswrapper[4733]: E1206 05:44:28.484521 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8fw28" podUID="7e8909c1-5ab7-4c3f-aba1-436c64849e8a" Dec 06 05:44:28 crc kubenswrapper[4733]: I1206 05:44:28.526473 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:28 crc kubenswrapper[4733]: I1206 05:44:28.526497 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:28 crc kubenswrapper[4733]: I1206 05:44:28.526505 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:28 crc kubenswrapper[4733]: I1206 05:44:28.526515 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:28 crc kubenswrapper[4733]: I1206 05:44:28.526724 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:28Z","lastTransitionTime":"2025-12-06T05:44:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:28 crc kubenswrapper[4733]: I1206 05:44:28.628947 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:28 crc kubenswrapper[4733]: I1206 05:44:28.628981 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:28 crc kubenswrapper[4733]: I1206 05:44:28.628992 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:28 crc kubenswrapper[4733]: I1206 05:44:28.629006 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:28 crc kubenswrapper[4733]: I1206 05:44:28.629017 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:28Z","lastTransitionTime":"2025-12-06T05:44:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:28 crc kubenswrapper[4733]: I1206 05:44:28.730731 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:28 crc kubenswrapper[4733]: I1206 05:44:28.730778 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:28 crc kubenswrapper[4733]: I1206 05:44:28.730792 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:28 crc kubenswrapper[4733]: I1206 05:44:28.730809 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:28 crc kubenswrapper[4733]: I1206 05:44:28.730821 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:28Z","lastTransitionTime":"2025-12-06T05:44:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:28 crc kubenswrapper[4733]: I1206 05:44:28.832819 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:28 crc kubenswrapper[4733]: I1206 05:44:28.832869 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:28 crc kubenswrapper[4733]: I1206 05:44:28.832877 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:28 crc kubenswrapper[4733]: I1206 05:44:28.832887 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:28 crc kubenswrapper[4733]: I1206 05:44:28.832895 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:28Z","lastTransitionTime":"2025-12-06T05:44:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:28 crc kubenswrapper[4733]: I1206 05:44:28.934445 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:28 crc kubenswrapper[4733]: I1206 05:44:28.934479 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:28 crc kubenswrapper[4733]: I1206 05:44:28.934488 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:28 crc kubenswrapper[4733]: I1206 05:44:28.934500 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:28 crc kubenswrapper[4733]: I1206 05:44:28.934509 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:28Z","lastTransitionTime":"2025-12-06T05:44:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:29 crc kubenswrapper[4733]: I1206 05:44:29.036456 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:29 crc kubenswrapper[4733]: I1206 05:44:29.036484 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:29 crc kubenswrapper[4733]: I1206 05:44:29.036493 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:29 crc kubenswrapper[4733]: I1206 05:44:29.036503 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:29 crc kubenswrapper[4733]: I1206 05:44:29.036511 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:29Z","lastTransitionTime":"2025-12-06T05:44:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:29 crc kubenswrapper[4733]: I1206 05:44:29.138457 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:29 crc kubenswrapper[4733]: I1206 05:44:29.138487 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:29 crc kubenswrapper[4733]: I1206 05:44:29.138495 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:29 crc kubenswrapper[4733]: I1206 05:44:29.138525 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:29 crc kubenswrapper[4733]: I1206 05:44:29.138533 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:29Z","lastTransitionTime":"2025-12-06T05:44:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:29 crc kubenswrapper[4733]: I1206 05:44:29.240904 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:29 crc kubenswrapper[4733]: I1206 05:44:29.240930 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:29 crc kubenswrapper[4733]: I1206 05:44:29.240940 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:29 crc kubenswrapper[4733]: I1206 05:44:29.240972 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:29 crc kubenswrapper[4733]: I1206 05:44:29.240980 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:29Z","lastTransitionTime":"2025-12-06T05:44:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:29 crc kubenswrapper[4733]: I1206 05:44:29.343608 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:29 crc kubenswrapper[4733]: I1206 05:44:29.343636 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:29 crc kubenswrapper[4733]: I1206 05:44:29.343644 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:29 crc kubenswrapper[4733]: I1206 05:44:29.343658 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:29 crc kubenswrapper[4733]: I1206 05:44:29.343667 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:29Z","lastTransitionTime":"2025-12-06T05:44:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:29 crc kubenswrapper[4733]: I1206 05:44:29.445337 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:29 crc kubenswrapper[4733]: I1206 05:44:29.445365 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:29 crc kubenswrapper[4733]: I1206 05:44:29.445374 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:29 crc kubenswrapper[4733]: I1206 05:44:29.445408 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:29 crc kubenswrapper[4733]: I1206 05:44:29.445417 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:29Z","lastTransitionTime":"2025-12-06T05:44:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:29 crc kubenswrapper[4733]: I1206 05:44:29.484628 4733 scope.go:117] "RemoveContainer" containerID="19ed6e23b5df7eda4e1271f8c8ff0b9202270a73a4aa074b3625fcbc0114470c" Dec 06 05:44:29 crc kubenswrapper[4733]: E1206 05:44:29.484749 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-2gb79_openshift-ovn-kubernetes(171aa174-9338-4421-8393-9e23fbab7f1e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" podUID="171aa174-9338-4421-8393-9e23fbab7f1e" Dec 06 05:44:29 crc kubenswrapper[4733]: I1206 05:44:29.547135 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:29 crc kubenswrapper[4733]: I1206 05:44:29.547165 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:29 crc kubenswrapper[4733]: I1206 05:44:29.547173 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:29 crc kubenswrapper[4733]: I1206 05:44:29.547185 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:29 crc kubenswrapper[4733]: I1206 05:44:29.547194 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:29Z","lastTransitionTime":"2025-12-06T05:44:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:29 crc kubenswrapper[4733]: I1206 05:44:29.649412 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:29 crc kubenswrapper[4733]: I1206 05:44:29.649441 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:29 crc kubenswrapper[4733]: I1206 05:44:29.649449 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:29 crc kubenswrapper[4733]: I1206 05:44:29.649462 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:29 crc kubenswrapper[4733]: I1206 05:44:29.649471 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:29Z","lastTransitionTime":"2025-12-06T05:44:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:29 crc kubenswrapper[4733]: I1206 05:44:29.751114 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:29 crc kubenswrapper[4733]: I1206 05:44:29.751167 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:29 crc kubenswrapper[4733]: I1206 05:44:29.751176 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:29 crc kubenswrapper[4733]: I1206 05:44:29.751188 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:29 crc kubenswrapper[4733]: I1206 05:44:29.751198 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:29Z","lastTransitionTime":"2025-12-06T05:44:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:29 crc kubenswrapper[4733]: I1206 05:44:29.853548 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:29 crc kubenswrapper[4733]: I1206 05:44:29.853583 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:29 crc kubenswrapper[4733]: I1206 05:44:29.853592 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:29 crc kubenswrapper[4733]: I1206 05:44:29.853605 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:29 crc kubenswrapper[4733]: I1206 05:44:29.853614 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:29Z","lastTransitionTime":"2025-12-06T05:44:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:29 crc kubenswrapper[4733]: I1206 05:44:29.955477 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:29 crc kubenswrapper[4733]: I1206 05:44:29.955512 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:29 crc kubenswrapper[4733]: I1206 05:44:29.955523 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:29 crc kubenswrapper[4733]: I1206 05:44:29.955534 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:29 crc kubenswrapper[4733]: I1206 05:44:29.955544 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:29Z","lastTransitionTime":"2025-12-06T05:44:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:30 crc kubenswrapper[4733]: I1206 05:44:30.057749 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:30 crc kubenswrapper[4733]: I1206 05:44:30.057780 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:30 crc kubenswrapper[4733]: I1206 05:44:30.057792 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:30 crc kubenswrapper[4733]: I1206 05:44:30.057803 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:30 crc kubenswrapper[4733]: I1206 05:44:30.057813 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:30Z","lastTransitionTime":"2025-12-06T05:44:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:30 crc kubenswrapper[4733]: I1206 05:44:30.160089 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:30 crc kubenswrapper[4733]: I1206 05:44:30.160117 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:30 crc kubenswrapper[4733]: I1206 05:44:30.160125 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:30 crc kubenswrapper[4733]: I1206 05:44:30.160135 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:30 crc kubenswrapper[4733]: I1206 05:44:30.160143 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:30Z","lastTransitionTime":"2025-12-06T05:44:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:30 crc kubenswrapper[4733]: I1206 05:44:30.261735 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:30 crc kubenswrapper[4733]: I1206 05:44:30.261785 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:30 crc kubenswrapper[4733]: I1206 05:44:30.261800 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:30 crc kubenswrapper[4733]: I1206 05:44:30.261816 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:30 crc kubenswrapper[4733]: I1206 05:44:30.261831 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:30Z","lastTransitionTime":"2025-12-06T05:44:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:30 crc kubenswrapper[4733]: I1206 05:44:30.364086 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:30 crc kubenswrapper[4733]: I1206 05:44:30.364132 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:30 crc kubenswrapper[4733]: I1206 05:44:30.364160 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:30 crc kubenswrapper[4733]: I1206 05:44:30.364178 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:30 crc kubenswrapper[4733]: I1206 05:44:30.364188 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:30Z","lastTransitionTime":"2025-12-06T05:44:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:30 crc kubenswrapper[4733]: I1206 05:44:30.466089 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:30 crc kubenswrapper[4733]: I1206 05:44:30.466115 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:30 crc kubenswrapper[4733]: I1206 05:44:30.466125 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:30 crc kubenswrapper[4733]: I1206 05:44:30.466137 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:30 crc kubenswrapper[4733]: I1206 05:44:30.466145 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:30Z","lastTransitionTime":"2025-12-06T05:44:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:30 crc kubenswrapper[4733]: I1206 05:44:30.484134 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:44:30 crc kubenswrapper[4733]: I1206 05:44:30.484166 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:44:30 crc kubenswrapper[4733]: I1206 05:44:30.484166 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8fw28" Dec 06 05:44:30 crc kubenswrapper[4733]: I1206 05:44:30.484283 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 05:44:30 crc kubenswrapper[4733]: E1206 05:44:30.484274 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 05:44:30 crc kubenswrapper[4733]: E1206 05:44:30.484428 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 05:44:30 crc kubenswrapper[4733]: E1206 05:44:30.484496 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 05:44:30 crc kubenswrapper[4733]: E1206 05:44:30.484603 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8fw28" podUID="7e8909c1-5ab7-4c3f-aba1-436c64849e8a" Dec 06 05:44:30 crc kubenswrapper[4733]: I1206 05:44:30.568287 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:30 crc kubenswrapper[4733]: I1206 05:44:30.568347 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:30 crc kubenswrapper[4733]: I1206 05:44:30.568359 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:30 crc kubenswrapper[4733]: I1206 05:44:30.568371 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:30 crc kubenswrapper[4733]: I1206 05:44:30.568379 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:30Z","lastTransitionTime":"2025-12-06T05:44:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:30 crc kubenswrapper[4733]: I1206 05:44:30.669840 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:30 crc kubenswrapper[4733]: I1206 05:44:30.669879 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:30 crc kubenswrapper[4733]: I1206 05:44:30.669890 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:30 crc kubenswrapper[4733]: I1206 05:44:30.669905 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:30 crc kubenswrapper[4733]: I1206 05:44:30.669915 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:30Z","lastTransitionTime":"2025-12-06T05:44:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:30 crc kubenswrapper[4733]: I1206 05:44:30.771327 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:30 crc kubenswrapper[4733]: I1206 05:44:30.771357 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:30 crc kubenswrapper[4733]: I1206 05:44:30.771365 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:30 crc kubenswrapper[4733]: I1206 05:44:30.771378 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:30 crc kubenswrapper[4733]: I1206 05:44:30.771397 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:30Z","lastTransitionTime":"2025-12-06T05:44:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:30 crc kubenswrapper[4733]: I1206 05:44:30.873080 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:30 crc kubenswrapper[4733]: I1206 05:44:30.873110 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:30 crc kubenswrapper[4733]: I1206 05:44:30.873120 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:30 crc kubenswrapper[4733]: I1206 05:44:30.873131 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:30 crc kubenswrapper[4733]: I1206 05:44:30.873154 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:30Z","lastTransitionTime":"2025-12-06T05:44:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:30 crc kubenswrapper[4733]: I1206 05:44:30.975101 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:30 crc kubenswrapper[4733]: I1206 05:44:30.975150 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:30 crc kubenswrapper[4733]: I1206 05:44:30.975162 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:30 crc kubenswrapper[4733]: I1206 05:44:30.975174 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:30 crc kubenswrapper[4733]: I1206 05:44:30.975185 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:30Z","lastTransitionTime":"2025-12-06T05:44:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:31 crc kubenswrapper[4733]: I1206 05:44:31.076901 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:31 crc kubenswrapper[4733]: I1206 05:44:31.076925 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:31 crc kubenswrapper[4733]: I1206 05:44:31.076933 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:31 crc kubenswrapper[4733]: I1206 05:44:31.076944 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:31 crc kubenswrapper[4733]: I1206 05:44:31.076951 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:31Z","lastTransitionTime":"2025-12-06T05:44:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:31 crc kubenswrapper[4733]: I1206 05:44:31.178703 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:31 crc kubenswrapper[4733]: I1206 05:44:31.178739 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:31 crc kubenswrapper[4733]: I1206 05:44:31.178748 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:31 crc kubenswrapper[4733]: I1206 05:44:31.178764 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:31 crc kubenswrapper[4733]: I1206 05:44:31.178772 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:31Z","lastTransitionTime":"2025-12-06T05:44:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:31 crc kubenswrapper[4733]: I1206 05:44:31.280427 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:31 crc kubenswrapper[4733]: I1206 05:44:31.280457 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:31 crc kubenswrapper[4733]: I1206 05:44:31.280465 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:31 crc kubenswrapper[4733]: I1206 05:44:31.280478 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:31 crc kubenswrapper[4733]: I1206 05:44:31.280486 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:31Z","lastTransitionTime":"2025-12-06T05:44:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:31 crc kubenswrapper[4733]: I1206 05:44:31.382578 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:31 crc kubenswrapper[4733]: I1206 05:44:31.382613 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:31 crc kubenswrapper[4733]: I1206 05:44:31.382623 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:31 crc kubenswrapper[4733]: I1206 05:44:31.382636 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:31 crc kubenswrapper[4733]: I1206 05:44:31.382645 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:31Z","lastTransitionTime":"2025-12-06T05:44:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:31 crc kubenswrapper[4733]: I1206 05:44:31.484543 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:31 crc kubenswrapper[4733]: I1206 05:44:31.484576 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:31 crc kubenswrapper[4733]: I1206 05:44:31.484585 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:31 crc kubenswrapper[4733]: I1206 05:44:31.484598 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:31 crc kubenswrapper[4733]: I1206 05:44:31.484607 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:31Z","lastTransitionTime":"2025-12-06T05:44:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:31 crc kubenswrapper[4733]: I1206 05:44:31.586540 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:31 crc kubenswrapper[4733]: I1206 05:44:31.586573 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:31 crc kubenswrapper[4733]: I1206 05:44:31.586582 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:31 crc kubenswrapper[4733]: I1206 05:44:31.586594 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:31 crc kubenswrapper[4733]: I1206 05:44:31.586603 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:31Z","lastTransitionTime":"2025-12-06T05:44:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:31 crc kubenswrapper[4733]: I1206 05:44:31.688096 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:31 crc kubenswrapper[4733]: I1206 05:44:31.688129 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:31 crc kubenswrapper[4733]: I1206 05:44:31.688137 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:31 crc kubenswrapper[4733]: I1206 05:44:31.688148 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:31 crc kubenswrapper[4733]: I1206 05:44:31.688159 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:31Z","lastTransitionTime":"2025-12-06T05:44:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:31 crc kubenswrapper[4733]: I1206 05:44:31.790259 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:31 crc kubenswrapper[4733]: I1206 05:44:31.790290 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:31 crc kubenswrapper[4733]: I1206 05:44:31.790299 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:31 crc kubenswrapper[4733]: I1206 05:44:31.790327 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:31 crc kubenswrapper[4733]: I1206 05:44:31.790337 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:31Z","lastTransitionTime":"2025-12-06T05:44:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:31 crc kubenswrapper[4733]: I1206 05:44:31.892276 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:31 crc kubenswrapper[4733]: I1206 05:44:31.892337 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:31 crc kubenswrapper[4733]: I1206 05:44:31.892349 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:31 crc kubenswrapper[4733]: I1206 05:44:31.892365 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:31 crc kubenswrapper[4733]: I1206 05:44:31.892376 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:31Z","lastTransitionTime":"2025-12-06T05:44:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:31 crc kubenswrapper[4733]: I1206 05:44:31.994670 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:31 crc kubenswrapper[4733]: I1206 05:44:31.994701 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:31 crc kubenswrapper[4733]: I1206 05:44:31.994709 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:31 crc kubenswrapper[4733]: I1206 05:44:31.994720 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:31 crc kubenswrapper[4733]: I1206 05:44:31.994728 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:31Z","lastTransitionTime":"2025-12-06T05:44:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:32 crc kubenswrapper[4733]: I1206 05:44:32.096597 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:32 crc kubenswrapper[4733]: I1206 05:44:32.096632 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:32 crc kubenswrapper[4733]: I1206 05:44:32.096641 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:32 crc kubenswrapper[4733]: I1206 05:44:32.096653 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:32 crc kubenswrapper[4733]: I1206 05:44:32.096661 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:32Z","lastTransitionTime":"2025-12-06T05:44:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:32 crc kubenswrapper[4733]: I1206 05:44:32.198280 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:32 crc kubenswrapper[4733]: I1206 05:44:32.198320 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:32 crc kubenswrapper[4733]: I1206 05:44:32.198330 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:32 crc kubenswrapper[4733]: I1206 05:44:32.198339 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:32 crc kubenswrapper[4733]: I1206 05:44:32.198346 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:32Z","lastTransitionTime":"2025-12-06T05:44:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:32 crc kubenswrapper[4733]: I1206 05:44:32.300415 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:32 crc kubenswrapper[4733]: I1206 05:44:32.300442 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:32 crc kubenswrapper[4733]: I1206 05:44:32.300450 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:32 crc kubenswrapper[4733]: I1206 05:44:32.300459 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:32 crc kubenswrapper[4733]: I1206 05:44:32.300467 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:32Z","lastTransitionTime":"2025-12-06T05:44:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:32 crc kubenswrapper[4733]: I1206 05:44:32.402206 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:32 crc kubenswrapper[4733]: I1206 05:44:32.402239 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:32 crc kubenswrapper[4733]: I1206 05:44:32.402249 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:32 crc kubenswrapper[4733]: I1206 05:44:32.402262 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:32 crc kubenswrapper[4733]: I1206 05:44:32.402271 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:32Z","lastTransitionTime":"2025-12-06T05:44:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:32 crc kubenswrapper[4733]: I1206 05:44:32.484331 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:44:32 crc kubenswrapper[4733]: E1206 05:44:32.484434 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 05:44:32 crc kubenswrapper[4733]: I1206 05:44:32.484461 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8fw28" Dec 06 05:44:32 crc kubenswrapper[4733]: I1206 05:44:32.484485 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 05:44:32 crc kubenswrapper[4733]: E1206 05:44:32.484546 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8fw28" podUID="7e8909c1-5ab7-4c3f-aba1-436c64849e8a" Dec 06 05:44:32 crc kubenswrapper[4733]: I1206 05:44:32.484485 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:44:32 crc kubenswrapper[4733]: E1206 05:44:32.484635 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 05:44:32 crc kubenswrapper[4733]: E1206 05:44:32.484669 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 05:44:32 crc kubenswrapper[4733]: I1206 05:44:32.503747 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:32 crc kubenswrapper[4733]: I1206 05:44:32.503784 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:32 crc kubenswrapper[4733]: I1206 05:44:32.503793 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:32 crc kubenswrapper[4733]: I1206 05:44:32.503806 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:32 crc kubenswrapper[4733]: I1206 05:44:32.503815 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:32Z","lastTransitionTime":"2025-12-06T05:44:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:32 crc kubenswrapper[4733]: I1206 05:44:32.605962 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:32 crc kubenswrapper[4733]: I1206 05:44:32.605995 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:32 crc kubenswrapper[4733]: I1206 05:44:32.606006 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:32 crc kubenswrapper[4733]: I1206 05:44:32.606018 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:32 crc kubenswrapper[4733]: I1206 05:44:32.606025 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:32Z","lastTransitionTime":"2025-12-06T05:44:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:32 crc kubenswrapper[4733]: I1206 05:44:32.707910 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:32 crc kubenswrapper[4733]: I1206 05:44:32.707948 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:32 crc kubenswrapper[4733]: I1206 05:44:32.707959 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:32 crc kubenswrapper[4733]: I1206 05:44:32.707972 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:32 crc kubenswrapper[4733]: I1206 05:44:32.707982 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:32Z","lastTransitionTime":"2025-12-06T05:44:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:32 crc kubenswrapper[4733]: I1206 05:44:32.809763 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:32 crc kubenswrapper[4733]: I1206 05:44:32.809796 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:32 crc kubenswrapper[4733]: I1206 05:44:32.809807 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:32 crc kubenswrapper[4733]: I1206 05:44:32.809817 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:32 crc kubenswrapper[4733]: I1206 05:44:32.809828 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:32Z","lastTransitionTime":"2025-12-06T05:44:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:32 crc kubenswrapper[4733]: I1206 05:44:32.912218 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:32 crc kubenswrapper[4733]: I1206 05:44:32.912250 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:32 crc kubenswrapper[4733]: I1206 05:44:32.912258 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:32 crc kubenswrapper[4733]: I1206 05:44:32.912269 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:32 crc kubenswrapper[4733]: I1206 05:44:32.912279 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:32Z","lastTransitionTime":"2025-12-06T05:44:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:33 crc kubenswrapper[4733]: I1206 05:44:33.014086 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:33 crc kubenswrapper[4733]: I1206 05:44:33.014118 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:33 crc kubenswrapper[4733]: I1206 05:44:33.014126 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:33 crc kubenswrapper[4733]: I1206 05:44:33.014138 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:33 crc kubenswrapper[4733]: I1206 05:44:33.014147 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:33Z","lastTransitionTime":"2025-12-06T05:44:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:33 crc kubenswrapper[4733]: I1206 05:44:33.116016 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:33 crc kubenswrapper[4733]: I1206 05:44:33.116054 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:33 crc kubenswrapper[4733]: I1206 05:44:33.116064 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:33 crc kubenswrapper[4733]: I1206 05:44:33.116077 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:33 crc kubenswrapper[4733]: I1206 05:44:33.116086 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:33Z","lastTransitionTime":"2025-12-06T05:44:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:33 crc kubenswrapper[4733]: I1206 05:44:33.218169 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:33 crc kubenswrapper[4733]: I1206 05:44:33.218198 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:33 crc kubenswrapper[4733]: I1206 05:44:33.218209 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:33 crc kubenswrapper[4733]: I1206 05:44:33.218219 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:33 crc kubenswrapper[4733]: I1206 05:44:33.218228 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:33Z","lastTransitionTime":"2025-12-06T05:44:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:33 crc kubenswrapper[4733]: I1206 05:44:33.320286 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:33 crc kubenswrapper[4733]: I1206 05:44:33.320325 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:33 crc kubenswrapper[4733]: I1206 05:44:33.320333 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:33 crc kubenswrapper[4733]: I1206 05:44:33.320344 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:33 crc kubenswrapper[4733]: I1206 05:44:33.320351 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:33Z","lastTransitionTime":"2025-12-06T05:44:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:33 crc kubenswrapper[4733]: I1206 05:44:33.422535 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:33 crc kubenswrapper[4733]: I1206 05:44:33.422563 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:33 crc kubenswrapper[4733]: I1206 05:44:33.422572 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:33 crc kubenswrapper[4733]: I1206 05:44:33.422584 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:33 crc kubenswrapper[4733]: I1206 05:44:33.422593 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:33Z","lastTransitionTime":"2025-12-06T05:44:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:33 crc kubenswrapper[4733]: I1206 05:44:33.524705 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:33 crc kubenswrapper[4733]: I1206 05:44:33.524740 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:33 crc kubenswrapper[4733]: I1206 05:44:33.524748 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:33 crc kubenswrapper[4733]: I1206 05:44:33.524759 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:33 crc kubenswrapper[4733]: I1206 05:44:33.524766 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:33Z","lastTransitionTime":"2025-12-06T05:44:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:33 crc kubenswrapper[4733]: I1206 05:44:33.626405 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:33 crc kubenswrapper[4733]: I1206 05:44:33.626430 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:33 crc kubenswrapper[4733]: I1206 05:44:33.626438 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:33 crc kubenswrapper[4733]: I1206 05:44:33.626449 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:33 crc kubenswrapper[4733]: I1206 05:44:33.626456 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:33Z","lastTransitionTime":"2025-12-06T05:44:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:33 crc kubenswrapper[4733]: I1206 05:44:33.728396 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:33 crc kubenswrapper[4733]: I1206 05:44:33.728450 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:33 crc kubenswrapper[4733]: I1206 05:44:33.728462 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:33 crc kubenswrapper[4733]: I1206 05:44:33.728478 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:33 crc kubenswrapper[4733]: I1206 05:44:33.728493 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:33Z","lastTransitionTime":"2025-12-06T05:44:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:33 crc kubenswrapper[4733]: I1206 05:44:33.830256 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:33 crc kubenswrapper[4733]: I1206 05:44:33.830291 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:33 crc kubenswrapper[4733]: I1206 05:44:33.830338 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:33 crc kubenswrapper[4733]: I1206 05:44:33.830351 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:33 crc kubenswrapper[4733]: I1206 05:44:33.830360 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:33Z","lastTransitionTime":"2025-12-06T05:44:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:33 crc kubenswrapper[4733]: I1206 05:44:33.932088 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:33 crc kubenswrapper[4733]: I1206 05:44:33.932116 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:33 crc kubenswrapper[4733]: I1206 05:44:33.932125 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:33 crc kubenswrapper[4733]: I1206 05:44:33.932154 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:33 crc kubenswrapper[4733]: I1206 05:44:33.932163 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:33Z","lastTransitionTime":"2025-12-06T05:44:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:34 crc kubenswrapper[4733]: I1206 05:44:34.034414 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:34 crc kubenswrapper[4733]: I1206 05:44:34.034441 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:34 crc kubenswrapper[4733]: I1206 05:44:34.034449 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:34 crc kubenswrapper[4733]: I1206 05:44:34.034459 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:34 crc kubenswrapper[4733]: I1206 05:44:34.034468 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:34Z","lastTransitionTime":"2025-12-06T05:44:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:34 crc kubenswrapper[4733]: I1206 05:44:34.135898 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:34 crc kubenswrapper[4733]: I1206 05:44:34.135928 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:34 crc kubenswrapper[4733]: I1206 05:44:34.135938 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:34 crc kubenswrapper[4733]: I1206 05:44:34.135952 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:34 crc kubenswrapper[4733]: I1206 05:44:34.135960 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:34Z","lastTransitionTime":"2025-12-06T05:44:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:34 crc kubenswrapper[4733]: I1206 05:44:34.237894 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:34 crc kubenswrapper[4733]: I1206 05:44:34.237936 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:34 crc kubenswrapper[4733]: I1206 05:44:34.237946 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:34 crc kubenswrapper[4733]: I1206 05:44:34.237961 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:34 crc kubenswrapper[4733]: I1206 05:44:34.237971 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:34Z","lastTransitionTime":"2025-12-06T05:44:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:34 crc kubenswrapper[4733]: I1206 05:44:34.339470 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:34 crc kubenswrapper[4733]: I1206 05:44:34.339501 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:34 crc kubenswrapper[4733]: I1206 05:44:34.339511 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:34 crc kubenswrapper[4733]: I1206 05:44:34.339522 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:34 crc kubenswrapper[4733]: I1206 05:44:34.339529 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:34Z","lastTransitionTime":"2025-12-06T05:44:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:34 crc kubenswrapper[4733]: I1206 05:44:34.441516 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:34 crc kubenswrapper[4733]: I1206 05:44:34.441565 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:34 crc kubenswrapper[4733]: I1206 05:44:34.441574 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:34 crc kubenswrapper[4733]: I1206 05:44:34.441590 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:34 crc kubenswrapper[4733]: I1206 05:44:34.441616 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:34Z","lastTransitionTime":"2025-12-06T05:44:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:34 crc kubenswrapper[4733]: I1206 05:44:34.484407 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:44:34 crc kubenswrapper[4733]: I1206 05:44:34.484462 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 05:44:34 crc kubenswrapper[4733]: I1206 05:44:34.484470 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:44:34 crc kubenswrapper[4733]: I1206 05:44:34.484536 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8fw28" Dec 06 05:44:34 crc kubenswrapper[4733]: E1206 05:44:34.484616 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 05:44:34 crc kubenswrapper[4733]: E1206 05:44:34.484680 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 05:44:34 crc kubenswrapper[4733]: E1206 05:44:34.484922 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 05:44:34 crc kubenswrapper[4733]: E1206 05:44:34.485038 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8fw28" podUID="7e8909c1-5ab7-4c3f-aba1-436c64849e8a" Dec 06 05:44:34 crc kubenswrapper[4733]: I1206 05:44:34.543389 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:34 crc kubenswrapper[4733]: I1206 05:44:34.543413 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:34 crc kubenswrapper[4733]: I1206 05:44:34.543422 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:34 crc kubenswrapper[4733]: I1206 05:44:34.543435 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:34 crc kubenswrapper[4733]: I1206 05:44:34.543443 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:34Z","lastTransitionTime":"2025-12-06T05:44:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:34 crc kubenswrapper[4733]: I1206 05:44:34.644573 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:34 crc kubenswrapper[4733]: I1206 05:44:34.644607 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:34 crc kubenswrapper[4733]: I1206 05:44:34.644618 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:34 crc kubenswrapper[4733]: I1206 05:44:34.644632 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:34 crc kubenswrapper[4733]: I1206 05:44:34.644642 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:34Z","lastTransitionTime":"2025-12-06T05:44:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:34 crc kubenswrapper[4733]: I1206 05:44:34.746463 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:34 crc kubenswrapper[4733]: I1206 05:44:34.746493 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:34 crc kubenswrapper[4733]: I1206 05:44:34.746502 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:34 crc kubenswrapper[4733]: I1206 05:44:34.746513 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:34 crc kubenswrapper[4733]: I1206 05:44:34.746522 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:34Z","lastTransitionTime":"2025-12-06T05:44:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:34 crc kubenswrapper[4733]: I1206 05:44:34.848414 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:34 crc kubenswrapper[4733]: I1206 05:44:34.848445 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:34 crc kubenswrapper[4733]: I1206 05:44:34.848453 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:34 crc kubenswrapper[4733]: I1206 05:44:34.848465 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:34 crc kubenswrapper[4733]: I1206 05:44:34.848474 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:34Z","lastTransitionTime":"2025-12-06T05:44:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:34 crc kubenswrapper[4733]: I1206 05:44:34.949713 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:34 crc kubenswrapper[4733]: I1206 05:44:34.949743 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:34 crc kubenswrapper[4733]: I1206 05:44:34.949754 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:34 crc kubenswrapper[4733]: I1206 05:44:34.949767 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:34 crc kubenswrapper[4733]: I1206 05:44:34.949776 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:34Z","lastTransitionTime":"2025-12-06T05:44:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:35 crc kubenswrapper[4733]: I1206 05:44:35.050885 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:35 crc kubenswrapper[4733]: I1206 05:44:35.050912 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:35 crc kubenswrapper[4733]: I1206 05:44:35.050919 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:35 crc kubenswrapper[4733]: I1206 05:44:35.050930 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:35 crc kubenswrapper[4733]: I1206 05:44:35.050937 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:35Z","lastTransitionTime":"2025-12-06T05:44:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:35 crc kubenswrapper[4733]: I1206 05:44:35.152747 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:35 crc kubenswrapper[4733]: I1206 05:44:35.152775 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:35 crc kubenswrapper[4733]: I1206 05:44:35.152784 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:35 crc kubenswrapper[4733]: I1206 05:44:35.152794 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:35 crc kubenswrapper[4733]: I1206 05:44:35.152802 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:35Z","lastTransitionTime":"2025-12-06T05:44:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:35 crc kubenswrapper[4733]: I1206 05:44:35.254584 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:35 crc kubenswrapper[4733]: I1206 05:44:35.254612 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:35 crc kubenswrapper[4733]: I1206 05:44:35.254621 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:35 crc kubenswrapper[4733]: I1206 05:44:35.254630 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:35 crc kubenswrapper[4733]: I1206 05:44:35.254637 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:35Z","lastTransitionTime":"2025-12-06T05:44:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:35 crc kubenswrapper[4733]: I1206 05:44:35.356330 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:35 crc kubenswrapper[4733]: I1206 05:44:35.356357 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:35 crc kubenswrapper[4733]: I1206 05:44:35.356365 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:35 crc kubenswrapper[4733]: I1206 05:44:35.356387 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:35 crc kubenswrapper[4733]: I1206 05:44:35.356394 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:35Z","lastTransitionTime":"2025-12-06T05:44:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:35 crc kubenswrapper[4733]: I1206 05:44:35.458557 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:35 crc kubenswrapper[4733]: I1206 05:44:35.458612 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:35 crc kubenswrapper[4733]: I1206 05:44:35.458622 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:35 crc kubenswrapper[4733]: I1206 05:44:35.458639 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:35 crc kubenswrapper[4733]: I1206 05:44:35.458649 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:35Z","lastTransitionTime":"2025-12-06T05:44:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:35 crc kubenswrapper[4733]: I1206 05:44:35.560456 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:35 crc kubenswrapper[4733]: I1206 05:44:35.560489 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:35 crc kubenswrapper[4733]: I1206 05:44:35.560497 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:35 crc kubenswrapper[4733]: I1206 05:44:35.560506 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:35 crc kubenswrapper[4733]: I1206 05:44:35.560514 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:35Z","lastTransitionTime":"2025-12-06T05:44:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:35 crc kubenswrapper[4733]: I1206 05:44:35.662079 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:35 crc kubenswrapper[4733]: I1206 05:44:35.662137 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:35 crc kubenswrapper[4733]: I1206 05:44:35.662146 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:35 crc kubenswrapper[4733]: I1206 05:44:35.662157 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:35 crc kubenswrapper[4733]: I1206 05:44:35.662165 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:35Z","lastTransitionTime":"2025-12-06T05:44:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:35 crc kubenswrapper[4733]: I1206 05:44:35.763961 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:35 crc kubenswrapper[4733]: I1206 05:44:35.763993 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:35 crc kubenswrapper[4733]: I1206 05:44:35.764026 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:35 crc kubenswrapper[4733]: I1206 05:44:35.764039 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:35 crc kubenswrapper[4733]: I1206 05:44:35.764048 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:35Z","lastTransitionTime":"2025-12-06T05:44:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:35 crc kubenswrapper[4733]: I1206 05:44:35.865641 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:35 crc kubenswrapper[4733]: I1206 05:44:35.865672 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:35 crc kubenswrapper[4733]: I1206 05:44:35.865681 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:35 crc kubenswrapper[4733]: I1206 05:44:35.865696 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:35 crc kubenswrapper[4733]: I1206 05:44:35.865708 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:35Z","lastTransitionTime":"2025-12-06T05:44:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:35 crc kubenswrapper[4733]: I1206 05:44:35.967676 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:35 crc kubenswrapper[4733]: I1206 05:44:35.967711 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:35 crc kubenswrapper[4733]: I1206 05:44:35.967719 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:35 crc kubenswrapper[4733]: I1206 05:44:35.967733 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:35 crc kubenswrapper[4733]: I1206 05:44:35.967743 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:35Z","lastTransitionTime":"2025-12-06T05:44:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.069776 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.069824 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.069833 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.069844 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.069852 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:36Z","lastTransitionTime":"2025-12-06T05:44:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.171672 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.171700 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.171708 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.171720 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.171729 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:36Z","lastTransitionTime":"2025-12-06T05:44:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.273216 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.273248 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.273258 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.273271 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.273282 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:36Z","lastTransitionTime":"2025-12-06T05:44:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.375039 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.375069 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.375084 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.375097 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.375106 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:36Z","lastTransitionTime":"2025-12-06T05:44:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.476768 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.476808 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.476817 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.476831 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.476842 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:36Z","lastTransitionTime":"2025-12-06T05:44:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.483735 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.483751 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8fw28" Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.483735 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:44:36 crc kubenswrapper[4733]: E1206 05:44:36.483822 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.483850 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:44:36 crc kubenswrapper[4733]: E1206 05:44:36.483899 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 05:44:36 crc kubenswrapper[4733]: E1206 05:44:36.483960 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8fw28" podUID="7e8909c1-5ab7-4c3f-aba1-436c64849e8a" Dec 06 05:44:36 crc kubenswrapper[4733]: E1206 05:44:36.484038 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.494761 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c145932d-56db-49da-ab40-1f9faeaa004e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a89503b511d9f2da9fb5e41e1adb5f5c60e14909aebd4495baafc709177fa56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd2bcad3ce23a8998a578ecc373a4e8028eefab1e056cf1081eb2406ff9398f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382d71a067b68d67891c063f0a4c833b7433e15db0e05b36e46f24bbbb1626ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b838411bb65919138a421cd17775561b7764a006894daa8f2bed711287c1914\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:36Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.502615 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:36Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.513029 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5mf9m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d7ccbf-e88d-4045-8d89-633470de7aca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dffbae27a10ae2e00933637da0e30fc5b8574f2ee8edb5b4b09c37a2d05e980a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2609f7ad60b4f90d844d4f4d8573587826cbdf4c0b76f6b8a1b5cddec86ad7d0\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2609f7ad60b4f90d844d4f4d8573587826cbdf4c0b76f6b8a1b5cddec86ad7d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ef7c618da4d94a4956f082f96b9be994042458ff524e9e1172f526a4135e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93ef7c618da4d94a4956f082f96b9be994042458ff524e9e1172f526a4135e1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c91a8199b1f8ede480f2bd92335fe3c8dc0d0e11caa2cf3bd213c234d0779f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c91a8199b1f8ede480f2bd92335fe3c8dc0d0e11caa2cf3bd213c234d0779f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://047dc
4e7f8f30d1f9cf824ee4059c99c07cd9f29bd985e0e00ac22febb297f1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://047dc4e7f8f30d1f9cf824ee4059c99c07cd9f29bd985e0e00ac22febb297f1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cadff80e27f4e0103110c153c52936b931bfd70ca4363a3caa44ec4f746d01dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadff80e27f4e0103110c153c52936b931bfd70ca4363a3caa44ec4f746d01dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:59Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e4034c91b0b19898468eccdc22e059ad7e830ef9e4ff0bea88d447f6a09c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16e4034c91b0b19898468eccdc22e059ad7e830ef9e4ff0bea88d447f6a09c64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5mf9m\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:36Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.520250 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q2ktk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e24a9e84-0151-4204-9391-510da9049b58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aae69842996fcf4d62a14e1cc73b68f2326287d0fa75d4587acb47862b1d40bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh6vd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a1c3268a5ca5c4c35865c8ff8f700686db8f5c2889152aabe27a36b1ccd9082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh6vd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:44:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q2ktk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-06T05:44:36Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.527683 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faa17b3f3dd91488b73e0e7f3101c5e9932dd0c1573946bbd91819f1ec51202e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:36Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.534119 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqsfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25abcf60-fe34-446b-9df8-1ed8e5102975\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://163c90ba7e6470fb31049cd650d1384d35d87b94a9193184bfe3ea16feddf307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb5ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqsfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:36Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.541774 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnxdh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d5c4ca7-33ee-4858-948f-631753eb056e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f4a50e7cb4197e088c193a3bedc8acb2720a885e588e56051fbfa1e102099e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrbr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnxdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:36Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.550800 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fedfcd9f-fa28-4efb-9677-e24a6dae9c04\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7a98fd30a5052ebe2872dd5e1c7f44e9ed9019ad8662a687a9a9a39acce3627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e214c308f89a818305483c9dc2980b09c41c963bd5df5c91d56a1f8e47dd8ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1a658a854294c1c7b43ab8c1bd56969065a6c630a68b2c39366fd243ebd7af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7acc4267cfa0a489d59bdc4c37f12356e6a053e6cd477a87a38816bf71539ce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7acc4267cfa0a489d59bdc4c37f12356e6a053e6cd477a87a38816bf71539ce1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:36Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.558498 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8e8d7d591deb47598776511be462724fabc5543e82b6a74edfc29fb01ccb977\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:36Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.570527 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77d63bf154094eece4d04d42186bc7f957f0b1ab0315c496bb8a785269184ed0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://cdccf2a58baf2a39276908ed60c86219657d8780a50630c20be6f8bc4c256fbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:36Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.578091 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8fw28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e8909c1-5ab7-4c3f-aba1-436c64849e8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47p7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47p7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:44:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8fw28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:36Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:36 crc 
kubenswrapper[4733]: I1206 05:44:36.578610 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.578638 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.578647 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.578660 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.578670 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:36Z","lastTransitionTime":"2025-12-06T05:44:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.586636 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0700e329-54b6-4cfe-b2de-5cee58cf1aa5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32c4d87738481c8df3d76e820a98f3dacfbc11edc26fab1dfe51b56d207168d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57cbb938bc4ae9b8a71a1e2369a50a243964fc8c683d2d1840f1f3e199f1b923\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eeebbb46cf11d2306ad457106c3b2179039986bfdd412c4bb64791d86edb4e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://801ea1b9ed221d20f0d729436b8f5f1946df6e66f06aa86db5764f18da3f0b1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe65f4b55b8e8ed93d424276f1fc06f31770302538e5122a5b09da36734d86dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 05:43:48.722254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 05:43:48.730728 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3849141372/tls.crt::/tmp/serving-cert-3849141372/tls.key\\\\\\\"\\\\nI1206 05:43:54.083506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 05:43:54.085960 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 05:43:54.085979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 05:43:54.086001 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 05:43:54.086006 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 05:43:54.089093 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 05:43:54.089162 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089190 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089211 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 05:43:54.089229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 05:43:54.089245 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 05:43:54.089261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 05:43:54.089103 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 05:43:54.090706 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8edc1fd8220a58b6a3f6d08d6d003c6d350fa69588866d84de63f95ecd4367f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:36Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.594597 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:36Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.602458 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:36Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.610745 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-684r5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc59542d-ee4a-414d-b096-86716cb56db5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7128ab1b2f48b8ce3ecf3a2154cb1b1dc93a58cdfed2c11e7724201a5675ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbfjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-684r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:36Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.617645 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77ebef5bd728c37a6b74ab523c480048959280fdfc9afd8c60b2aca9cd05336d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61a23652af66
be599ba9357cb31709e7b4a3f0e4767c758617e6cc5cd9b43941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g7qjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:36Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.629722 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"171aa174-9338-4421-8393-9e23fbab7f1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a697c5a28f2c415b6f133c1c3bdaff0915418e3fcf0c889af0a822e1bdcbcc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://532faf6ec4021a35746a236a1ded78eccc9d71728c149f73c4263068b6951490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://456b5bd863b30c044246c6c8fe15ee7344ad053861724b5c42b88479578b9adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77216800c2b9bc04724591a5d5c5d4c9ddb9a75fcbc198c60800199a92db6f45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d985f342be7dff38ee8a2264a8dae534857e6cb0e7d0cf79b137d2ed6289bf80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88a99335c4d7fca93428173f7e0e096e418e0599ab030dfda10d8da0a5dc17a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ed6e23b5df7eda4e1271f8c8ff0b9202270a73a4aa074b3625fcbc0114470c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ed6e23b5df7eda4e1271f8c8ff0b9202270a73a4aa074b3625fcbc0114470c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T05:44:16Z\\\",\\\"message\\\":\\\"try/node-ca-pqsfd\\\\nI1206 05:44:16.869839 6419 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1206 05:44:16.869841 6419 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-pqsfd\\\\nI1206 05:44:16.869707 6419 services_controller.go:434] Service 
openshift-machine-config-operator/machine-config-controller retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{machine-config-controller openshift-machine-config-operator aa30290d-3a39-43ba-a212-6439bd680987 4486 0 2025-02-23 05:12:25 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:machine-config-controller] map[include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:mcc-proxy-tls service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0075d864b \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:metrics,Protocol:TCP,Port:9001,TargetPort:{0 9001 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{k8s-app: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:44:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2gb79_openshift-ovn-kubernetes(171aa174-9338-4421-8393-9e23fbab7f1e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9980ec9b2b1a751a691d1f657a2176d49a7583906d741adbe3754ec4c73b152c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://667a77abe2226e4438
47d05fc2475438095a30648777a424239d3f35c199b236\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2gb79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:36Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.680189 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.680222 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.680230 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.680241 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.680250 4733 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:36Z","lastTransitionTime":"2025-12-06T05:44:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.781704 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.781736 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.781744 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.781754 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.781761 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:36Z","lastTransitionTime":"2025-12-06T05:44:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.883929 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.883973 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.883983 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.883998 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.884015 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:36Z","lastTransitionTime":"2025-12-06T05:44:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.986244 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.986279 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.986289 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.986319 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:36 crc kubenswrapper[4733]: I1206 05:44:36.986330 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:36Z","lastTransitionTime":"2025-12-06T05:44:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:37 crc kubenswrapper[4733]: I1206 05:44:37.088223 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:37 crc kubenswrapper[4733]: I1206 05:44:37.088258 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:37 crc kubenswrapper[4733]: I1206 05:44:37.088267 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:37 crc kubenswrapper[4733]: I1206 05:44:37.088278 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:37 crc kubenswrapper[4733]: I1206 05:44:37.088286 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:37Z","lastTransitionTime":"2025-12-06T05:44:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:37 crc kubenswrapper[4733]: I1206 05:44:37.190573 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:37 crc kubenswrapper[4733]: I1206 05:44:37.190622 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:37 crc kubenswrapper[4733]: I1206 05:44:37.190632 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:37 crc kubenswrapper[4733]: I1206 05:44:37.190650 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:37 crc kubenswrapper[4733]: I1206 05:44:37.190661 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:37Z","lastTransitionTime":"2025-12-06T05:44:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:37 crc kubenswrapper[4733]: I1206 05:44:37.292703 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:37 crc kubenswrapper[4733]: I1206 05:44:37.292952 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:37 crc kubenswrapper[4733]: I1206 05:44:37.292962 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:37 crc kubenswrapper[4733]: I1206 05:44:37.292977 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:37 crc kubenswrapper[4733]: I1206 05:44:37.292991 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:37Z","lastTransitionTime":"2025-12-06T05:44:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:37 crc kubenswrapper[4733]: I1206 05:44:37.395389 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:37 crc kubenswrapper[4733]: I1206 05:44:37.395420 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:37 crc kubenswrapper[4733]: I1206 05:44:37.395429 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:37 crc kubenswrapper[4733]: I1206 05:44:37.395442 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:37 crc kubenswrapper[4733]: I1206 05:44:37.395451 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:37Z","lastTransitionTime":"2025-12-06T05:44:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:37 crc kubenswrapper[4733]: I1206 05:44:37.496895 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:37 crc kubenswrapper[4733]: I1206 05:44:37.496927 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:37 crc kubenswrapper[4733]: I1206 05:44:37.496936 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:37 crc kubenswrapper[4733]: I1206 05:44:37.496951 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:37 crc kubenswrapper[4733]: I1206 05:44:37.496959 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:37Z","lastTransitionTime":"2025-12-06T05:44:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:37 crc kubenswrapper[4733]: I1206 05:44:37.598996 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:37 crc kubenswrapper[4733]: I1206 05:44:37.599026 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:37 crc kubenswrapper[4733]: I1206 05:44:37.599036 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:37 crc kubenswrapper[4733]: I1206 05:44:37.599047 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:37 crc kubenswrapper[4733]: I1206 05:44:37.599055 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:37Z","lastTransitionTime":"2025-12-06T05:44:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:37 crc kubenswrapper[4733]: I1206 05:44:37.701405 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:37 crc kubenswrapper[4733]: I1206 05:44:37.701433 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:37 crc kubenswrapper[4733]: I1206 05:44:37.701441 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:37 crc kubenswrapper[4733]: I1206 05:44:37.701453 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:37 crc kubenswrapper[4733]: I1206 05:44:37.701460 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:37Z","lastTransitionTime":"2025-12-06T05:44:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:37 crc kubenswrapper[4733]: I1206 05:44:37.802857 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:37 crc kubenswrapper[4733]: I1206 05:44:37.802900 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:37 crc kubenswrapper[4733]: I1206 05:44:37.802911 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:37 crc kubenswrapper[4733]: I1206 05:44:37.802926 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:37 crc kubenswrapper[4733]: I1206 05:44:37.802935 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:37Z","lastTransitionTime":"2025-12-06T05:44:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:37 crc kubenswrapper[4733]: I1206 05:44:37.905192 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:37 crc kubenswrapper[4733]: I1206 05:44:37.905223 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:37 crc kubenswrapper[4733]: I1206 05:44:37.905231 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:37 crc kubenswrapper[4733]: I1206 05:44:37.905244 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:37 crc kubenswrapper[4733]: I1206 05:44:37.905253 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:37Z","lastTransitionTime":"2025-12-06T05:44:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.007355 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.007403 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.007412 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.007425 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.007435 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:38Z","lastTransitionTime":"2025-12-06T05:44:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.109573 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.109610 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.109620 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.109634 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.109643 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:38Z","lastTransitionTime":"2025-12-06T05:44:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.211509 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.211536 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.211544 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.211554 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.211561 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:38Z","lastTransitionTime":"2025-12-06T05:44:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.264891 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.264923 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.264931 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.264944 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.264954 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:38Z","lastTransitionTime":"2025-12-06T05:44:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:38 crc kubenswrapper[4733]: E1206 05:44:38.273715 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6951a1f4-5aff-463d-98ee-6da28494341b\\\",\\\"systemUUID\\\":\\\"4b0d62b0-e895-479e-b261-2bd12b349187\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:38Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.276539 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.276572 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.276580 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.276593 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.276602 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:38Z","lastTransitionTime":"2025-12-06T05:44:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:38 crc kubenswrapper[4733]: E1206 05:44:38.284639 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6951a1f4-5aff-463d-98ee-6da28494341b\\\",\\\"systemUUID\\\":\\\"4b0d62b0-e895-479e-b261-2bd12b349187\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:38Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.286947 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.286973 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.286981 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.286990 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.287000 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:38Z","lastTransitionTime":"2025-12-06T05:44:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:38 crc kubenswrapper[4733]: E1206 05:44:38.295694 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6951a1f4-5aff-463d-98ee-6da28494341b\\\",\\\"systemUUID\\\":\\\"4b0d62b0-e895-479e-b261-2bd12b349187\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:38Z is after 2025-08-24T17:21:41Z"
Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.298021 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.298049 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.298057 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.298070 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.298078 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:38Z","lastTransitionTime":"2025-12-06T05:44:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 06 05:44:38 crc kubenswrapper[4733]: E1206 05:44:38.305859 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6951a1f4-5aff-463d-98ee-6da28494341b\\\",\\\"systemUUID\\\":\\\"4b0d62b0-e895-479e-b261-2bd12b349187\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:38Z is after 2025-08-24T17:21:41Z"
Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.307819 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.307845 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.307854 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.307865 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.307872 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:38Z","lastTransitionTime":"2025-12-06T05:44:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 06 05:44:38 crc kubenswrapper[4733]: E1206 05:44:38.315681 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6951a1f4-5aff-463d-98ee-6da28494341b\\\",\\\"systemUUID\\\":\\\"4b0d62b0-e895-479e-b261-2bd12b349187\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:38Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:38 crc kubenswrapper[4733]: E1206 05:44:38.315800 4733 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.316928 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.316955 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.316963 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.316976 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.316986 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:38Z","lastTransitionTime":"2025-12-06T05:44:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.418953 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.418987 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.418996 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.419006 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.419014 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:38Z","lastTransitionTime":"2025-12-06T05:44:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.483966 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.484075 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8fw28" Dec 06 05:44:38 crc kubenswrapper[4733]: E1206 05:44:38.484167 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.484240 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:44:38 crc kubenswrapper[4733]: E1206 05:44:38.484335 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8fw28" podUID="7e8909c1-5ab7-4c3f-aba1-436c64849e8a" Dec 06 05:44:38 crc kubenswrapper[4733]: E1206 05:44:38.484401 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.484435 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 05:44:38 crc kubenswrapper[4733]: E1206 05:44:38.484708 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.520909 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.520941 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.520952 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.520966 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.520974 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:38Z","lastTransitionTime":"2025-12-06T05:44:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.623293 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.623346 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.623367 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.623382 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.623391 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:38Z","lastTransitionTime":"2025-12-06T05:44:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.725795 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.725844 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.725858 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.725883 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.725896 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:38Z","lastTransitionTime":"2025-12-06T05:44:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.828010 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.828054 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.828065 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.828079 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.828091 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:38Z","lastTransitionTime":"2025-12-06T05:44:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.929549 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.929575 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.929583 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.929596 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:38 crc kubenswrapper[4733]: I1206 05:44:38.929606 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:38Z","lastTransitionTime":"2025-12-06T05:44:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:39 crc kubenswrapper[4733]: I1206 05:44:39.031430 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:39 crc kubenswrapper[4733]: I1206 05:44:39.031479 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:39 crc kubenswrapper[4733]: I1206 05:44:39.031489 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:39 crc kubenswrapper[4733]: I1206 05:44:39.031507 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:39 crc kubenswrapper[4733]: I1206 05:44:39.031524 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:39Z","lastTransitionTime":"2025-12-06T05:44:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:39 crc kubenswrapper[4733]: I1206 05:44:39.134827 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:39 crc kubenswrapper[4733]: I1206 05:44:39.134888 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:39 crc kubenswrapper[4733]: I1206 05:44:39.134899 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:39 crc kubenswrapper[4733]: I1206 05:44:39.134911 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:39 crc kubenswrapper[4733]: I1206 05:44:39.134921 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:39Z","lastTransitionTime":"2025-12-06T05:44:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:39 crc kubenswrapper[4733]: I1206 05:44:39.237241 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:39 crc kubenswrapper[4733]: I1206 05:44:39.237283 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:39 crc kubenswrapper[4733]: I1206 05:44:39.237295 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:39 crc kubenswrapper[4733]: I1206 05:44:39.237368 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:39 crc kubenswrapper[4733]: I1206 05:44:39.237382 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:39Z","lastTransitionTime":"2025-12-06T05:44:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:39 crc kubenswrapper[4733]: I1206 05:44:39.339018 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:39 crc kubenswrapper[4733]: I1206 05:44:39.339065 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:39 crc kubenswrapper[4733]: I1206 05:44:39.339074 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:39 crc kubenswrapper[4733]: I1206 05:44:39.339088 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:39 crc kubenswrapper[4733]: I1206 05:44:39.339096 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:39Z","lastTransitionTime":"2025-12-06T05:44:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:39 crc kubenswrapper[4733]: I1206 05:44:39.441397 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:39 crc kubenswrapper[4733]: I1206 05:44:39.441431 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:39 crc kubenswrapper[4733]: I1206 05:44:39.441443 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:39 crc kubenswrapper[4733]: I1206 05:44:39.441458 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:39 crc kubenswrapper[4733]: I1206 05:44:39.441478 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:39Z","lastTransitionTime":"2025-12-06T05:44:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:39 crc kubenswrapper[4733]: I1206 05:44:39.544040 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:39 crc kubenswrapper[4733]: I1206 05:44:39.544090 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:39 crc kubenswrapper[4733]: I1206 05:44:39.544101 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:39 crc kubenswrapper[4733]: I1206 05:44:39.544126 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:39 crc kubenswrapper[4733]: I1206 05:44:39.544137 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:39Z","lastTransitionTime":"2025-12-06T05:44:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:39 crc kubenswrapper[4733]: I1206 05:44:39.647097 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:39 crc kubenswrapper[4733]: I1206 05:44:39.647135 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:39 crc kubenswrapper[4733]: I1206 05:44:39.647146 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:39 crc kubenswrapper[4733]: I1206 05:44:39.647158 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:39 crc kubenswrapper[4733]: I1206 05:44:39.647169 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:39Z","lastTransitionTime":"2025-12-06T05:44:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:39 crc kubenswrapper[4733]: I1206 05:44:39.749781 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:39 crc kubenswrapper[4733]: I1206 05:44:39.749818 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:39 crc kubenswrapper[4733]: I1206 05:44:39.749830 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:39 crc kubenswrapper[4733]: I1206 05:44:39.749849 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:39 crc kubenswrapper[4733]: I1206 05:44:39.749860 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:39Z","lastTransitionTime":"2025-12-06T05:44:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:39 crc kubenswrapper[4733]: I1206 05:44:39.851698 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:39 crc kubenswrapper[4733]: I1206 05:44:39.851726 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:39 crc kubenswrapper[4733]: I1206 05:44:39.851735 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:39 crc kubenswrapper[4733]: I1206 05:44:39.851748 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:39 crc kubenswrapper[4733]: I1206 05:44:39.851757 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:39Z","lastTransitionTime":"2025-12-06T05:44:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:39 crc kubenswrapper[4733]: I1206 05:44:39.953212 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:39 crc kubenswrapper[4733]: I1206 05:44:39.953252 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:39 crc kubenswrapper[4733]: I1206 05:44:39.953261 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:39 crc kubenswrapper[4733]: I1206 05:44:39.953276 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:39 crc kubenswrapper[4733]: I1206 05:44:39.953289 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:39Z","lastTransitionTime":"2025-12-06T05:44:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:40 crc kubenswrapper[4733]: I1206 05:44:40.054867 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:40 crc kubenswrapper[4733]: I1206 05:44:40.054889 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:40 crc kubenswrapper[4733]: I1206 05:44:40.054897 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:40 crc kubenswrapper[4733]: I1206 05:44:40.054907 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:40 crc kubenswrapper[4733]: I1206 05:44:40.054918 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:40Z","lastTransitionTime":"2025-12-06T05:44:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:40 crc kubenswrapper[4733]: I1206 05:44:40.156575 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:40 crc kubenswrapper[4733]: I1206 05:44:40.156607 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:40 crc kubenswrapper[4733]: I1206 05:44:40.156615 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:40 crc kubenswrapper[4733]: I1206 05:44:40.156626 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:40 crc kubenswrapper[4733]: I1206 05:44:40.156634 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:40Z","lastTransitionTime":"2025-12-06T05:44:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:40 crc kubenswrapper[4733]: I1206 05:44:40.258170 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:40 crc kubenswrapper[4733]: I1206 05:44:40.258203 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:40 crc kubenswrapper[4733]: I1206 05:44:40.258213 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:40 crc kubenswrapper[4733]: I1206 05:44:40.258227 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:40 crc kubenswrapper[4733]: I1206 05:44:40.258236 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:40Z","lastTransitionTime":"2025-12-06T05:44:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:40 crc kubenswrapper[4733]: I1206 05:44:40.359898 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:40 crc kubenswrapper[4733]: I1206 05:44:40.360006 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:40 crc kubenswrapper[4733]: I1206 05:44:40.360081 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:40 crc kubenswrapper[4733]: I1206 05:44:40.360163 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:40 crc kubenswrapper[4733]: I1206 05:44:40.360222 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:40Z","lastTransitionTime":"2025-12-06T05:44:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:40 crc kubenswrapper[4733]: I1206 05:44:40.461938 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:40 crc kubenswrapper[4733]: I1206 05:44:40.461970 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:40 crc kubenswrapper[4733]: I1206 05:44:40.461978 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:40 crc kubenswrapper[4733]: I1206 05:44:40.461992 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:40 crc kubenswrapper[4733]: I1206 05:44:40.462003 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:40Z","lastTransitionTime":"2025-12-06T05:44:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:40 crc kubenswrapper[4733]: I1206 05:44:40.483741 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 05:44:40 crc kubenswrapper[4733]: I1206 05:44:40.483811 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7e8909c1-5ab7-4c3f-aba1-436c64849e8a-metrics-certs\") pod \"network-metrics-daemon-8fw28\" (UID: \"7e8909c1-5ab7-4c3f-aba1-436c64849e8a\") " pod="openshift-multus/network-metrics-daemon-8fw28" Dec 06 05:44:40 crc kubenswrapper[4733]: E1206 05:44:40.483863 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 05:44:40 crc kubenswrapper[4733]: E1206 05:44:40.483914 4733 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 05:44:40 crc kubenswrapper[4733]: I1206 05:44:40.483930 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8fw28" Dec 06 05:44:40 crc kubenswrapper[4733]: E1206 05:44:40.483954 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e8909c1-5ab7-4c3f-aba1-436c64849e8a-metrics-certs podName:7e8909c1-5ab7-4c3f-aba1-436c64849e8a nodeName:}" failed. No retries permitted until 2025-12-06 05:45:12.483939213 +0000 UTC m=+96.349150324 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7e8909c1-5ab7-4c3f-aba1-436c64849e8a-metrics-certs") pod "network-metrics-daemon-8fw28" (UID: "7e8909c1-5ab7-4c3f-aba1-436c64849e8a") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 05:44:40 crc kubenswrapper[4733]: E1206 05:44:40.483986 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8fw28" podUID="7e8909c1-5ab7-4c3f-aba1-436c64849e8a" Dec 06 05:44:40 crc kubenswrapper[4733]: I1206 05:44:40.483995 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:44:40 crc kubenswrapper[4733]: E1206 05:44:40.484043 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 05:44:40 crc kubenswrapper[4733]: I1206 05:44:40.484139 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:44:40 crc kubenswrapper[4733]: E1206 05:44:40.484194 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 05:44:40 crc kubenswrapper[4733]: I1206 05:44:40.564260 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:40 crc kubenswrapper[4733]: I1206 05:44:40.564287 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:40 crc kubenswrapper[4733]: I1206 05:44:40.564294 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:40 crc kubenswrapper[4733]: I1206 05:44:40.564324 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:40 crc kubenswrapper[4733]: I1206 05:44:40.564334 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:40Z","lastTransitionTime":"2025-12-06T05:44:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:40 crc kubenswrapper[4733]: I1206 05:44:40.666754 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:40 crc kubenswrapper[4733]: I1206 05:44:40.666785 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:40 crc kubenswrapper[4733]: I1206 05:44:40.666793 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:40 crc kubenswrapper[4733]: I1206 05:44:40.666855 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:40 crc kubenswrapper[4733]: I1206 05:44:40.666866 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:40Z","lastTransitionTime":"2025-12-06T05:44:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:40 crc kubenswrapper[4733]: I1206 05:44:40.769938 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:40 crc kubenswrapper[4733]: I1206 05:44:40.769976 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:40 crc kubenswrapper[4733]: I1206 05:44:40.769986 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:40 crc kubenswrapper[4733]: I1206 05:44:40.770005 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:40 crc kubenswrapper[4733]: I1206 05:44:40.770017 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:40Z","lastTransitionTime":"2025-12-06T05:44:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:40 crc kubenswrapper[4733]: I1206 05:44:40.871681 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:40 crc kubenswrapper[4733]: I1206 05:44:40.871709 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:40 crc kubenswrapper[4733]: I1206 05:44:40.871718 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:40 crc kubenswrapper[4733]: I1206 05:44:40.871729 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:40 crc kubenswrapper[4733]: I1206 05:44:40.871737 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:40Z","lastTransitionTime":"2025-12-06T05:44:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:40 crc kubenswrapper[4733]: I1206 05:44:40.974511 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:40 crc kubenswrapper[4733]: I1206 05:44:40.974546 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:40 crc kubenswrapper[4733]: I1206 05:44:40.974555 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:40 crc kubenswrapper[4733]: I1206 05:44:40.974569 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:40 crc kubenswrapper[4733]: I1206 05:44:40.974578 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:40Z","lastTransitionTime":"2025-12-06T05:44:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.076209 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.076257 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.076268 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.076294 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.076474 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:41Z","lastTransitionTime":"2025-12-06T05:44:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.178715 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.178749 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.178759 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.178771 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.178782 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:41Z","lastTransitionTime":"2025-12-06T05:44:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.280919 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.280996 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.281013 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.281035 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.281051 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:41Z","lastTransitionTime":"2025-12-06T05:44:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.383209 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.383240 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.383251 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.383267 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.383276 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:41Z","lastTransitionTime":"2025-12-06T05:44:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.485843 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.485873 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.485888 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.485899 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.485908 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:41Z","lastTransitionTime":"2025-12-06T05:44:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.588144 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.588189 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.588201 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.588215 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.588232 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:41Z","lastTransitionTime":"2025-12-06T05:44:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.689955 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.689986 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.689994 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.690007 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.690016 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:41Z","lastTransitionTime":"2025-12-06T05:44:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.791264 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.791299 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.791329 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.791385 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.791396 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:41Z","lastTransitionTime":"2025-12-06T05:44:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.791918 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-684r5_cc59542d-ee4a-414d-b096-86716cb56db5/kube-multus/0.log" Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.791962 4733 generic.go:334] "Generic (PLEG): container finished" podID="cc59542d-ee4a-414d-b096-86716cb56db5" containerID="d7128ab1b2f48b8ce3ecf3a2154cb1b1dc93a58cdfed2c11e7724201a5675ea3" exitCode=1 Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.791986 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-684r5" event={"ID":"cc59542d-ee4a-414d-b096-86716cb56db5","Type":"ContainerDied","Data":"d7128ab1b2f48b8ce3ecf3a2154cb1b1dc93a58cdfed2c11e7724201a5675ea3"} Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.792264 4733 scope.go:117] "RemoveContainer" containerID="d7128ab1b2f48b8ce3ecf3a2154cb1b1dc93a58cdfed2c11e7724201a5675ea3" Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.818471 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fedfcd9f-fa28-4efb-9677-e24a6dae9c04\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7a98fd30a5052ebe2872dd5e1c7f44e9ed9019ad8662a687a9a9a39acce3627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e214c308f89a818305483c9dc2980b09c41c963bd5df5c91d56a1f8e47dd8ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1a658a854294c1c7b43ab8c1bd56969065a6c630a68b2c39366fd243ebd7af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7acc4267cfa0a489d59bdc4c37f12356e6a053e6cd477a87a38816bf71539ce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://7acc4267cfa0a489d59bdc4c37f12356e6a053e6cd477a87a38816bf71539ce1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:41Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.830851 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8e8d7d591deb47598776511be462724fabc5543e82b6a74edfc29fb01ccb977\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:41Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.842829 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77d63bf154094eece4d04d42186bc7f957f0b1ab0315c496bb8a785269184ed0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://cdccf2a58baf2a39276908ed60c86219657d8780a50630c20be6f8bc4c256fbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:41Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.854016 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faa17b3f3dd91488b73e0e7f3101c5e9932dd0c1573946bbd91819f1ec51202e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T05:44:41Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.862374 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqsfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25abcf60-fe34-446b-9df8-1ed8e5102975\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://163c90ba7e6470fb31049cd650d1384d35d87b94a9193184bfe3ea16feddf307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb5ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqsfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:41Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.869974 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnxdh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d5c4ca7-33ee-4858-948f-631753eb056e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f4a50e7cb4197e088c193a3bedc8acb2720a885e588e56051fbfa1e102099e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrbr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnxdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:41Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.878717 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8fw28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e8909c1-5ab7-4c3f-aba1-436c64849e8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47p7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47p7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:44:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8fw28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:41Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:41 crc 
kubenswrapper[4733]: I1206 05:44:41.891492 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0700e329-54b6-4cfe-b2de-5cee58cf1aa5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32c4d87738481c8df3d76e820a98f3dacfbc11edc26fab1dfe51b56d207168d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57cbb938bc4ae9
b8a71a1e2369a50a243964fc8c683d2d1840f1f3e199f1b923\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eeebbb46cf11d2306ad457106c3b2179039986bfdd412c4bb64791d86edb4e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://801ea1b9ed221d20f0d729436b8f5f1946df6e66f06aa86db5764f18da3f0b1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://fe65f4b55b8e8ed93d424276f1fc06f31770302538e5122a5b09da36734d86dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 05:43:48.722254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 05:43:48.730728 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3849141372/tls.crt::/tmp/serving-cert-3849141372/tls.key\\\\\\\"\\\\nI1206 05:43:54.083506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 05:43:54.085960 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 05:43:54.085979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 05:43:54.086001 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 05:43:54.086006 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 05:43:54.089093 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 05:43:54.089162 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089190 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089211 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 05:43:54.089229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 05:43:54.089245 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 
05:43:54.089261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 05:43:54.089103 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 05:43:54.090706 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8edc1fd8220a58b6a3f6d08d6d003c6d350fa69588866d84de63f95ecd4367f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:41Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.893259 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.893284 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.893295 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.893325 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.893337 4733 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:41Z","lastTransitionTime":"2025-12-06T05:44:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.901647 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:41Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.915031 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77ebef5bd728c37a6b74ab523c480048959280fdfc9afd8c60b2aca9cd05336d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61a23652af66be599ba9357cb31709e7b4a3f0e4
767c758617e6cc5cd9b43941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g7qjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:41Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.927992 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"171aa174-9338-4421-8393-9e23fbab7f1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a697c5a28f2c415b6f133c1c3bdaff0915418e3fcf0c889af0a822e1bdcbcc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://532faf6ec4021a35746a236a1ded78eccc9d71728c149f73c4263068b6951490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://456b5bd863b30c044246c6c8fe15ee7344ad053861724b5c42b88479578b9adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77216800c2b9bc04724591a5d5c5d4c9ddb9a75fcbc198c60800199a92db6f45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d985f342be7dff38ee8a2264a8dae534857e6cb0e7d0cf79b137d2ed6289bf80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88a99335c4d7fca93428173f7e0e096e418e0599ab030dfda10d8da0a5dc17a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ed6e23b5df7eda4e1271f8c8ff0b9202270a73a4aa074b3625fcbc0114470c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ed6e23b5df7eda4e1271f8c8ff0b9202270a73a4aa074b3625fcbc0114470c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T05:44:16Z\\\",\\\"message\\\":\\\"try/node-ca-pqsfd\\\\nI1206 05:44:16.869839 6419 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1206 05:44:16.869841 6419 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-pqsfd\\\\nI1206 05:44:16.869707 6419 services_controller.go:434] Service 
openshift-machine-config-operator/machine-config-controller retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{machine-config-controller openshift-machine-config-operator aa30290d-3a39-43ba-a212-6439bd680987 4486 0 2025-02-23 05:12:25 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:machine-config-controller] map[include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:mcc-proxy-tls service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0075d864b \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:metrics,Protocol:TCP,Port:9001,TargetPort:{0 9001 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{k8s-app: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:44:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2gb79_openshift-ovn-kubernetes(171aa174-9338-4421-8393-9e23fbab7f1e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9980ec9b2b1a751a691d1f657a2176d49a7583906d741adbe3754ec4c73b152c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://667a77abe2226e4438
47d05fc2475438095a30648777a424239d3f35c199b236\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2gb79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:41Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.936850 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:41Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.946023 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-684r5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc59542d-ee4a-414d-b096-86716cb56db5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7128ab1b2f48b8ce3ecf3a2154cb1b1dc93a58cdfed2c11e7724201a5675ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7128ab1b2f48b8ce3ecf3a2154cb1b1dc93a58cdfed2c11e7724201a5675ea3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T05:44:41Z\\\",\\\"message\\\":\\\"2025-12-06T05:43:56+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_16cfcfaf-0a8d-404d-bd9a-d650725684e5\\\\n2025-12-06T05:43:56+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_16cfcfaf-0a8d-404d-bd9a-d650725684e5 to /host/opt/cni/bin/\\\\n2025-12-06T05:43:56Z [verbose] multus-daemon started\\\\n2025-12-06T05:43:56Z [verbose] Readiness Indicator file check\\\\n2025-12-06T05:44:41Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbfjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-684r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:41Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.955530 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5mf9m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d7ccbf-e88d-4045-8d89-633470de7aca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dffbae27a10ae2e00933637da0e30fc5b8574f2ee8edb5b4b09c37a2d05e980a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2609f7ad60b4f90d844d4f4d8573587826cbdf4c0b76f6b8a1b5cddec86ad7d0\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2609f7ad60b4f90d844d4f4d8573587826cbdf4c0b76f6b8a1b5cddec86ad7d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ef7c618da4d94a4956f082f96b9be994042458ff524e9e1172f526a4135e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93ef7c618da4d94a4956f082f96b9be994042458ff524e9e1172f526a4135e1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c91a8199b1f8ede480f2bd92335fe3c8dc0d0e11caa2cf3bd213c234d0779f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c91a8199b1f8ede480f2bd92335fe3c8dc0d0e11caa2cf3bd213c234d0779f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://047dc
4e7f8f30d1f9cf824ee4059c99c07cd9f29bd985e0e00ac22febb297f1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://047dc4e7f8f30d1f9cf824ee4059c99c07cd9f29bd985e0e00ac22febb297f1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cadff80e27f4e0103110c153c52936b931bfd70ca4363a3caa44ec4f746d01dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadff80e27f4e0103110c153c52936b931bfd70ca4363a3caa44ec4f746d01dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:59Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e4034c91b0b19898468eccdc22e059ad7e830ef9e4ff0bea88d447f6a09c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16e4034c91b0b19898468eccdc22e059ad7e830ef9e4ff0bea88d447f6a09c64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5mf9m\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:41Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.962919 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q2ktk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e24a9e84-0151-4204-9391-510da9049b58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aae69842996fcf4d62a14e1cc73b68f2326287d0fa75d4587acb47862b1d40bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh6vd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a1c3268a5ca5c4c35865c8ff8f700686db8f5c2889152aabe27a36b1ccd9082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh6vd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:44:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q2ktk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-06T05:44:41Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.971134 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c145932d-56db-49da-ab40-1f9faeaa004e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a89503b511d9f2da9fb5e41e1adb5f5c60e14909aebd4495baafc709177fa56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\
"cri-o://6cd2bcad3ce23a8998a578ecc373a4e8028eefab1e056cf1081eb2406ff9398f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382d71a067b68d67891c063f0a4c833b7433e15db0e05b36e46f24bbbb1626ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b838411bb65919138a421cd17775561b7764a006894daa8f2bed711287c1914\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager
-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:41Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.978633 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:41Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.995079 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 
05:44:41.995106 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.995114 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.995126 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:41 crc kubenswrapper[4733]: I1206 05:44:41.995137 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:41Z","lastTransitionTime":"2025-12-06T05:44:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:42 crc kubenswrapper[4733]: I1206 05:44:42.097239 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:42 crc kubenswrapper[4733]: I1206 05:44:42.097274 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:42 crc kubenswrapper[4733]: I1206 05:44:42.097285 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:42 crc kubenswrapper[4733]: I1206 05:44:42.097322 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:42 crc kubenswrapper[4733]: I1206 05:44:42.097336 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:42Z","lastTransitionTime":"2025-12-06T05:44:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:42 crc kubenswrapper[4733]: I1206 05:44:42.199416 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:42 crc kubenswrapper[4733]: I1206 05:44:42.199491 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:42 crc kubenswrapper[4733]: I1206 05:44:42.199503 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:42 crc kubenswrapper[4733]: I1206 05:44:42.199525 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:42 crc kubenswrapper[4733]: I1206 05:44:42.199540 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:42Z","lastTransitionTime":"2025-12-06T05:44:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:42 crc kubenswrapper[4733]: I1206 05:44:42.302066 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:42 crc kubenswrapper[4733]: I1206 05:44:42.302113 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:42 crc kubenswrapper[4733]: I1206 05:44:42.302124 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:42 crc kubenswrapper[4733]: I1206 05:44:42.302145 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:42 crc kubenswrapper[4733]: I1206 05:44:42.302161 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:42Z","lastTransitionTime":"2025-12-06T05:44:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:42 crc kubenswrapper[4733]: I1206 05:44:42.404839 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:42 crc kubenswrapper[4733]: I1206 05:44:42.404884 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:42 crc kubenswrapper[4733]: I1206 05:44:42.404894 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:42 crc kubenswrapper[4733]: I1206 05:44:42.404914 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:42 crc kubenswrapper[4733]: I1206 05:44:42.404926 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:42Z","lastTransitionTime":"2025-12-06T05:44:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:42 crc kubenswrapper[4733]: I1206 05:44:42.484759 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 05:44:42 crc kubenswrapper[4733]: I1206 05:44:42.484786 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:44:42 crc kubenswrapper[4733]: I1206 05:44:42.484915 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:44:42 crc kubenswrapper[4733]: E1206 05:44:42.485041 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 05:44:42 crc kubenswrapper[4733]: I1206 05:44:42.485197 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8fw28" Dec 06 05:44:42 crc kubenswrapper[4733]: E1206 05:44:42.485272 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 05:44:42 crc kubenswrapper[4733]: E1206 05:44:42.485432 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8fw28" podUID="7e8909c1-5ab7-4c3f-aba1-436c64849e8a" Dec 06 05:44:42 crc kubenswrapper[4733]: E1206 05:44:42.485525 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 05:44:42 crc kubenswrapper[4733]: I1206 05:44:42.506894 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:42 crc kubenswrapper[4733]: I1206 05:44:42.506954 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:42 crc kubenswrapper[4733]: I1206 05:44:42.506967 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:42 crc kubenswrapper[4733]: I1206 05:44:42.506981 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:42 crc kubenswrapper[4733]: I1206 05:44:42.506994 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:42Z","lastTransitionTime":"2025-12-06T05:44:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:42 crc kubenswrapper[4733]: I1206 05:44:42.608774 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:42 crc kubenswrapper[4733]: I1206 05:44:42.608815 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:42 crc kubenswrapper[4733]: I1206 05:44:42.608824 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:42 crc kubenswrapper[4733]: I1206 05:44:42.608841 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:42 crc kubenswrapper[4733]: I1206 05:44:42.608853 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:42Z","lastTransitionTime":"2025-12-06T05:44:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:42 crc kubenswrapper[4733]: I1206 05:44:42.710706 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:42 crc kubenswrapper[4733]: I1206 05:44:42.710745 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:42 crc kubenswrapper[4733]: I1206 05:44:42.710755 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:42 crc kubenswrapper[4733]: I1206 05:44:42.710766 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:42 crc kubenswrapper[4733]: I1206 05:44:42.710775 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:42Z","lastTransitionTime":"2025-12-06T05:44:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:42 crc kubenswrapper[4733]: I1206 05:44:42.796610 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-684r5_cc59542d-ee4a-414d-b096-86716cb56db5/kube-multus/0.log" Dec 06 05:44:42 crc kubenswrapper[4733]: I1206 05:44:42.796701 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-684r5" event={"ID":"cc59542d-ee4a-414d-b096-86716cb56db5","Type":"ContainerStarted","Data":"238d1b3c645ca54e851f02ddb12c90bfcd039e6973993a7693cc9520d5268496"} Dec 06 05:44:42 crc kubenswrapper[4733]: I1206 05:44:42.807451 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c145932d-56db-49da-ab40-1f9faeaa004e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a89503b511d9f2da9fb5e41e1adb5f5c60e14909aebd4495baafc709177fa56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd2bcad3ce23a8998a578ecc373a4e8028eefab1e056cf1081eb2406ff9398f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382d71a067b68d67891c063f0a4c833b7433e15db0e05b36e46f24bbbb1626ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/
static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b838411bb65919138a421cd17775561b7764a006894daa8f2bed711287c1914\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:42Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:42 crc kubenswrapper[4733]: I1206 05:44:42.812571 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:42 crc kubenswrapper[4733]: I1206 05:44:42.812684 4733 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:42 crc kubenswrapper[4733]: I1206 05:44:42.812751 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:42 crc kubenswrapper[4733]: I1206 05:44:42.812815 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:42 crc kubenswrapper[4733]: I1206 05:44:42.812878 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:42Z","lastTransitionTime":"2025-12-06T05:44:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:42 crc kubenswrapper[4733]: I1206 05:44:42.819358 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:42Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:42 crc kubenswrapper[4733]: I1206 05:44:42.829125 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5mf9m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d7ccbf-e88d-4045-8d89-633470de7aca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dffbae27a10ae2e00933637da0e30fc5b8574f2ee8edb5b4b09c37a2d05e980a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2609f7ad60b4f90d844d4f4d8573587826cbdf4c0b76f6b8a1b5cddec86ad7d0\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2609f7ad60b4f90d844d4f4d8573587826cbdf4c0b76f6b8a1b5cddec86ad7d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ef7c618da4d94a4956f082f96b9be994042458ff524e9e1172f526a4135e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93ef7c618da4d94a4956f082f96b9be994042458ff524e9e1172f526a4135e1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c91a8199b1f8ede480f2bd92335fe3c8dc0d0e11caa2cf3bd213c234d0779f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c91a8199b1f8ede480f2bd92335fe3c8dc0d0e11caa2cf3bd213c234d0779f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://047dc
4e7f8f30d1f9cf824ee4059c99c07cd9f29bd985e0e00ac22febb297f1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://047dc4e7f8f30d1f9cf824ee4059c99c07cd9f29bd985e0e00ac22febb297f1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cadff80e27f4e0103110c153c52936b931bfd70ca4363a3caa44ec4f746d01dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadff80e27f4e0103110c153c52936b931bfd70ca4363a3caa44ec4f746d01dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:59Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e4034c91b0b19898468eccdc22e059ad7e830ef9e4ff0bea88d447f6a09c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16e4034c91b0b19898468eccdc22e059ad7e830ef9e4ff0bea88d447f6a09c64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5mf9m\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:42Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:42 crc kubenswrapper[4733]: I1206 05:44:42.837283 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q2ktk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e24a9e84-0151-4204-9391-510da9049b58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aae69842996fcf4d62a14e1cc73b68f2326287d0fa75d4587acb47862b1d40bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh6vd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a1c3268a5ca5c4c35865c8ff8f700686db8f5c2889152aabe27a36b1ccd9082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh6vd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:44:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q2ktk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-06T05:44:42Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:42 crc kubenswrapper[4733]: I1206 05:44:42.846385 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fedfcd9f-fa28-4efb-9677-e24a6dae9c04\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7a98fd30a5052ebe2872dd5e1c7f44e9ed9019ad8662a687a9a9a39acce3627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e214c308f
89a818305483c9dc2980b09c41c963bd5df5c91d56a1f8e47dd8ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1a658a854294c1c7b43ab8c1bd56969065a6c630a68b2c39366fd243ebd7af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7acc4267cfa0a489d59bdc4c37f12356e6a053e6cd477a87a38816bf71539ce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7acc4267cfa0a489d59bdc4c37f12356e6a053e6cd477a87a38816bf71539ce1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:42Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:42 crc kubenswrapper[4733]: I1206 05:44:42.856134 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8e8d7d591deb47598776511be462724fabc5543e82b6a74edfc29fb01ccb977\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:42Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:42 crc kubenswrapper[4733]: I1206 05:44:42.866710 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77d63bf154094eece4d04d42186bc7f957f0b1ab0315c496bb8a785269184ed0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://cdccf2a58baf2a39276908ed60c86219657d8780a50630c20be6f8bc4c256fbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:42Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:42 crc kubenswrapper[4733]: I1206 05:44:42.877177 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faa17b3f3dd91488b73e0e7f3101c5e9932dd0c1573946bbd91819f1ec51202e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T05:44:42Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:42 crc kubenswrapper[4733]: I1206 05:44:42.886168 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqsfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25abcf60-fe34-446b-9df8-1ed8e5102975\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://163c90ba7e6470fb31049cd650d1384d35d87b94a9193184bfe3ea16feddf307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb5ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqsfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:42Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:42 crc kubenswrapper[4733]: I1206 05:44:42.894564 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnxdh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d5c4ca7-33ee-4858-948f-631753eb056e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f4a50e7cb4197e088c193a3bedc8acb2720a885e588e56051fbfa1e102099e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrbr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnxdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:42Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:42 crc kubenswrapper[4733]: I1206 05:44:42.903293 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8fw28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e8909c1-5ab7-4c3f-aba1-436c64849e8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47p7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47p7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:44:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8fw28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:42Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:42 crc 
kubenswrapper[4733]: I1206 05:44:42.913966 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0700e329-54b6-4cfe-b2de-5cee58cf1aa5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32c4d87738481c8df3d76e820a98f3dacfbc11edc26fab1dfe51b56d207168d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57cbb938bc4ae9
b8a71a1e2369a50a243964fc8c683d2d1840f1f3e199f1b923\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eeebbb46cf11d2306ad457106c3b2179039986bfdd412c4bb64791d86edb4e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://801ea1b9ed221d20f0d729436b8f5f1946df6e66f06aa86db5764f18da3f0b1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://fe65f4b55b8e8ed93d424276f1fc06f31770302538e5122a5b09da36734d86dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 05:43:48.722254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 05:43:48.730728 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3849141372/tls.crt::/tmp/serving-cert-3849141372/tls.key\\\\\\\"\\\\nI1206 05:43:54.083506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 05:43:54.085960 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 05:43:54.085979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 05:43:54.086001 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 05:43:54.086006 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 05:43:54.089093 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 05:43:54.089162 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089190 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089211 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 05:43:54.089229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 05:43:54.089245 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 
05:43:54.089261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 05:43:54.089103 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 05:43:54.090706 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8edc1fd8220a58b6a3f6d08d6d003c6d350fa69588866d84de63f95ecd4367f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:42Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:42 crc kubenswrapper[4733]: I1206 05:44:42.915466 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:42 crc kubenswrapper[4733]: I1206 05:44:42.915515 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:42 crc kubenswrapper[4733]: I1206 05:44:42.915524 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:42 crc kubenswrapper[4733]: I1206 05:44:42.915542 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:42 crc kubenswrapper[4733]: I1206 05:44:42.915561 4733 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:42Z","lastTransitionTime":"2025-12-06T05:44:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:42 crc kubenswrapper[4733]: I1206 05:44:42.925281 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:42Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:42 crc kubenswrapper[4733]: I1206 05:44:42.934738 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:42Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:42 crc kubenswrapper[4733]: I1206 05:44:42.944971 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-684r5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc59542d-ee4a-414d-b096-86716cb56db5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://238d1b3c645ca54e851f02ddb12c90bfcd039e6973993a7693cc9520d5268496\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7128ab1b2f48b8ce3ecf3a2154cb1b1dc93a58cdfed2c11e7724201a5675ea3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T05:44:41Z\\\",\\\"message\\\":\\\"2025-12-06T05:43:56+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_16cfcfaf-0a8d-404d-bd9a-d650725684e5\\\\n2025-12-06T05:43:56+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_16cfcfaf-0a8d-404d-bd9a-d650725684e5 to /host/opt/cni/bin/\\\\n2025-12-06T05:43:56Z [verbose] multus-daemon started\\\\n2025-12-06T05:43:56Z [verbose] 
Readiness Indicator file check\\\\n2025-12-06T05:44:41Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbfjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-684r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:42Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:42 crc kubenswrapper[4733]: I1206 05:44:42.954016 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77ebef5bd728c37a6b74ab523c480048959280fdfc9afd8c60b2aca9cd05336d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61a23652af66be599ba9357cb31709e7b4a3f0e4
767c758617e6cc5cd9b43941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g7qjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:42Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:42 crc kubenswrapper[4733]: I1206 05:44:42.971220 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"171aa174-9338-4421-8393-9e23fbab7f1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a697c5a28f2c415b6f133c1c3bdaff0915418e3fcf0c889af0a822e1bdcbcc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://532faf6ec4021a35746a236a1ded78eccc9d71728c149f73c4263068b6951490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://456b5bd863b30c044246c6c8fe15ee7344ad053861724b5c42b88479578b9adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77216800c2b9bc04724591a5d5c5d4c9ddb9a75fcbc198c60800199a92db6f45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d985f342be7dff38ee8a2264a8dae534857e6cb0e7d0cf79b137d2ed6289bf80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88a99335c4d7fca93428173f7e0e096e418e0599ab030dfda10d8da0a5dc17a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ed6e23b5df7eda4e1271f8c8ff0b9202270a73a4aa074b3625fcbc0114470c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ed6e23b5df7eda4e1271f8c8ff0b9202270a73a4aa074b3625fcbc0114470c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T05:44:16Z\\\",\\\"message\\\":\\\"try/node-ca-pqsfd\\\\nI1206 05:44:16.869839 6419 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1206 05:44:16.869841 6419 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-pqsfd\\\\nI1206 05:44:16.869707 6419 services_controller.go:434] Service 
openshift-machine-config-operator/machine-config-controller retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{machine-config-controller openshift-machine-config-operator aa30290d-3a39-43ba-a212-6439bd680987 4486 0 2025-02-23 05:12:25 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:machine-config-controller] map[include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:mcc-proxy-tls service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0075d864b \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:metrics,Protocol:TCP,Port:9001,TargetPort:{0 9001 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{k8s-app: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:44:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2gb79_openshift-ovn-kubernetes(171aa174-9338-4421-8393-9e23fbab7f1e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9980ec9b2b1a751a691d1f657a2176d49a7583906d741adbe3754ec4c73b152c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://667a77abe2226e4438
47d05fc2475438095a30648777a424239d3f35c199b236\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2gb79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:42Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:43 crc kubenswrapper[4733]: I1206 05:44:43.018206 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:43 crc kubenswrapper[4733]: I1206 05:44:43.018258 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:43 crc kubenswrapper[4733]: I1206 05:44:43.018271 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:43 crc kubenswrapper[4733]: I1206 05:44:43.018296 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:43 crc kubenswrapper[4733]: I1206 05:44:43.018337 4733 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:43Z","lastTransitionTime":"2025-12-06T05:44:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:43 crc kubenswrapper[4733]: I1206 05:44:43.119913 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:43 crc kubenswrapper[4733]: I1206 05:44:43.119952 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:43 crc kubenswrapper[4733]: I1206 05:44:43.119961 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:43 crc kubenswrapper[4733]: I1206 05:44:43.119979 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:43 crc kubenswrapper[4733]: I1206 05:44:43.119990 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:43Z","lastTransitionTime":"2025-12-06T05:44:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:43 crc kubenswrapper[4733]: I1206 05:44:43.221583 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:43 crc kubenswrapper[4733]: I1206 05:44:43.221624 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:43 crc kubenswrapper[4733]: I1206 05:44:43.221633 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:43 crc kubenswrapper[4733]: I1206 05:44:43.221646 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:43 crc kubenswrapper[4733]: I1206 05:44:43.221658 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:43Z","lastTransitionTime":"2025-12-06T05:44:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:43 crc kubenswrapper[4733]: I1206 05:44:43.323587 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:43 crc kubenswrapper[4733]: I1206 05:44:43.323646 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:43 crc kubenswrapper[4733]: I1206 05:44:43.323656 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:43 crc kubenswrapper[4733]: I1206 05:44:43.323673 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:43 crc kubenswrapper[4733]: I1206 05:44:43.323689 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:43Z","lastTransitionTime":"2025-12-06T05:44:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:43 crc kubenswrapper[4733]: I1206 05:44:43.425949 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:43 crc kubenswrapper[4733]: I1206 05:44:43.426003 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:43 crc kubenswrapper[4733]: I1206 05:44:43.426016 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:43 crc kubenswrapper[4733]: I1206 05:44:43.426037 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:43 crc kubenswrapper[4733]: I1206 05:44:43.426050 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:43Z","lastTransitionTime":"2025-12-06T05:44:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:43 crc kubenswrapper[4733]: I1206 05:44:43.528556 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:43 crc kubenswrapper[4733]: I1206 05:44:43.528742 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:43 crc kubenswrapper[4733]: I1206 05:44:43.528818 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:43 crc kubenswrapper[4733]: I1206 05:44:43.528910 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:43 crc kubenswrapper[4733]: I1206 05:44:43.528986 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:43Z","lastTransitionTime":"2025-12-06T05:44:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:43 crc kubenswrapper[4733]: I1206 05:44:43.631470 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:43 crc kubenswrapper[4733]: I1206 05:44:43.631523 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:43 crc kubenswrapper[4733]: I1206 05:44:43.631533 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:43 crc kubenswrapper[4733]: I1206 05:44:43.631552 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:43 crc kubenswrapper[4733]: I1206 05:44:43.631566 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:43Z","lastTransitionTime":"2025-12-06T05:44:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:43 crc kubenswrapper[4733]: I1206 05:44:43.733729 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:43 crc kubenswrapper[4733]: I1206 05:44:43.733785 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:43 crc kubenswrapper[4733]: I1206 05:44:43.733799 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:43 crc kubenswrapper[4733]: I1206 05:44:43.733820 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:43 crc kubenswrapper[4733]: I1206 05:44:43.733832 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:43Z","lastTransitionTime":"2025-12-06T05:44:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:43 crc kubenswrapper[4733]: I1206 05:44:43.835537 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:43 crc kubenswrapper[4733]: I1206 05:44:43.835574 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:43 crc kubenswrapper[4733]: I1206 05:44:43.835585 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:43 crc kubenswrapper[4733]: I1206 05:44:43.835596 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:43 crc kubenswrapper[4733]: I1206 05:44:43.835608 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:43Z","lastTransitionTime":"2025-12-06T05:44:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:43 crc kubenswrapper[4733]: I1206 05:44:43.938414 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:43 crc kubenswrapper[4733]: I1206 05:44:43.938662 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:43 crc kubenswrapper[4733]: I1206 05:44:43.938674 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:43 crc kubenswrapper[4733]: I1206 05:44:43.938690 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:43 crc kubenswrapper[4733]: I1206 05:44:43.938702 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:43Z","lastTransitionTime":"2025-12-06T05:44:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.040971 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.041089 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.041154 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.041216 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.041290 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:44Z","lastTransitionTime":"2025-12-06T05:44:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.142843 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.143439 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.143467 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.143493 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.143506 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:44Z","lastTransitionTime":"2025-12-06T05:44:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.245841 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.245879 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.245890 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.245909 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.245920 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:44Z","lastTransitionTime":"2025-12-06T05:44:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.347873 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.347907 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.347915 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.347932 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.347944 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:44Z","lastTransitionTime":"2025-12-06T05:44:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.449625 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.449665 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.449678 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.449694 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.449703 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:44Z","lastTransitionTime":"2025-12-06T05:44:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.484380 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.484456 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:44:44 crc kubenswrapper[4733]: E1206 05:44:44.484508 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.484577 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8fw28" Dec 06 05:44:44 crc kubenswrapper[4733]: E1206 05:44:44.484687 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.484716 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 05:44:44 crc kubenswrapper[4733]: E1206 05:44:44.485002 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 05:44:44 crc kubenswrapper[4733]: E1206 05:44:44.485079 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8fw28" podUID="7e8909c1-5ab7-4c3f-aba1-436c64849e8a" Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.485519 4733 scope.go:117] "RemoveContainer" containerID="19ed6e23b5df7eda4e1271f8c8ff0b9202270a73a4aa074b3625fcbc0114470c" Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.552298 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.552600 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.552615 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.552630 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.552639 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:44Z","lastTransitionTime":"2025-12-06T05:44:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.655099 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.655134 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.655145 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.655165 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.655180 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:44Z","lastTransitionTime":"2025-12-06T05:44:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.756995 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.757033 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.757042 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.757055 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.757064 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:44Z","lastTransitionTime":"2025-12-06T05:44:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.807479 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2gb79_171aa174-9338-4421-8393-9e23fbab7f1e/ovnkube-controller/2.log" Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.810717 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" event={"ID":"171aa174-9338-4421-8393-9e23fbab7f1e","Type":"ContainerStarted","Data":"5d6353ff5837029f85cdae65e1200483030eeb8cb05c63bd255a459d79a91ef0"} Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.811248 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.834724 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0700e329-54b6-4cfe-b2de-5cee58cf1aa5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32c4d87738481c8df3d76e820a98f3dacfbc11edc26fab1dfe51b56d207168d2\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57cbb938bc4ae9b8a71a1e2369a50a243964fc8c683d2d1840f1f3e199f1b923\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eeebbb46cf11d2306ad457106c3b2179039986bfdd412c4bb64791d86edb4e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://801ea1b9ed221d20f0d729436b8f5f1946df6e66f06aa86db5764f18da3f0b1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe65f4b55b8e8ed93d424276f1fc06f31770302538e5122a5b09da36734d86dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 05:43:48.722254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 05:43:48.730728 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3849141372/tls.crt::/tmp/serving-cert-3849141372/tls.key\\\\\\\"\\\\nI1206 05:43:54.083506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 05:43:54.085960 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 05:43:54.085979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 05:43:54.086001 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 
05:43:54.086006 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 05:43:54.089093 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 05:43:54.089162 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089190 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089211 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 05:43:54.089229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 05:43:54.089245 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 05:43:54.089261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 05:43:54.089103 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 05:43:54.090706 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8edc1fd8220a58b6a3f6d08d6d003c6d350fa69588866d84de63f95ecd4367f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:44Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.858269 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:44Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.858725 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.858755 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.858765 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:44 crc 
kubenswrapper[4733]: I1206 05:44:44.858779 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.858789 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:44Z","lastTransitionTime":"2025-12-06T05:44:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.870109 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:44Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.879408 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-684r5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc59542d-ee4a-414d-b096-86716cb56db5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://238d1b3c645ca54e851f02ddb12c90bfcd039e6973993a7693cc9520d5268496\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7128ab1b2f48b8ce3ecf3a2154cb1b1dc93a58cdfed2c11e7724201a5675ea3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T05:44:41Z\\\",\\\"message\\\":\\\"2025-12-06T05:43:56+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_16cfcfaf-0a8d-404d-bd9a-d650725684e5\\\\n2025-12-06T05:43:56+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_16cfcfaf-0a8d-404d-bd9a-d650725684e5 to /host/opt/cni/bin/\\\\n2025-12-06T05:43:56Z [verbose] multus-daemon started\\\\n2025-12-06T05:43:56Z [verbose] 
Readiness Indicator file check\\\\n2025-12-06T05:44:41Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbfjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-684r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:44Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.887957 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77ebef5bd728c37a6b74ab523c480048959280fdfc9afd8c60b2aca9cd05336d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61a23652af66be599ba9357cb31709e7b4a3f0e4
767c758617e6cc5cd9b43941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g7qjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:44Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.900785 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"171aa174-9338-4421-8393-9e23fbab7f1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a697c5a28f2c415b6f133c1c3bdaff0915418e3fcf0c889af0a822e1bdcbcc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://532faf6ec4021a35746a236a1ded78eccc9d71728c149f73c4263068b6951490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://456b5bd863b30c044246c6c8fe15ee7344ad053861724b5c42b88479578b9adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77216800c2b9bc04724591a5d5c5d4c9ddb9a75fcbc198c60800199a92db6f45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d985f342be7dff38ee8a2264a8dae534857e6cb0e7d0cf79b137d2ed6289bf80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88a99335c4d7fca93428173f7e0e096e418e0599ab030dfda10d8da0a5dc17a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d6353ff5837029f85cdae65e1200483030eeb8cb05c63bd255a459d79a91ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ed6e23b5df7eda4e1271f8c8ff0b9202270a73a4aa074b3625fcbc0114470c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T05:44:16Z\\\",\\\"message\\\":\\\"try/node-ca-pqsfd\\\\nI1206 05:44:16.869839 6419 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1206 05:44:16.869841 6419 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-pqsfd\\\\nI1206 05:44:16.869707 6419 services_controller.go:434] Service 
openshift-machine-config-operator/machine-config-controller retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{machine-config-controller openshift-machine-config-operator aa30290d-3a39-43ba-a212-6439bd680987 4486 0 2025-02-23 05:12:25 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:machine-config-controller] map[include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:mcc-proxy-tls service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0075d864b \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:metrics,Protocol:TCP,Port:9001,TargetPort:{0 9001 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{k8s-app: 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:44:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9980ec9b2b1a751a691d1f657a2176d49a7583906d741adbe3754ec4c73b152c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2gb79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:44Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.914769 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c145932d-56db-49da-ab40-1f9faeaa004e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a89503b511d9f2da9fb5e41e1adb5f5c60e14909aebd4495baafc709177fa56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd2bcad3ce23a8998a578ecc373a4e8028eefab1e056cf1081eb2406ff9398f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382d71a067b68d67891c063f0a4c833b7433e15db0e05b36e46f24bbbb1626ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b838411bb65919138a421cd17775561b7764a006894daa8f2bed711287c1914\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:44Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.925264 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:44Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.937971 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5mf9m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d7ccbf-e88d-4045-8d89-633470de7aca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dffbae27a10ae2e00933637da0e30fc5b8574f2ee8edb5b4b09c37a2d05e980a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2609f7ad60b4f90d844d4f4d8573587826cbdf4c0b76f6b8a1b5cddec86ad7d0\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2609f7ad60b4f90d844d4f4d8573587826cbdf4c0b76f6b8a1b5cddec86ad7d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ef7c618da4d94a4956f082f96b9be994042458ff524e9e1172f526a4135e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93ef7c618da4d94a4956f082f96b9be994042458ff524e9e1172f526a4135e1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c91a8199b1f8ede480f2bd92335fe3c8dc0d0e11caa2cf3bd213c234d0779f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c91a8199b1f8ede480f2bd92335fe3c8dc0d0e11caa2cf3bd213c234d0779f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://047dc
4e7f8f30d1f9cf824ee4059c99c07cd9f29bd985e0e00ac22febb297f1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://047dc4e7f8f30d1f9cf824ee4059c99c07cd9f29bd985e0e00ac22febb297f1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cadff80e27f4e0103110c153c52936b931bfd70ca4363a3caa44ec4f746d01dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadff80e27f4e0103110c153c52936b931bfd70ca4363a3caa44ec4f746d01dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:59Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e4034c91b0b19898468eccdc22e059ad7e830ef9e4ff0bea88d447f6a09c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16e4034c91b0b19898468eccdc22e059ad7e830ef9e4ff0bea88d447f6a09c64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5mf9m\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:44Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.947924 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q2ktk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e24a9e84-0151-4204-9391-510da9049b58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aae69842996fcf4d62a14e1cc73b68f2326287d0fa75d4587acb47862b1d40bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh6vd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a1c3268a5ca5c4c35865c8ff8f700686db8f5c2889152aabe27a36b1ccd9082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh6vd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:44:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q2ktk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-06T05:44:44Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.958274 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fedfcd9f-fa28-4efb-9677-e24a6dae9c04\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7a98fd30a5052ebe2872dd5e1c7f44e9ed9019ad8662a687a9a9a39acce3627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e214c308f
89a818305483c9dc2980b09c41c963bd5df5c91d56a1f8e47dd8ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1a658a854294c1c7b43ab8c1bd56969065a6c630a68b2c39366fd243ebd7af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7acc4267cfa0a489d59bdc4c37f12356e6a053e6cd477a87a38816bf71539ce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7acc4267cfa0a489d59bdc4c37f12356e6a053e6cd477a87a38816bf71539ce1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:44Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.961070 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.961109 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.961118 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.961133 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.961145 4733 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:44Z","lastTransitionTime":"2025-12-06T05:44:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.971121 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8e8d7d591deb47598776511be462724fabc5543e82b6a74edfc29fb01ccb977\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:44Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.982399 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77d63bf154094eece4d04d42186bc7f957f0b1ab0315c496bb8a785269184ed0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdccf2a58baf2a39276908ed60c86219657d8780a50630c20be6f8bc4c256fbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:44Z is after 
2025-08-24T17:21:41Z" Dec 06 05:44:44 crc kubenswrapper[4733]: I1206 05:44:44.992804 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faa17b3f3dd91488b73e0e7f3101c5e9932dd0c1573946bbd91819f1ec51202e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:44Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.001207 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqsfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25abcf60-fe34-446b-9df8-1ed8e5102975\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://163c90ba7e6470fb31049cd650d1384d35d87b94a9193184bfe3ea16feddf307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb5ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqsfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:44Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.009841 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnxdh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d5c4ca7-33ee-4858-948f-631753eb056e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f4a50e7cb4197e088c193a3bedc8acb2720a885e588e56051fbfa1e102099e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrbr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnxdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:45Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.018627 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8fw28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e8909c1-5ab7-4c3f-aba1-436c64849e8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47p7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47p7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:44:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8fw28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:45Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:45 crc 
kubenswrapper[4733]: I1206 05:44:45.063697 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.063741 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.063753 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.063770 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.063781 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:45Z","lastTransitionTime":"2025-12-06T05:44:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.166457 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.166507 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.166519 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.166542 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.166557 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:45Z","lastTransitionTime":"2025-12-06T05:44:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.268244 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.268287 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.268315 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.268336 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.268376 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:45Z","lastTransitionTime":"2025-12-06T05:44:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.370735 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.370776 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.370787 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.370806 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.370816 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:45Z","lastTransitionTime":"2025-12-06T05:44:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.472540 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.472582 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.472595 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.472613 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.472627 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:45Z","lastTransitionTime":"2025-12-06T05:44:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.574432 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.574463 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.574472 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.574485 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.574493 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:45Z","lastTransitionTime":"2025-12-06T05:44:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.677370 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.677416 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.677429 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.677452 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.677466 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:45Z","lastTransitionTime":"2025-12-06T05:44:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.780668 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.780701 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.780711 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.780728 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.780740 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:45Z","lastTransitionTime":"2025-12-06T05:44:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.816056 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2gb79_171aa174-9338-4421-8393-9e23fbab7f1e/ovnkube-controller/3.log" Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.816635 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2gb79_171aa174-9338-4421-8393-9e23fbab7f1e/ovnkube-controller/2.log" Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.819524 4733 generic.go:334] "Generic (PLEG): container finished" podID="171aa174-9338-4421-8393-9e23fbab7f1e" containerID="5d6353ff5837029f85cdae65e1200483030eeb8cb05c63bd255a459d79a91ef0" exitCode=1 Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.819575 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" event={"ID":"171aa174-9338-4421-8393-9e23fbab7f1e","Type":"ContainerDied","Data":"5d6353ff5837029f85cdae65e1200483030eeb8cb05c63bd255a459d79a91ef0"} Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.819632 4733 scope.go:117] "RemoveContainer" containerID="19ed6e23b5df7eda4e1271f8c8ff0b9202270a73a4aa074b3625fcbc0114470c" Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.820094 4733 scope.go:117] "RemoveContainer" containerID="5d6353ff5837029f85cdae65e1200483030eeb8cb05c63bd255a459d79a91ef0" Dec 06 05:44:45 crc kubenswrapper[4733]: E1206 05:44:45.820268 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-2gb79_openshift-ovn-kubernetes(171aa174-9338-4421-8393-9e23fbab7f1e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" podUID="171aa174-9338-4421-8393-9e23fbab7f1e" Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.836730 4733 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"171aa174-9338-4421-8393-9e23fbab7f1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a697c5a28f2c415b6f133c1c3bdaff0915418e3fcf0c889af0a822e1bdcbcc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://532faf6ec4021a35746a236a1ded78eccc9d71728c149f73c4263068b6951490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://456b5bd863b30c044246c6c8fe15ee7344ad053861724b5c42b88479578b9adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77216800c2b9bc04724591a5d5c5d4c9ddb9a75fcbc198c60800199a92db6f45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d985f342be7dff38ee8a2264a8dae534857e6cb0e7d0cf79b137d2ed6289bf80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88a99335c4d7fca93428173f7e0e096e418e0599ab030dfda10d8da0a5dc17a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d6353ff5837029f85cdae65e1200483030eeb8cb05c63bd255a459d79a91ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ed6e23b5df7eda4e1271f8c8ff0b9202270a73a4aa074b3625fcbc0114470c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T05:44:16Z\\\",\\\"message\\\":\\\"try/node-ca-pqsfd\\\\nI1206 05:44:16.869839 6419 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1206 05:44:16.869841 6419 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-pqsfd\\\\nI1206 05:44:16.869707 6419 services_controller.go:434] Service 
openshift-machine-config-operator/machine-config-controller retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{machine-config-controller openshift-machine-config-operator aa30290d-3a39-43ba-a212-6439bd680987 4486 0 2025-02-23 05:12:25 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:machine-config-controller] map[include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:mcc-proxy-tls service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0075d864b \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:metrics,Protocol:TCP,Port:9001,TargetPort:{0 9001 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{k8s-app: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:44:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d6353ff5837029f85cdae65e1200483030eeb8cb05c63bd255a459d79a91ef0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T05:44:45Z\\\",\\\"message\\\":\\\"-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.109:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {1f62a432-33b9-495d-83b2-d1dbe6961325}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1206 
05:44:45.231733 6787 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-apiserver-operator/metrics]} name:Service_openshift-kube-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.109:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {1f62a432-33b9-495d-83b2-d1dbe6961325}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1206 05:44:45.231791 6787 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin networ\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\
":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9980ec9b2b1a751a691d1f657a2176d49a7583906d741adbe3754ec4c73b152c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"
/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2gb79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:45Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:45 crc 
kubenswrapper[4733]: I1206 05:44:45.848731 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:45Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.859101 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-684r5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc59542d-ee4a-414d-b096-86716cb56db5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://238d1b3c645ca54e851f02ddb12c90bfcd039e6973993a7693cc9520d5268496\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7128ab1b2f48b8ce3ecf3a2154cb1b1dc93a58cdfed2c11e7724201a5675ea3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T05:44:41Z\\\",\\\"message\\\":\\\"2025-12-06T05:43:56+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_16cfcfaf-0a8d-404d-bd9a-d650725684e5\\\\n2025-12-06T05:43:56+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_16cfcfaf-0a8d-404d-bd9a-d650725684e5 to /host/opt/cni/bin/\\\\n2025-12-06T05:43:56Z [verbose] multus-daemon started\\\\n2025-12-06T05:43:56Z [verbose] 
Readiness Indicator file check\\\\n2025-12-06T05:44:41Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbfjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-684r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:45Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.867405 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77ebef5bd728c37a6b74ab523c480048959280fdfc9afd8c60b2aca9cd05336d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61a23652af66be599ba9357cb31709e7b4a3f0e4
767c758617e6cc5cd9b43941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g7qjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:45Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.874976 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q2ktk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e24a9e84-0151-4204-9391-510da9049b58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aae69842996fcf4d62a14e1cc73b68f2326287d0fa75d4587acb47862b1d40bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh6vd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a1c3268a5ca5c4c35865c8ff8f700686db8f
5c2889152aabe27a36b1ccd9082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh6vd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:44:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q2ktk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:45Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.883408 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.883450 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.883465 4733 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.883483 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.883498 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:45Z","lastTransitionTime":"2025-12-06T05:44:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.885372 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c145932d-56db-49da-ab40-1f9faeaa004e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a89503b511d9f2da9fb5e41e1adb5f5c60e14909aebd4495baafc709177fa56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd2bcad3ce23a8998a578ecc373a4e8028eefab1e056cf1081eb2406ff9398f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382d71a067b68d67891c063f0a4c833b7433e15db0e05b36e46f24bbbb1626ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b838411bb65919138a421cd17775561b7764a006894daa8f2bed711287c1914\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:45Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.896957 4733 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:45Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.909195 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5mf9m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d7ccbf-e88d-4045-8d89-633470de7aca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dffbae27a10ae2e00933637da0e30fc5b8574f2ee8edb5b4b09c37a2d05e980a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2609f7ad60b4f90d844d4f4d8573587826cbdf4c0b76f6b8a1b5cddec86ad7d0\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2609f7ad60b4f90d844d4f4d8573587826cbdf4c0b76f6b8a1b5cddec86ad7d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ef7c618da4d94a4956f082f96b9be994042458ff524e9e1172f526a4135e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93ef7c618da4d94a4956f082f96b9be994042458ff524e9e1172f526a4135e1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c91a8199b1f8ede480f2bd92335fe3c8dc0d0e11caa2cf3bd213c234d0779f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c91a8199b1f8ede480f2bd92335fe3c8dc0d0e11caa2cf3bd213c234d0779f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://047dc
4e7f8f30d1f9cf824ee4059c99c07cd9f29bd985e0e00ac22febb297f1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://047dc4e7f8f30d1f9cf824ee4059c99c07cd9f29bd985e0e00ac22febb297f1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cadff80e27f4e0103110c153c52936b931bfd70ca4363a3caa44ec4f746d01dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadff80e27f4e0103110c153c52936b931bfd70ca4363a3caa44ec4f746d01dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:59Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e4034c91b0b19898468eccdc22e059ad7e830ef9e4ff0bea88d447f6a09c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16e4034c91b0b19898468eccdc22e059ad7e830ef9e4ff0bea88d447f6a09c64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5mf9m\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:45Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.919691 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8e8d7d591deb47598776511be462724fabc5543e82b6a74edfc29fb01ccb977\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:45Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.930186 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77d63bf154094eece4d04d42186bc7f957f0b1ab0315c496bb8a785269184ed0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"s
tarted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdccf2a58baf2a39276908ed60c86219657d8780a50630c20be6f8bc4c256fbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:45Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.941283 4733 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faa17b3f3dd91488b73e0e7f3101c5e9932dd0c1573946bbd91819f1ec51202e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:45Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.949887 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqsfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25abcf60-fe34-446b-9df8-1ed8e5102975\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://163c90ba7e6470fb31049cd650d1384d35d87b94a9193184bfe3ea16feddf307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tm
p/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb5ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqsfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:45Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.958225 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnxdh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d5c4ca7-33ee-4858-948f-631753eb056e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f4a50e7cb4197e088c193a3bedc8acb2720a885e588e56051fbfa1e102099e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrbr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnxdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:45Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.966789 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fedfcd9f-fa28-4efb-9677-e24a6dae9c04\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7a98fd30a5052ebe2872dd5e1c7f44e9ed9019ad8662a687a9a9a39acce3627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e214c308f89a818305483c9dc2980b09c41c963bd5df5c91d56a1f8e47dd8ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1a658a854294c1c7b43ab8c1bd56969065a6c630a68b2c39366fd243ebd7af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7acc4267cfa0a489d59bdc4c37f12356e6a053e6cd477a87a38816bf71539ce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7acc4267cfa0a489d59bdc4c37f12356e6a053e6cd477a87a38816bf71539ce1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:45Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.975027 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8fw28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e8909c1-5ab7-4c3f-aba1-436c64849e8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47p7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47p7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:44:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8fw28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:45Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:45 crc 
kubenswrapper[4733]: I1206 05:44:45.986885 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.987009 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.987070 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.987134 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.987191 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:45Z","lastTransitionTime":"2025-12-06T05:44:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.987293 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0700e329-54b6-4cfe-b2de-5cee58cf1aa5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32c4d87738481c8df3d76e820a98f3dacfbc11edc26fab1dfe51b56d207168d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57cbb938bc4ae9b8a71a1e2369a50a243964fc8c683d2d1840f1f3e199f1b923\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eeebbb46cf11d2306ad457106c3b2179039986bfdd412c4bb64791d86edb4e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://801ea1b9ed221d20f0d729436b8f5f1946df6e66f06aa86db5764f18da3f0b1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe65f4b55b8e8ed93d424276f1fc06f31770302538e5122a5b09da36734d86dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 05:43:48.722254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 05:43:48.730728 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3849141372/tls.crt::/tmp/serving-cert-3849141372/tls.key\\\\\\\"\\\\nI1206 05:43:54.083506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 05:43:54.085960 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 05:43:54.085979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 05:43:54.086001 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 05:43:54.086006 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 05:43:54.089093 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 05:43:54.089162 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089190 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089211 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 05:43:54.089229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 05:43:54.089245 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 05:43:54.089261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 05:43:54.089103 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 05:43:54.090706 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8edc1fd8220a58b6a3f6d08d6d003c6d350fa69588866d84de63f95ecd4367f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:45Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:45 crc kubenswrapper[4733]: I1206 05:44:45.996411 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:45Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.089608 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.089776 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.089839 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.089904 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.090159 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:46Z","lastTransitionTime":"2025-12-06T05:44:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.196914 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.196953 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.196963 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.196983 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.196997 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:46Z","lastTransitionTime":"2025-12-06T05:44:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.299554 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.299591 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.299601 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.299618 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.299632 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:46Z","lastTransitionTime":"2025-12-06T05:44:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.402220 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.402260 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.402273 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.402291 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.402323 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:46Z","lastTransitionTime":"2025-12-06T05:44:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.483827 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8fw28" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.483848 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.483881 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:44:46 crc kubenswrapper[4733]: E1206 05:44:46.484009 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8fw28" podUID="7e8909c1-5ab7-4c3f-aba1-436c64849e8a" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.484049 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 05:44:46 crc kubenswrapper[4733]: E1206 05:44:46.484353 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 05:44:46 crc kubenswrapper[4733]: E1206 05:44:46.484286 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 05:44:46 crc kubenswrapper[4733]: E1206 05:44:46.484615 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.499296 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5mf9m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d7ccbf-e88d-4045-8d89-633470de7aca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dffbae27a10ae2e00933637da0e30fc5b8574f2ee8edb5b4b09c37a2d05e980a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a9
5b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2609f7ad60b4f90d844d4f4d8573587826cbdf4c0b76f6b8a1b5cddec86ad7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2609f7ad60b4f90d844d4f4d8573587826cbdf4c0b76f6b8a1b5cddec86ad7d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ef7c618da4d94a4956f082f96b9be994042458ff524e9e1172f526a4135e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93ef7c618da4d94a4956f082f96b9be994042458ff524e9e1172f526a4135e1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c91a8199b1f8ede480f2bd92335fe3c8dc0d0e11caa2cf3bd213c234d0779f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"ter
minated\\\":{\\\"containerID\\\":\\\"cri-o://7c91a8199b1f8ede480f2bd92335fe3c8dc0d0e11caa2cf3bd213c234d0779f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://047dc4e7f8f30d1f9cf824ee4059c99c07cd9f29bd985e0e00ac22febb297f1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://047dc4e7f8f30d1f9cf824ee4059c99c07cd9f29bd985e0e00ac22febb297f1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cadff80e27f4e0103110c153c52936b931bfd70ca4363a3caa44ec4f746d01dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadff80e27f4e0103110c153c52936b931bfd70ca4363a3caa44ec4f746d01dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e4034c91b0b19898468eccdc22e059ad7e830ef9e4ff0bea88d447f6a09c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16e4034c91b0b19898468eccdc22e059ad7e830ef9e4ff0bea88d447f
6a09c64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5mf9m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:46Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.504634 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.504670 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.504681 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.504698 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.504717 4733 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:46Z","lastTransitionTime":"2025-12-06T05:44:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.510198 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q2ktk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e24a9e84-0151-4204-9391-510da9049b58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aae69842996fcf4d62a14e1cc73b68f2326287d0fa75d4587acb47862b1d40bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh6vd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a1c3268a5ca5c4c35865c8ff8f700686db8f5c2889152aabe27a36b1ccd9082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh6vd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:44:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q2ktk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:46Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.520245 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c145932d-56db-49da-ab40-1f9faeaa004e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a89503b511d9f2da9fb5e41e1adb5f5c60e14909aebd4495baafc709177fa56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\
\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd2bcad3ce23a8998a578ecc373a4e8028eefab1e056cf1081eb2406ff9398f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382d71a067b68d67891c063f0a4c833b7433e15db0e05b36e46f24bbbb1626ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b838411bb65919138a421cd17775561b7764a006894daa8f2bed711287c1914\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578
bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:46Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.530138 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:46Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.539791 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fedfcd9f-fa28-4efb-9677-e24a6dae9c04\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7a98fd30a5052ebe2872dd5e1c7f44e9ed9019ad8662a687a9a9a39acce3627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e214c308f89a818305483c9dc2980b09c41c963bd5df5c91d56a1f8e47dd8ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1a658a854294c1c7b43ab8c1bd56969065a6c630a68b2c39366fd243ebd7af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7acc4267cfa0a489d59bdc4c37f12356e6a053e6cd477a87a38816bf71539ce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://7acc4267cfa0a489d59bdc4c37f12356e6a053e6cd477a87a38816bf71539ce1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:46Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.551594 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8e8d7d591deb47598776511be462724fabc5543e82b6a74edfc29fb01ccb977\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:46Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.566540 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77d63bf154094eece4d04d42186bc7f957f0b1ab0315c496bb8a785269184ed0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://cdccf2a58baf2a39276908ed60c86219657d8780a50630c20be6f8bc4c256fbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:46Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.576709 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faa17b3f3dd91488b73e0e7f3101c5e9932dd0c1573946bbd91819f1ec51202e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T05:44:46Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.586032 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqsfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25abcf60-fe34-446b-9df8-1ed8e5102975\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://163c90ba7e6470fb31049cd650d1384d35d87b94a9193184bfe3ea16feddf307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb5ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqsfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:46Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.594592 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnxdh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d5c4ca7-33ee-4858-948f-631753eb056e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f4a50e7cb4197e088c193a3bedc8acb2720a885e588e56051fbfa1e102099e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrbr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnxdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:46Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.602751 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8fw28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e8909c1-5ab7-4c3f-aba1-436c64849e8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47p7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47p7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:44:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8fw28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:46Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:46 crc 
kubenswrapper[4733]: I1206 05:44:46.605934 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.605960 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.605970 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.605987 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.605999 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:46Z","lastTransitionTime":"2025-12-06T05:44:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.613096 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0700e329-54b6-4cfe-b2de-5cee58cf1aa5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32c4d87738481c8df3d76e820a98f3dacfbc11edc26fab1dfe51b56d207168d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57cbb938bc4ae9b8a71a1e2369a50a243964fc8c683d2d1840f1f3e199f1b923\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eeebbb46cf11d2306ad457106c3b2179039986bfdd412c4bb64791d86edb4e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://801ea1b9ed221d20f0d729436b8f5f1946df6e66f06aa86db5764f18da3f0b1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe65f4b55b8e8ed93d424276f1fc06f31770302538e5122a5b09da36734d86dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 05:43:48.722254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 05:43:48.730728 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3849141372/tls.crt::/tmp/serving-cert-3849141372/tls.key\\\\\\\"\\\\nI1206 05:43:54.083506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 05:43:54.085960 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 05:43:54.085979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 05:43:54.086001 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 05:43:54.086006 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 05:43:54.089093 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 05:43:54.089162 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089190 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089211 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 05:43:54.089229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 05:43:54.089245 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 05:43:54.089261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 05:43:54.089103 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 05:43:54.090706 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8edc1fd8220a58b6a3f6d08d6d003c6d350fa69588866d84de63f95ecd4367f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:46Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.622714 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:46Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.630968 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77ebef5bd728c37a6b74ab523c480048959280fdfc9afd8c60b2aca9cd05336d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61a23652af66be599ba9357cb31709e7b4a3f0e4
767c758617e6cc5cd9b43941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g7qjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:46Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.645661 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"171aa174-9338-4421-8393-9e23fbab7f1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a697c5a28f2c415b6f133c1c3bdaff0915418e3fcf0c889af0a822e1bdcbcc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://532faf6ec4021a35746a236a1ded78eccc9d71728c149f73c4263068b6951490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://456b5bd863b30c044246c6c8fe15ee7344ad053861724b5c42b88479578b9adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77216800c2b9bc04724591a5d5c5d4c9ddb9a75fcbc198c60800199a92db6f45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d985f342be7dff38ee8a2264a8dae534857e6cb0e7d0cf79b137d2ed6289bf80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88a99335c4d7fca93428173f7e0e096e418e0599ab030dfda10d8da0a5dc17a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d6353ff5837029f85cdae65e1200483030eeb8cb05c63bd255a459d79a91ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ed6e23b5df7eda4e1271f8c8ff0b9202270a73a4aa074b3625fcbc0114470c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T05:44:16Z\\\",\\\"message\\\":\\\"try/node-ca-pqsfd\\\\nI1206 05:44:16.869839 6419 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1206 05:44:16.869841 6419 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-pqsfd\\\\nI1206 05:44:16.869707 6419 services_controller.go:434] Service 
openshift-machine-config-operator/machine-config-controller retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{machine-config-controller openshift-machine-config-operator aa30290d-3a39-43ba-a212-6439bd680987 4486 0 2025-02-23 05:12:25 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:machine-config-controller] map[include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:mcc-proxy-tls service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0075d864b \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:metrics,Protocol:TCP,Port:9001,TargetPort:{0 9001 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{k8s-app: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:44:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d6353ff5837029f85cdae65e1200483030eeb8cb05c63bd255a459d79a91ef0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T05:44:45Z\\\",\\\"message\\\":\\\"-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.109:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {1f62a432-33b9-495d-83b2-d1dbe6961325}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1206 
05:44:45.231733 6787 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-apiserver-operator/metrics]} name:Service_openshift-kube-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.109:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {1f62a432-33b9-495d-83b2-d1dbe6961325}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1206 05:44:45.231791 6787 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin networ\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\
":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9980ec9b2b1a751a691d1f657a2176d49a7583906d741adbe3754ec4c73b152c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"
/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2gb79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:46Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:46 crc 
kubenswrapper[4733]: I1206 05:44:46.655443 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:46Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.664888 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-684r5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc59542d-ee4a-414d-b096-86716cb56db5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://238d1b3c645ca54e851f02ddb12c90bfcd039e6973993a7693cc9520d5268496\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7128ab1b2f48b8ce3ecf3a2154cb1b1dc93a58cdfed2c11e7724201a5675ea3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T05:44:41Z\\\",\\\"message\\\":\\\"2025-12-06T05:43:56+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_16cfcfaf-0a8d-404d-bd9a-d650725684e5\\\\n2025-12-06T05:43:56+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_16cfcfaf-0a8d-404d-bd9a-d650725684e5 to /host/opt/cni/bin/\\\\n2025-12-06T05:43:56Z [verbose] multus-daemon started\\\\n2025-12-06T05:43:56Z [verbose] 
Readiness Indicator file check\\\\n2025-12-06T05:44:41Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbfjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-684r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:46Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.708268 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.708320 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.708347 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.708366 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.708377 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:46Z","lastTransitionTime":"2025-12-06T05:44:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.810763 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.810795 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.810806 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.810821 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.810832 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:46Z","lastTransitionTime":"2025-12-06T05:44:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.823903 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2gb79_171aa174-9338-4421-8393-9e23fbab7f1e/ovnkube-controller/3.log" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.827381 4733 scope.go:117] "RemoveContainer" containerID="5d6353ff5837029f85cdae65e1200483030eeb8cb05c63bd255a459d79a91ef0" Dec 06 05:44:46 crc kubenswrapper[4733]: E1206 05:44:46.827570 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-2gb79_openshift-ovn-kubernetes(171aa174-9338-4421-8393-9e23fbab7f1e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" podUID="171aa174-9338-4421-8393-9e23fbab7f1e" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.840546 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0700e329-54b6-4cfe-b2de-5cee58cf1aa5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32c4d87738481c8df3d76e820a98f3dacfbc11edc26fab1dfe51b56d207168d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57cbb938bc4ae9b8a71a1e2369a50a243964fc8c683d2d1840f1f3e199f1b923\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eeebbb46cf11d2306ad457106c3b2179039986bfdd412c4bb64791d86edb4e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://801ea1b9ed221d20f0d729436b8f5f1946df6e66f06aa86db5764f18da3f0b1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe65f4b55b8e8ed93d424276f1fc06f31770302538e5122a5b09da36734d86dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T05:43:54Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 05:43:48.722254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 05:43:48.730728 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3849141372/tls.crt::/tmp/serving-cert-3849141372/tls.key\\\\\\\"\\\\nI1206 05:43:54.083506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 05:43:54.085960 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 05:43:54.085979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 05:43:54.086001 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 05:43:54.086006 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 05:43:54.089093 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 05:43:54.089162 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089190 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089211 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 05:43:54.089229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 05:43:54.089245 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 05:43:54.089261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 05:43:54.089103 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1206 05:43:54.090706 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8edc1fd8220a58b6a3f6d08d6d003c6d350fa69588866d84de63f95ecd4367f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f6
2dfbc60934130c77b42a3a442adc33f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:46Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.853070 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:46Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.863215 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:46Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.872365 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-684r5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc59542d-ee4a-414d-b096-86716cb56db5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://238d1b3c645ca54e851f02ddb12c90bfcd039e6973993a7693cc9520d5268496\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7128ab1b2f48b8ce3ecf3a2154cb1b1dc93a58cdfed2c11e7724201a5675ea3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T05:44:41Z\\\",\\\"message\\\":\\\"2025-12-06T05:43:56+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_16cfcfaf-0a8d-404d-bd9a-d650725684e5\\\\n2025-12-06T05:43:56+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_16cfcfaf-0a8d-404d-bd9a-d650725684e5 to /host/opt/cni/bin/\\\\n2025-12-06T05:43:56Z [verbose] multus-daemon started\\\\n2025-12-06T05:43:56Z [verbose] 
Readiness Indicator file check\\\\n2025-12-06T05:44:41Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbfjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-684r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:46Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.881115 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77ebef5bd728c37a6b74ab523c480048959280fdfc9afd8c60b2aca9cd05336d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61a23652af66be599ba9357cb31709e7b4a3f0e4
767c758617e6cc5cd9b43941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g7qjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:46Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.894859 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"171aa174-9338-4421-8393-9e23fbab7f1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a697c5a28f2c415b6f133c1c3bdaff0915418e3fcf0c889af0a822e1bdcbcc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://532faf6ec4021a35746a236a1ded78eccc9d71728c149f73c4263068b6951490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://456b5bd863b30c044246c6c8fe15ee7344ad053861724b5c42b88479578b9adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77216800c2b9bc04724591a5d5c5d4c9ddb9a75fcbc198c60800199a92db6f45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d985f342be7dff38ee8a2264a8dae534857e6cb0e7d0cf79b137d2ed6289bf80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88a99335c4d7fca93428173f7e0e096e418e0599ab030dfda10d8da0a5dc17a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d6353ff5837029f85cdae65e1200483030eeb8cb05c63bd255a459d79a91ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d6353ff5837029f85cdae65e1200483030eeb8cb05c63bd255a459d79a91ef0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T05:44:45Z\\\",\\\"message\\\":\\\"-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.109:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{1f62a432-33b9-495d-83b2-d1dbe6961325}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1206 05:44:45.231733 6787 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-apiserver-operator/metrics]} name:Service_openshift-kube-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.109:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {1f62a432-33b9-495d-83b2-d1dbe6961325}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1206 05:44:45.231791 6787 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin networ\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:44:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2gb79_openshift-ovn-kubernetes(171aa174-9338-4421-8393-9e23fbab7f1e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9980ec9b2b1a751a691d1f657a2176d49a7583906d741adbe3754ec4c73b152c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://667a77abe2226e4438
47d05fc2475438095a30648777a424239d3f35c199b236\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2gb79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:46Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.904257 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c145932d-56db-49da-ab40-1f9faeaa004e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a89503b511d9f2da9fb5e41e1adb5f5c60e14909aebd4495baafc709177fa56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd2bcad3ce23a8998a578ecc373a4e8028eefab1e056cf1081eb2406ff9398f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382d71a067b68d67891c063f0a4c833b7433e15db0e05b36e46f24bbbb1626ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b838411bb65919138a421cd17775561b7764a006894daa8f2bed711287c1914\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:46Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.912798 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.912931 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.912995 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.913062 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.913119 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:46Z","lastTransitionTime":"2025-12-06T05:44:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.913905 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:46Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.924426 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5mf9m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d7ccbf-e88d-4045-8d89-633470de7aca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dffbae27a10ae2e00933637da0e30fc5b8574f2ee8edb5b4b09c37a2d05e980a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2609f7ad60b4f90d844d4f4d8573587826cbdf4c0b76f6b8a1b5cddec86ad7d0\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2609f7ad60b4f90d844d4f4d8573587826cbdf4c0b76f6b8a1b5cddec86ad7d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ef7c618da4d94a4956f082f96b9be994042458ff524e9e1172f526a4135e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93ef7c618da4d94a4956f082f96b9be994042458ff524e9e1172f526a4135e1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c91a8199b1f8ede480f2bd92335fe3c8dc0d0e11caa2cf3bd213c234d0779f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c91a8199b1f8ede480f2bd92335fe3c8dc0d0e11caa2cf3bd213c234d0779f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://047dc
4e7f8f30d1f9cf824ee4059c99c07cd9f29bd985e0e00ac22febb297f1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://047dc4e7f8f30d1f9cf824ee4059c99c07cd9f29bd985e0e00ac22febb297f1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cadff80e27f4e0103110c153c52936b931bfd70ca4363a3caa44ec4f746d01dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadff80e27f4e0103110c153c52936b931bfd70ca4363a3caa44ec4f746d01dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:59Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e4034c91b0b19898468eccdc22e059ad7e830ef9e4ff0bea88d447f6a09c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16e4034c91b0b19898468eccdc22e059ad7e830ef9e4ff0bea88d447f6a09c64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5mf9m\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:46Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.933035 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q2ktk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e24a9e84-0151-4204-9391-510da9049b58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aae69842996fcf4d62a14e1cc73b68f2326287d0fa75d4587acb47862b1d40bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh6vd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a1c3268a5ca5c4c35865c8ff8f700686db8f5c2889152aabe27a36b1ccd9082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh6vd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:44:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q2ktk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-06T05:44:46Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.943977 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fedfcd9f-fa28-4efb-9677-e24a6dae9c04\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7a98fd30a5052ebe2872dd5e1c7f44e9ed9019ad8662a687a9a9a39acce3627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e214c308f
89a818305483c9dc2980b09c41c963bd5df5c91d56a1f8e47dd8ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1a658a854294c1c7b43ab8c1bd56969065a6c630a68b2c39366fd243ebd7af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7acc4267cfa0a489d59bdc4c37f12356e6a053e6cd477a87a38816bf71539ce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7acc4267cfa0a489d59bdc4c37f12356e6a053e6cd477a87a38816bf71539ce1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:46Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.955151 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8e8d7d591deb47598776511be462724fabc5543e82b6a74edfc29fb01ccb977\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:46Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.964872 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77d63bf154094eece4d04d42186bc7f957f0b1ab0315c496bb8a785269184ed0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://cdccf2a58baf2a39276908ed60c86219657d8780a50630c20be6f8bc4c256fbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:46Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.973548 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faa17b3f3dd91488b73e0e7f3101c5e9932dd0c1573946bbd91819f1ec51202e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T05:44:46Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.981073 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqsfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25abcf60-fe34-446b-9df8-1ed8e5102975\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://163c90ba7e6470fb31049cd650d1384d35d87b94a9193184bfe3ea16feddf307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb5ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqsfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:46Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.988646 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnxdh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d5c4ca7-33ee-4858-948f-631753eb056e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f4a50e7cb4197e088c193a3bedc8acb2720a885e588e56051fbfa1e102099e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrbr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnxdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:46Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:46 crc kubenswrapper[4733]: I1206 05:44:46.997598 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8fw28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e8909c1-5ab7-4c3f-aba1-436c64849e8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47p7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47p7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:44:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8fw28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:46Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:47 crc 
kubenswrapper[4733]: I1206 05:44:47.015954 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:47 crc kubenswrapper[4733]: I1206 05:44:47.016055 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:47 crc kubenswrapper[4733]: I1206 05:44:47.016149 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:47 crc kubenswrapper[4733]: I1206 05:44:47.016217 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:47 crc kubenswrapper[4733]: I1206 05:44:47.016321 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:47Z","lastTransitionTime":"2025-12-06T05:44:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:47 crc kubenswrapper[4733]: I1206 05:44:47.119351 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:47 crc kubenswrapper[4733]: I1206 05:44:47.119391 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:47 crc kubenswrapper[4733]: I1206 05:44:47.119402 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:47 crc kubenswrapper[4733]: I1206 05:44:47.119417 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:47 crc kubenswrapper[4733]: I1206 05:44:47.119428 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:47Z","lastTransitionTime":"2025-12-06T05:44:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:47 crc kubenswrapper[4733]: I1206 05:44:47.221140 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:47 crc kubenswrapper[4733]: I1206 05:44:47.221172 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:47 crc kubenswrapper[4733]: I1206 05:44:47.221182 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:47 crc kubenswrapper[4733]: I1206 05:44:47.221196 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:47 crc kubenswrapper[4733]: I1206 05:44:47.221207 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:47Z","lastTransitionTime":"2025-12-06T05:44:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:47 crc kubenswrapper[4733]: I1206 05:44:47.322766 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:47 crc kubenswrapper[4733]: I1206 05:44:47.322802 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:47 crc kubenswrapper[4733]: I1206 05:44:47.322815 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:47 crc kubenswrapper[4733]: I1206 05:44:47.322832 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:47 crc kubenswrapper[4733]: I1206 05:44:47.322843 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:47Z","lastTransitionTime":"2025-12-06T05:44:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:47 crc kubenswrapper[4733]: I1206 05:44:47.424764 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:47 crc kubenswrapper[4733]: I1206 05:44:47.424802 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:47 crc kubenswrapper[4733]: I1206 05:44:47.424812 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:47 crc kubenswrapper[4733]: I1206 05:44:47.424827 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:47 crc kubenswrapper[4733]: I1206 05:44:47.424838 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:47Z","lastTransitionTime":"2025-12-06T05:44:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:47 crc kubenswrapper[4733]: I1206 05:44:47.527415 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:47 crc kubenswrapper[4733]: I1206 05:44:47.527447 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:47 crc kubenswrapper[4733]: I1206 05:44:47.527455 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:47 crc kubenswrapper[4733]: I1206 05:44:47.527468 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:47 crc kubenswrapper[4733]: I1206 05:44:47.527476 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:47Z","lastTransitionTime":"2025-12-06T05:44:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:47 crc kubenswrapper[4733]: I1206 05:44:47.629179 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:47 crc kubenswrapper[4733]: I1206 05:44:47.629229 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:47 crc kubenswrapper[4733]: I1206 05:44:47.629239 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:47 crc kubenswrapper[4733]: I1206 05:44:47.629254 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:47 crc kubenswrapper[4733]: I1206 05:44:47.629267 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:47Z","lastTransitionTime":"2025-12-06T05:44:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:47 crc kubenswrapper[4733]: I1206 05:44:47.730897 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:47 crc kubenswrapper[4733]: I1206 05:44:47.730941 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:47 crc kubenswrapper[4733]: I1206 05:44:47.730951 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:47 crc kubenswrapper[4733]: I1206 05:44:47.730964 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:47 crc kubenswrapper[4733]: I1206 05:44:47.730974 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:47Z","lastTransitionTime":"2025-12-06T05:44:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:47 crc kubenswrapper[4733]: I1206 05:44:47.832747 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:47 crc kubenswrapper[4733]: I1206 05:44:47.832783 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:47 crc kubenswrapper[4733]: I1206 05:44:47.832796 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:47 crc kubenswrapper[4733]: I1206 05:44:47.832810 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:47 crc kubenswrapper[4733]: I1206 05:44:47.832820 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:47Z","lastTransitionTime":"2025-12-06T05:44:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:47 crc kubenswrapper[4733]: I1206 05:44:47.935078 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:47 crc kubenswrapper[4733]: I1206 05:44:47.935116 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:47 crc kubenswrapper[4733]: I1206 05:44:47.935127 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:47 crc kubenswrapper[4733]: I1206 05:44:47.935139 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:47 crc kubenswrapper[4733]: I1206 05:44:47.935147 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:47Z","lastTransitionTime":"2025-12-06T05:44:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.037575 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.037617 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.037628 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.037644 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.037655 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:48Z","lastTransitionTime":"2025-12-06T05:44:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.140193 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.140257 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.140269 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.140289 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.140321 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:48Z","lastTransitionTime":"2025-12-06T05:44:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.242558 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.242586 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.242596 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.242614 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.242623 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:48Z","lastTransitionTime":"2025-12-06T05:44:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.344920 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.344954 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.344964 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.344979 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.344988 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:48Z","lastTransitionTime":"2025-12-06T05:44:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.447171 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.447195 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.447203 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.447217 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.447226 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:48Z","lastTransitionTime":"2025-12-06T05:44:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.484227 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.484248 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.484351 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8fw28" Dec 06 05:44:48 crc kubenswrapper[4733]: E1206 05:44:48.484442 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.484391 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:44:48 crc kubenswrapper[4733]: E1206 05:44:48.484576 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8fw28" podUID="7e8909c1-5ab7-4c3f-aba1-436c64849e8a" Dec 06 05:44:48 crc kubenswrapper[4733]: E1206 05:44:48.484679 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 05:44:48 crc kubenswrapper[4733]: E1206 05:44:48.484355 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.548989 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.549057 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.549072 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.549097 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.549112 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:48Z","lastTransitionTime":"2025-12-06T05:44:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.589023 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.589059 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.589070 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.589080 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.589091 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:48Z","lastTransitionTime":"2025-12-06T05:44:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:48 crc kubenswrapper[4733]: E1206 05:44:48.599720 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6951a1f4-5aff-463d-98ee-6da28494341b\\\",\\\"systemUUID\\\":\\\"4b0d62b0-e895-479e-b261-2bd12b349187\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:48Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.602775 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.602804 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.602815 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.602827 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.602836 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:48Z","lastTransitionTime":"2025-12-06T05:44:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:48 crc kubenswrapper[4733]: E1206 05:44:48.611783 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6951a1f4-5aff-463d-98ee-6da28494341b\\\",\\\"systemUUID\\\":\\\"4b0d62b0-e895-479e-b261-2bd12b349187\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:48Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.614652 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.614688 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.614698 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.614711 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.614719 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:48Z","lastTransitionTime":"2025-12-06T05:44:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:48 crc kubenswrapper[4733]: E1206 05:44:48.624174 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6951a1f4-5aff-463d-98ee-6da28494341b\\\",\\\"systemUUID\\\":\\\"4b0d62b0-e895-479e-b261-2bd12b349187\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:48Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.626829 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.626864 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.626877 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.626892 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.626900 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:48Z","lastTransitionTime":"2025-12-06T05:44:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:48 crc kubenswrapper[4733]: E1206 05:44:48.636785 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6951a1f4-5aff-463d-98ee-6da28494341b\\\",\\\"systemUUID\\\":\\\"4b0d62b0-e895-479e-b261-2bd12b349187\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:48Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.639356 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.639389 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.639403 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.639416 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.639427 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:48Z","lastTransitionTime":"2025-12-06T05:44:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:48 crc kubenswrapper[4733]: E1206 05:44:48.648991 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6951a1f4-5aff-463d-98ee-6da28494341b\\\",\\\"systemUUID\\\":\\\"4b0d62b0-e895-479e-b261-2bd12b349187\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:48Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:48 crc kubenswrapper[4733]: E1206 05:44:48.649103 4733 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.651010 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.651035 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.651045 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.651057 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.651065 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:48Z","lastTransitionTime":"2025-12-06T05:44:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.753230 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.753283 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.753296 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.753357 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.753369 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:48Z","lastTransitionTime":"2025-12-06T05:44:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.855532 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.855568 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.855579 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.855595 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.855606 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:48Z","lastTransitionTime":"2025-12-06T05:44:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.958123 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.958164 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.958175 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.958193 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:48 crc kubenswrapper[4733]: I1206 05:44:48.958206 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:48Z","lastTransitionTime":"2025-12-06T05:44:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:49 crc kubenswrapper[4733]: I1206 05:44:49.059735 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:49 crc kubenswrapper[4733]: I1206 05:44:49.059779 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:49 crc kubenswrapper[4733]: I1206 05:44:49.059792 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:49 crc kubenswrapper[4733]: I1206 05:44:49.059808 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:49 crc kubenswrapper[4733]: I1206 05:44:49.059820 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:49Z","lastTransitionTime":"2025-12-06T05:44:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:49 crc kubenswrapper[4733]: I1206 05:44:49.161811 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:49 crc kubenswrapper[4733]: I1206 05:44:49.161881 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:49 crc kubenswrapper[4733]: I1206 05:44:49.161896 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:49 crc kubenswrapper[4733]: I1206 05:44:49.161937 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:49 crc kubenswrapper[4733]: I1206 05:44:49.161952 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:49Z","lastTransitionTime":"2025-12-06T05:44:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:49 crc kubenswrapper[4733]: I1206 05:44:49.264457 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:49 crc kubenswrapper[4733]: I1206 05:44:49.264495 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:49 crc kubenswrapper[4733]: I1206 05:44:49.264505 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:49 crc kubenswrapper[4733]: I1206 05:44:49.264519 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:49 crc kubenswrapper[4733]: I1206 05:44:49.264529 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:49Z","lastTransitionTime":"2025-12-06T05:44:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:49 crc kubenswrapper[4733]: I1206 05:44:49.366641 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:49 crc kubenswrapper[4733]: I1206 05:44:49.366681 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:49 crc kubenswrapper[4733]: I1206 05:44:49.366691 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:49 crc kubenswrapper[4733]: I1206 05:44:49.366707 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:49 crc kubenswrapper[4733]: I1206 05:44:49.366718 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:49Z","lastTransitionTime":"2025-12-06T05:44:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:49 crc kubenswrapper[4733]: I1206 05:44:49.468830 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:49 crc kubenswrapper[4733]: I1206 05:44:49.468899 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:49 crc kubenswrapper[4733]: I1206 05:44:49.468910 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:49 crc kubenswrapper[4733]: I1206 05:44:49.468929 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:49 crc kubenswrapper[4733]: I1206 05:44:49.468942 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:49Z","lastTransitionTime":"2025-12-06T05:44:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:49 crc kubenswrapper[4733]: I1206 05:44:49.571080 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:49 crc kubenswrapper[4733]: I1206 05:44:49.571127 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:49 crc kubenswrapper[4733]: I1206 05:44:49.571139 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:49 crc kubenswrapper[4733]: I1206 05:44:49.571156 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:49 crc kubenswrapper[4733]: I1206 05:44:49.571165 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:49Z","lastTransitionTime":"2025-12-06T05:44:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:49 crc kubenswrapper[4733]: I1206 05:44:49.672763 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:49 crc kubenswrapper[4733]: I1206 05:44:49.672799 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:49 crc kubenswrapper[4733]: I1206 05:44:49.672808 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:49 crc kubenswrapper[4733]: I1206 05:44:49.672823 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:49 crc kubenswrapper[4733]: I1206 05:44:49.672837 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:49Z","lastTransitionTime":"2025-12-06T05:44:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:49 crc kubenswrapper[4733]: I1206 05:44:49.774961 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:49 crc kubenswrapper[4733]: I1206 05:44:49.774987 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:49 crc kubenswrapper[4733]: I1206 05:44:49.774996 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:49 crc kubenswrapper[4733]: I1206 05:44:49.775008 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:49 crc kubenswrapper[4733]: I1206 05:44:49.775020 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:49Z","lastTransitionTime":"2025-12-06T05:44:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:49 crc kubenswrapper[4733]: I1206 05:44:49.876895 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:49 crc kubenswrapper[4733]: I1206 05:44:49.876935 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:49 crc kubenswrapper[4733]: I1206 05:44:49.876945 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:49 crc kubenswrapper[4733]: I1206 05:44:49.876978 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:49 crc kubenswrapper[4733]: I1206 05:44:49.876992 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:49Z","lastTransitionTime":"2025-12-06T05:44:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:49 crc kubenswrapper[4733]: I1206 05:44:49.979098 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:49 crc kubenswrapper[4733]: I1206 05:44:49.979156 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:49 crc kubenswrapper[4733]: I1206 05:44:49.979168 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:49 crc kubenswrapper[4733]: I1206 05:44:49.979178 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:49 crc kubenswrapper[4733]: I1206 05:44:49.979187 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:49Z","lastTransitionTime":"2025-12-06T05:44:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:50 crc kubenswrapper[4733]: I1206 05:44:50.081087 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:50 crc kubenswrapper[4733]: I1206 05:44:50.081123 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:50 crc kubenswrapper[4733]: I1206 05:44:50.081134 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:50 crc kubenswrapper[4733]: I1206 05:44:50.081148 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:50 crc kubenswrapper[4733]: I1206 05:44:50.081160 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:50Z","lastTransitionTime":"2025-12-06T05:44:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:50 crc kubenswrapper[4733]: I1206 05:44:50.183146 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:50 crc kubenswrapper[4733]: I1206 05:44:50.183199 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:50 crc kubenswrapper[4733]: I1206 05:44:50.183212 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:50 crc kubenswrapper[4733]: I1206 05:44:50.183224 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:50 crc kubenswrapper[4733]: I1206 05:44:50.183231 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:50Z","lastTransitionTime":"2025-12-06T05:44:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:50 crc kubenswrapper[4733]: I1206 05:44:50.284985 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:50 crc kubenswrapper[4733]: I1206 05:44:50.285048 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:50 crc kubenswrapper[4733]: I1206 05:44:50.285062 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:50 crc kubenswrapper[4733]: I1206 05:44:50.285074 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:50 crc kubenswrapper[4733]: I1206 05:44:50.285082 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:50Z","lastTransitionTime":"2025-12-06T05:44:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:50 crc kubenswrapper[4733]: I1206 05:44:50.387130 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:50 crc kubenswrapper[4733]: I1206 05:44:50.387173 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:50 crc kubenswrapper[4733]: I1206 05:44:50.387183 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:50 crc kubenswrapper[4733]: I1206 05:44:50.387198 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:50 crc kubenswrapper[4733]: I1206 05:44:50.387208 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:50Z","lastTransitionTime":"2025-12-06T05:44:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:50 crc kubenswrapper[4733]: I1206 05:44:50.484388 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 05:44:50 crc kubenswrapper[4733]: I1206 05:44:50.484446 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8fw28" Dec 06 05:44:50 crc kubenswrapper[4733]: I1206 05:44:50.484446 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:44:50 crc kubenswrapper[4733]: E1206 05:44:50.484554 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 05:44:50 crc kubenswrapper[4733]: I1206 05:44:50.484618 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:44:50 crc kubenswrapper[4733]: E1206 05:44:50.485350 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8fw28" podUID="7e8909c1-5ab7-4c3f-aba1-436c64849e8a" Dec 06 05:44:50 crc kubenswrapper[4733]: E1206 05:44:50.485598 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 05:44:50 crc kubenswrapper[4733]: E1206 05:44:50.485919 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 05:44:50 crc kubenswrapper[4733]: I1206 05:44:50.489764 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:50 crc kubenswrapper[4733]: I1206 05:44:50.489805 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:50 crc kubenswrapper[4733]: I1206 05:44:50.489817 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:50 crc kubenswrapper[4733]: I1206 05:44:50.489833 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:50 crc kubenswrapper[4733]: I1206 05:44:50.489851 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:50Z","lastTransitionTime":"2025-12-06T05:44:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:50 crc kubenswrapper[4733]: I1206 05:44:50.592147 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:50 crc kubenswrapper[4733]: I1206 05:44:50.592187 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:50 crc kubenswrapper[4733]: I1206 05:44:50.592198 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:50 crc kubenswrapper[4733]: I1206 05:44:50.592213 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:50 crc kubenswrapper[4733]: I1206 05:44:50.592222 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:50Z","lastTransitionTime":"2025-12-06T05:44:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:50 crc kubenswrapper[4733]: I1206 05:44:50.694156 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:50 crc kubenswrapper[4733]: I1206 05:44:50.694191 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:50 crc kubenswrapper[4733]: I1206 05:44:50.694205 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:50 crc kubenswrapper[4733]: I1206 05:44:50.694217 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:50 crc kubenswrapper[4733]: I1206 05:44:50.694226 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:50Z","lastTransitionTime":"2025-12-06T05:44:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:50 crc kubenswrapper[4733]: I1206 05:44:50.796738 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:50 crc kubenswrapper[4733]: I1206 05:44:50.796785 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:50 crc kubenswrapper[4733]: I1206 05:44:50.796795 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:50 crc kubenswrapper[4733]: I1206 05:44:50.796815 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:50 crc kubenswrapper[4733]: I1206 05:44:50.796825 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:50Z","lastTransitionTime":"2025-12-06T05:44:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:50 crc kubenswrapper[4733]: I1206 05:44:50.899023 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:50 crc kubenswrapper[4733]: I1206 05:44:50.899075 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:50 crc kubenswrapper[4733]: I1206 05:44:50.899087 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:50 crc kubenswrapper[4733]: I1206 05:44:50.899110 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:50 crc kubenswrapper[4733]: I1206 05:44:50.899124 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:50Z","lastTransitionTime":"2025-12-06T05:44:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:51 crc kubenswrapper[4733]: I1206 05:44:51.000812 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:51 crc kubenswrapper[4733]: I1206 05:44:51.000853 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:51 crc kubenswrapper[4733]: I1206 05:44:51.000863 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:51 crc kubenswrapper[4733]: I1206 05:44:51.000877 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:51 crc kubenswrapper[4733]: I1206 05:44:51.000887 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:51Z","lastTransitionTime":"2025-12-06T05:44:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:51 crc kubenswrapper[4733]: I1206 05:44:51.102956 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:51 crc kubenswrapper[4733]: I1206 05:44:51.102987 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:51 crc kubenswrapper[4733]: I1206 05:44:51.102998 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:51 crc kubenswrapper[4733]: I1206 05:44:51.103011 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:51 crc kubenswrapper[4733]: I1206 05:44:51.103018 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:51Z","lastTransitionTime":"2025-12-06T05:44:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:51 crc kubenswrapper[4733]: I1206 05:44:51.204991 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:51 crc kubenswrapper[4733]: I1206 05:44:51.205035 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:51 crc kubenswrapper[4733]: I1206 05:44:51.205052 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:51 crc kubenswrapper[4733]: I1206 05:44:51.205064 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:51 crc kubenswrapper[4733]: I1206 05:44:51.205074 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:51Z","lastTransitionTime":"2025-12-06T05:44:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:51 crc kubenswrapper[4733]: I1206 05:44:51.307072 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:51 crc kubenswrapper[4733]: I1206 05:44:51.307105 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:51 crc kubenswrapper[4733]: I1206 05:44:51.307115 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:51 crc kubenswrapper[4733]: I1206 05:44:51.307126 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:51 crc kubenswrapper[4733]: I1206 05:44:51.307136 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:51Z","lastTransitionTime":"2025-12-06T05:44:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:51 crc kubenswrapper[4733]: I1206 05:44:51.408658 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:51 crc kubenswrapper[4733]: I1206 05:44:51.408723 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:51 crc kubenswrapper[4733]: I1206 05:44:51.408733 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:51 crc kubenswrapper[4733]: I1206 05:44:51.408748 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:51 crc kubenswrapper[4733]: I1206 05:44:51.408758 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:51Z","lastTransitionTime":"2025-12-06T05:44:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:51 crc kubenswrapper[4733]: I1206 05:44:51.511231 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:51 crc kubenswrapper[4733]: I1206 05:44:51.511258 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:51 crc kubenswrapper[4733]: I1206 05:44:51.511267 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:51 crc kubenswrapper[4733]: I1206 05:44:51.511276 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:51 crc kubenswrapper[4733]: I1206 05:44:51.511284 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:51Z","lastTransitionTime":"2025-12-06T05:44:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:51 crc kubenswrapper[4733]: I1206 05:44:51.613022 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:51 crc kubenswrapper[4733]: I1206 05:44:51.613046 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:51 crc kubenswrapper[4733]: I1206 05:44:51.613055 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:51 crc kubenswrapper[4733]: I1206 05:44:51.613065 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:51 crc kubenswrapper[4733]: I1206 05:44:51.613073 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:51Z","lastTransitionTime":"2025-12-06T05:44:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:51 crc kubenswrapper[4733]: I1206 05:44:51.715101 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:51 crc kubenswrapper[4733]: I1206 05:44:51.715128 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:51 crc kubenswrapper[4733]: I1206 05:44:51.715137 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:51 crc kubenswrapper[4733]: I1206 05:44:51.715146 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:51 crc kubenswrapper[4733]: I1206 05:44:51.715155 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:51Z","lastTransitionTime":"2025-12-06T05:44:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:51 crc kubenswrapper[4733]: I1206 05:44:51.817649 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:51 crc kubenswrapper[4733]: I1206 05:44:51.817691 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:51 crc kubenswrapper[4733]: I1206 05:44:51.817702 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:51 crc kubenswrapper[4733]: I1206 05:44:51.817722 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:51 crc kubenswrapper[4733]: I1206 05:44:51.817733 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:51Z","lastTransitionTime":"2025-12-06T05:44:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:51 crc kubenswrapper[4733]: I1206 05:44:51.920242 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:51 crc kubenswrapper[4733]: I1206 05:44:51.920271 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:51 crc kubenswrapper[4733]: I1206 05:44:51.920282 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:51 crc kubenswrapper[4733]: I1206 05:44:51.920294 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:51 crc kubenswrapper[4733]: I1206 05:44:51.920335 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:51Z","lastTransitionTime":"2025-12-06T05:44:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:52 crc kubenswrapper[4733]: I1206 05:44:52.022654 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:52 crc kubenswrapper[4733]: I1206 05:44:52.022696 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:52 crc kubenswrapper[4733]: I1206 05:44:52.022707 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:52 crc kubenswrapper[4733]: I1206 05:44:52.022720 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:52 crc kubenswrapper[4733]: I1206 05:44:52.022732 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:52Z","lastTransitionTime":"2025-12-06T05:44:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:52 crc kubenswrapper[4733]: I1206 05:44:52.124687 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:52 crc kubenswrapper[4733]: I1206 05:44:52.124761 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:52 crc kubenswrapper[4733]: I1206 05:44:52.124773 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:52 crc kubenswrapper[4733]: I1206 05:44:52.124791 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:52 crc kubenswrapper[4733]: I1206 05:44:52.124808 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:52Z","lastTransitionTime":"2025-12-06T05:44:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:52 crc kubenswrapper[4733]: I1206 05:44:52.226860 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:52 crc kubenswrapper[4733]: I1206 05:44:52.226898 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:52 crc kubenswrapper[4733]: I1206 05:44:52.226910 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:52 crc kubenswrapper[4733]: I1206 05:44:52.226925 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:52 crc kubenswrapper[4733]: I1206 05:44:52.226939 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:52Z","lastTransitionTime":"2025-12-06T05:44:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:52 crc kubenswrapper[4733]: I1206 05:44:52.329105 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:52 crc kubenswrapper[4733]: I1206 05:44:52.329136 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:52 crc kubenswrapper[4733]: I1206 05:44:52.329146 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:52 crc kubenswrapper[4733]: I1206 05:44:52.329156 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:52 crc kubenswrapper[4733]: I1206 05:44:52.329165 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:52Z","lastTransitionTime":"2025-12-06T05:44:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:52 crc kubenswrapper[4733]: I1206 05:44:52.431506 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:52 crc kubenswrapper[4733]: I1206 05:44:52.431563 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:52 crc kubenswrapper[4733]: I1206 05:44:52.431578 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:52 crc kubenswrapper[4733]: I1206 05:44:52.431609 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:52 crc kubenswrapper[4733]: I1206 05:44:52.431620 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:52Z","lastTransitionTime":"2025-12-06T05:44:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:52 crc kubenswrapper[4733]: I1206 05:44:52.484270 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:44:52 crc kubenswrapper[4733]: I1206 05:44:52.484390 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 05:44:52 crc kubenswrapper[4733]: E1206 05:44:52.484424 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 05:44:52 crc kubenswrapper[4733]: I1206 05:44:52.484476 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:44:52 crc kubenswrapper[4733]: E1206 05:44:52.484715 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 05:44:52 crc kubenswrapper[4733]: I1206 05:44:52.484554 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8fw28" Dec 06 05:44:52 crc kubenswrapper[4733]: E1206 05:44:52.484865 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 05:44:52 crc kubenswrapper[4733]: E1206 05:44:52.484905 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8fw28" podUID="7e8909c1-5ab7-4c3f-aba1-436c64849e8a" Dec 06 05:44:52 crc kubenswrapper[4733]: I1206 05:44:52.533206 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:52 crc kubenswrapper[4733]: I1206 05:44:52.533245 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:52 crc kubenswrapper[4733]: I1206 05:44:52.533254 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:52 crc kubenswrapper[4733]: I1206 05:44:52.533270 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:52 crc kubenswrapper[4733]: I1206 05:44:52.533280 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:52Z","lastTransitionTime":"2025-12-06T05:44:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:52 crc kubenswrapper[4733]: I1206 05:44:52.635219 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:52 crc kubenswrapper[4733]: I1206 05:44:52.635246 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:52 crc kubenswrapper[4733]: I1206 05:44:52.635255 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:52 crc kubenswrapper[4733]: I1206 05:44:52.635267 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:52 crc kubenswrapper[4733]: I1206 05:44:52.635279 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:52Z","lastTransitionTime":"2025-12-06T05:44:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:52 crc kubenswrapper[4733]: I1206 05:44:52.737554 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:52 crc kubenswrapper[4733]: I1206 05:44:52.737585 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:52 crc kubenswrapper[4733]: I1206 05:44:52.737596 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:52 crc kubenswrapper[4733]: I1206 05:44:52.737638 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:52 crc kubenswrapper[4733]: I1206 05:44:52.737648 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:52Z","lastTransitionTime":"2025-12-06T05:44:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:52 crc kubenswrapper[4733]: I1206 05:44:52.839892 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:52 crc kubenswrapper[4733]: I1206 05:44:52.839919 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:52 crc kubenswrapper[4733]: I1206 05:44:52.839929 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:52 crc kubenswrapper[4733]: I1206 05:44:52.839942 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:52 crc kubenswrapper[4733]: I1206 05:44:52.839950 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:52Z","lastTransitionTime":"2025-12-06T05:44:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:52 crc kubenswrapper[4733]: I1206 05:44:52.942187 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:52 crc kubenswrapper[4733]: I1206 05:44:52.942226 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:52 crc kubenswrapper[4733]: I1206 05:44:52.942238 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:52 crc kubenswrapper[4733]: I1206 05:44:52.942252 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:52 crc kubenswrapper[4733]: I1206 05:44:52.942262 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:52Z","lastTransitionTime":"2025-12-06T05:44:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:53 crc kubenswrapper[4733]: I1206 05:44:53.044370 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:53 crc kubenswrapper[4733]: I1206 05:44:53.044409 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:53 crc kubenswrapper[4733]: I1206 05:44:53.044419 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:53 crc kubenswrapper[4733]: I1206 05:44:53.044434 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:53 crc kubenswrapper[4733]: I1206 05:44:53.044448 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:53Z","lastTransitionTime":"2025-12-06T05:44:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:53 crc kubenswrapper[4733]: I1206 05:44:53.146894 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:53 crc kubenswrapper[4733]: I1206 05:44:53.146932 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:53 crc kubenswrapper[4733]: I1206 05:44:53.146940 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:53 crc kubenswrapper[4733]: I1206 05:44:53.146953 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:53 crc kubenswrapper[4733]: I1206 05:44:53.146964 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:53Z","lastTransitionTime":"2025-12-06T05:44:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:53 crc kubenswrapper[4733]: I1206 05:44:53.249191 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:53 crc kubenswrapper[4733]: I1206 05:44:53.249224 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:53 crc kubenswrapper[4733]: I1206 05:44:53.249235 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:53 crc kubenswrapper[4733]: I1206 05:44:53.249248 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:53 crc kubenswrapper[4733]: I1206 05:44:53.249260 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:53Z","lastTransitionTime":"2025-12-06T05:44:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:53 crc kubenswrapper[4733]: I1206 05:44:53.351489 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:53 crc kubenswrapper[4733]: I1206 05:44:53.351521 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:53 crc kubenswrapper[4733]: I1206 05:44:53.351531 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:53 crc kubenswrapper[4733]: I1206 05:44:53.351545 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:53 crc kubenswrapper[4733]: I1206 05:44:53.351555 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:53Z","lastTransitionTime":"2025-12-06T05:44:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:53 crc kubenswrapper[4733]: I1206 05:44:53.454049 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:53 crc kubenswrapper[4733]: I1206 05:44:53.454078 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:53 crc kubenswrapper[4733]: I1206 05:44:53.454090 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:53 crc kubenswrapper[4733]: I1206 05:44:53.454106 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:53 crc kubenswrapper[4733]: I1206 05:44:53.454116 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:53Z","lastTransitionTime":"2025-12-06T05:44:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:53 crc kubenswrapper[4733]: I1206 05:44:53.557373 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:53 crc kubenswrapper[4733]: I1206 05:44:53.557405 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:53 crc kubenswrapper[4733]: I1206 05:44:53.557416 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:53 crc kubenswrapper[4733]: I1206 05:44:53.557428 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:53 crc kubenswrapper[4733]: I1206 05:44:53.557437 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:53Z","lastTransitionTime":"2025-12-06T05:44:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:53 crc kubenswrapper[4733]: I1206 05:44:53.659194 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:53 crc kubenswrapper[4733]: I1206 05:44:53.659237 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:53 crc kubenswrapper[4733]: I1206 05:44:53.659251 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:53 crc kubenswrapper[4733]: I1206 05:44:53.659264 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:53 crc kubenswrapper[4733]: I1206 05:44:53.659273 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:53Z","lastTransitionTime":"2025-12-06T05:44:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:53 crc kubenswrapper[4733]: I1206 05:44:53.761707 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:53 crc kubenswrapper[4733]: I1206 05:44:53.761745 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:53 crc kubenswrapper[4733]: I1206 05:44:53.761757 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:53 crc kubenswrapper[4733]: I1206 05:44:53.761776 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:53 crc kubenswrapper[4733]: I1206 05:44:53.761787 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:53Z","lastTransitionTime":"2025-12-06T05:44:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:53 crc kubenswrapper[4733]: I1206 05:44:53.864067 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:53 crc kubenswrapper[4733]: I1206 05:44:53.864094 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:53 crc kubenswrapper[4733]: I1206 05:44:53.864106 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:53 crc kubenswrapper[4733]: I1206 05:44:53.864118 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:53 crc kubenswrapper[4733]: I1206 05:44:53.864127 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:53Z","lastTransitionTime":"2025-12-06T05:44:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:53 crc kubenswrapper[4733]: I1206 05:44:53.966433 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:53 crc kubenswrapper[4733]: I1206 05:44:53.966476 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:53 crc kubenswrapper[4733]: I1206 05:44:53.966487 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:53 crc kubenswrapper[4733]: I1206 05:44:53.966505 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:53 crc kubenswrapper[4733]: I1206 05:44:53.966513 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:53Z","lastTransitionTime":"2025-12-06T05:44:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:54 crc kubenswrapper[4733]: I1206 05:44:54.068615 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:54 crc kubenswrapper[4733]: I1206 05:44:54.068647 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:54 crc kubenswrapper[4733]: I1206 05:44:54.068658 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:54 crc kubenswrapper[4733]: I1206 05:44:54.068678 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:54 crc kubenswrapper[4733]: I1206 05:44:54.068688 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:54Z","lastTransitionTime":"2025-12-06T05:44:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:54 crc kubenswrapper[4733]: I1206 05:44:54.170169 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:54 crc kubenswrapper[4733]: I1206 05:44:54.170217 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:54 crc kubenswrapper[4733]: I1206 05:44:54.170229 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:54 crc kubenswrapper[4733]: I1206 05:44:54.170242 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:54 crc kubenswrapper[4733]: I1206 05:44:54.170252 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:54Z","lastTransitionTime":"2025-12-06T05:44:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:54 crc kubenswrapper[4733]: I1206 05:44:54.272576 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:54 crc kubenswrapper[4733]: I1206 05:44:54.272629 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:54 crc kubenswrapper[4733]: I1206 05:44:54.272639 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:54 crc kubenswrapper[4733]: I1206 05:44:54.272663 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:54 crc kubenswrapper[4733]: I1206 05:44:54.272676 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:54Z","lastTransitionTime":"2025-12-06T05:44:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:54 crc kubenswrapper[4733]: I1206 05:44:54.374602 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:54 crc kubenswrapper[4733]: I1206 05:44:54.374646 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:54 crc kubenswrapper[4733]: I1206 05:44:54.374662 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:54 crc kubenswrapper[4733]: I1206 05:44:54.374674 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:54 crc kubenswrapper[4733]: I1206 05:44:54.374682 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:54Z","lastTransitionTime":"2025-12-06T05:44:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:54 crc kubenswrapper[4733]: I1206 05:44:54.476584 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:54 crc kubenswrapper[4733]: I1206 05:44:54.476631 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:54 crc kubenswrapper[4733]: I1206 05:44:54.476641 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:54 crc kubenswrapper[4733]: I1206 05:44:54.476655 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:54 crc kubenswrapper[4733]: I1206 05:44:54.476669 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:54Z","lastTransitionTime":"2025-12-06T05:44:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:54 crc kubenswrapper[4733]: I1206 05:44:54.484105 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:44:54 crc kubenswrapper[4733]: I1206 05:44:54.484154 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 05:44:54 crc kubenswrapper[4733]: I1206 05:44:54.484164 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8fw28" Dec 06 05:44:54 crc kubenswrapper[4733]: E1206 05:44:54.484266 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 05:44:54 crc kubenswrapper[4733]: I1206 05:44:54.484385 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:44:54 crc kubenswrapper[4733]: E1206 05:44:54.484488 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 05:44:54 crc kubenswrapper[4733]: E1206 05:44:54.484580 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 05:44:54 crc kubenswrapper[4733]: E1206 05:44:54.484727 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8fw28" podUID="7e8909c1-5ab7-4c3f-aba1-436c64849e8a" Dec 06 05:44:54 crc kubenswrapper[4733]: I1206 05:44:54.578207 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:54 crc kubenswrapper[4733]: I1206 05:44:54.578249 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:54 crc kubenswrapper[4733]: I1206 05:44:54.578261 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:54 crc kubenswrapper[4733]: I1206 05:44:54.578273 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:54 crc kubenswrapper[4733]: I1206 05:44:54.578286 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:54Z","lastTransitionTime":"2025-12-06T05:44:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:54 crc kubenswrapper[4733]: I1206 05:44:54.680025 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:54 crc kubenswrapper[4733]: I1206 05:44:54.680051 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:54 crc kubenswrapper[4733]: I1206 05:44:54.680060 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:54 crc kubenswrapper[4733]: I1206 05:44:54.680069 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:54 crc kubenswrapper[4733]: I1206 05:44:54.680080 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:54Z","lastTransitionTime":"2025-12-06T05:44:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:54 crc kubenswrapper[4733]: I1206 05:44:54.783378 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:54 crc kubenswrapper[4733]: I1206 05:44:54.783408 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:54 crc kubenswrapper[4733]: I1206 05:44:54.783418 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:54 crc kubenswrapper[4733]: I1206 05:44:54.783428 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:54 crc kubenswrapper[4733]: I1206 05:44:54.783439 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:54Z","lastTransitionTime":"2025-12-06T05:44:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:54 crc kubenswrapper[4733]: I1206 05:44:54.885908 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:54 crc kubenswrapper[4733]: I1206 05:44:54.885944 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:54 crc kubenswrapper[4733]: I1206 05:44:54.885954 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:54 crc kubenswrapper[4733]: I1206 05:44:54.885966 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:54 crc kubenswrapper[4733]: I1206 05:44:54.885977 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:54Z","lastTransitionTime":"2025-12-06T05:44:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:54 crc kubenswrapper[4733]: I1206 05:44:54.988227 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:54 crc kubenswrapper[4733]: I1206 05:44:54.988263 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:54 crc kubenswrapper[4733]: I1206 05:44:54.988274 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:54 crc kubenswrapper[4733]: I1206 05:44:54.988288 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:54 crc kubenswrapper[4733]: I1206 05:44:54.988323 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:54Z","lastTransitionTime":"2025-12-06T05:44:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:55 crc kubenswrapper[4733]: I1206 05:44:55.090283 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:55 crc kubenswrapper[4733]: I1206 05:44:55.090344 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:55 crc kubenswrapper[4733]: I1206 05:44:55.090354 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:55 crc kubenswrapper[4733]: I1206 05:44:55.090371 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:55 crc kubenswrapper[4733]: I1206 05:44:55.090382 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:55Z","lastTransitionTime":"2025-12-06T05:44:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:55 crc kubenswrapper[4733]: I1206 05:44:55.192290 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:55 crc kubenswrapper[4733]: I1206 05:44:55.192342 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:55 crc kubenswrapper[4733]: I1206 05:44:55.192352 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:55 crc kubenswrapper[4733]: I1206 05:44:55.192365 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:55 crc kubenswrapper[4733]: I1206 05:44:55.192374 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:55Z","lastTransitionTime":"2025-12-06T05:44:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:55 crc kubenswrapper[4733]: I1206 05:44:55.294356 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:55 crc kubenswrapper[4733]: I1206 05:44:55.294380 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:55 crc kubenswrapper[4733]: I1206 05:44:55.294390 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:55 crc kubenswrapper[4733]: I1206 05:44:55.294405 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:55 crc kubenswrapper[4733]: I1206 05:44:55.294416 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:55Z","lastTransitionTime":"2025-12-06T05:44:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:55 crc kubenswrapper[4733]: I1206 05:44:55.396197 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:55 crc kubenswrapper[4733]: I1206 05:44:55.396224 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:55 crc kubenswrapper[4733]: I1206 05:44:55.396233 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:55 crc kubenswrapper[4733]: I1206 05:44:55.396242 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:55 crc kubenswrapper[4733]: I1206 05:44:55.396250 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:55Z","lastTransitionTime":"2025-12-06T05:44:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:55 crc kubenswrapper[4733]: I1206 05:44:55.497525 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:55 crc kubenswrapper[4733]: I1206 05:44:55.497551 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:55 crc kubenswrapper[4733]: I1206 05:44:55.497560 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:55 crc kubenswrapper[4733]: I1206 05:44:55.497572 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:55 crc kubenswrapper[4733]: I1206 05:44:55.497579 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:55Z","lastTransitionTime":"2025-12-06T05:44:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:55 crc kubenswrapper[4733]: I1206 05:44:55.599770 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:55 crc kubenswrapper[4733]: I1206 05:44:55.599805 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:55 crc kubenswrapper[4733]: I1206 05:44:55.599815 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:55 crc kubenswrapper[4733]: I1206 05:44:55.599830 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:55 crc kubenswrapper[4733]: I1206 05:44:55.599840 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:55Z","lastTransitionTime":"2025-12-06T05:44:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:55 crc kubenswrapper[4733]: I1206 05:44:55.701609 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:55 crc kubenswrapper[4733]: I1206 05:44:55.701648 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:55 crc kubenswrapper[4733]: I1206 05:44:55.701659 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:55 crc kubenswrapper[4733]: I1206 05:44:55.701673 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:55 crc kubenswrapper[4733]: I1206 05:44:55.701687 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:55Z","lastTransitionTime":"2025-12-06T05:44:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:55 crc kubenswrapper[4733]: I1206 05:44:55.804024 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:55 crc kubenswrapper[4733]: I1206 05:44:55.804065 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:55 crc kubenswrapper[4733]: I1206 05:44:55.804076 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:55 crc kubenswrapper[4733]: I1206 05:44:55.804094 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:55 crc kubenswrapper[4733]: I1206 05:44:55.804104 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:55Z","lastTransitionTime":"2025-12-06T05:44:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:55 crc kubenswrapper[4733]: I1206 05:44:55.905716 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:55 crc kubenswrapper[4733]: I1206 05:44:55.905759 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:55 crc kubenswrapper[4733]: I1206 05:44:55.905773 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:55 crc kubenswrapper[4733]: I1206 05:44:55.905786 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:55 crc kubenswrapper[4733]: I1206 05:44:55.905795 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:55Z","lastTransitionTime":"2025-12-06T05:44:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.007730 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.007777 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.007788 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.007802 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.007814 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:56Z","lastTransitionTime":"2025-12-06T05:44:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.109609 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.109646 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.109656 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.109671 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.109681 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:56Z","lastTransitionTime":"2025-12-06T05:44:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.211351 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.211402 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.211431 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.211446 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.211456 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:56Z","lastTransitionTime":"2025-12-06T05:44:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.313758 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.313793 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.313804 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.313817 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.313828 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:56Z","lastTransitionTime":"2025-12-06T05:44:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.415597 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.415968 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.416058 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.416157 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.416229 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:56Z","lastTransitionTime":"2025-12-06T05:44:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.484080 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8fw28" Dec 06 05:44:56 crc kubenswrapper[4733]: E1206 05:44:56.484464 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8fw28" podUID="7e8909c1-5ab7-4c3f-aba1-436c64849e8a" Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.484506 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.484561 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.484582 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:44:56 crc kubenswrapper[4733]: E1206 05:44:56.484684 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 05:44:56 crc kubenswrapper[4733]: E1206 05:44:56.484819 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 05:44:56 crc kubenswrapper[4733]: E1206 05:44:56.484878 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.498872 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnxdh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d5c4ca7-33ee-4858-948f-631753eb056e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f4a50e7cb4197e088c193a3bedc8acb2720a885e588e56051fbfa1e102099e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrbr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnxdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:56Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.508784 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fedfcd9f-fa28-4efb-9677-e24a6dae9c04\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7a98fd30a5052ebe2872dd5e1c7f44e9ed9019ad8662a687a9a9a39acce3627\\\",\\\"image\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e214c308f89a818305483c9dc2980b09c41c963bd5df5c91d56a1f8e47dd8ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1a658a854294c1c7b43ab8c1bd56969065a6c630a68b2c39366fd243ebd7af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":t
rue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7acc4267cfa0a489d59bdc4c37f12356e6a053e6cd477a87a38816bf71539ce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7acc4267cfa0a489d59bdc4c37f12356e6a053e6cd477a87a38816bf71539ce1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:56Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.518348 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8e8d7d591deb47598776511be462724fabc5543e82b6a74edfc29fb01ccb977\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:56Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.518645 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.518678 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.518691 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.518706 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.518718 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:56Z","lastTransitionTime":"2025-12-06T05:44:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.528002 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77d63bf154094eece4d04d42186bc7f957f0b1ab0315c496bb8a785269184ed0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdccf2a58baf2a39276908ed60c86219657d8780a50630c20be6f8bc4c256fbb\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:56Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.535913 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faa17b3f3dd91488b73e0e7f3101c5e9932dd0c1573946bbd91819f1ec51202e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T05:44:56Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.543044 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqsfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25abcf60-fe34-446b-9df8-1ed8e5102975\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://163c90ba7e6470fb31049cd650d1384d35d87b94a9193184bfe3ea16feddf307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb5ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqsfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:56Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.550775 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8fw28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e8909c1-5ab7-4c3f-aba1-436c64849e8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47p7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47p7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:44:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8fw28\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:56Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.560356 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0700e329-54b6-4cfe-b2de-5cee58cf1aa5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32c4d87738481c8df3d76e820a98f3dacfbc11edc26fab1dfe51b56d207168d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57cbb938bc4ae9b8a71a1e2369a50a243964fc8c683d2d1840f1f3e199f1b923\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eeebbb46cf11d2306ad457106c3b2179039986bfdd412c4bb64791d86edb4e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://801ea1b9ed221d20f0d729436b8f5f1946df6e66f06aa86db5764f18da3f0b1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe65f4b55b8e8ed93d424276f1fc06f31770302538e5122a5b09da36734d86dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 05:43:48.722254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 05:43:48.730728 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3849141372/tls.crt::/tmp/serving-cert-3849141372/tls.key\\\\\\\"\\\\nI1206 05:43:54.083506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 05:43:54.085960 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 05:43:54.085979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 05:43:54.086001 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 05:43:54.086006 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 05:43:54.089093 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 05:43:54.089162 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089190 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089211 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 05:43:54.089229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 05:43:54.089245 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 05:43:54.089261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 05:43:54.089103 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 05:43:54.090706 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8edc1fd8220a58b6a3f6d08d6d003c6d350fa69588866d84de63f95ecd4367f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:56Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.568789 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:56Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.577521 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:56Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.586099 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-684r5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc59542d-ee4a-414d-b096-86716cb56db5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://238d1b3c645ca54e851f02ddb12c90bfcd039e6973993a7693cc9520d5268496\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7128ab1b2f48b8ce3ecf3a2154cb1b1dc93a58cdfed2c11e7724201a5675ea3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T05:44:41Z\\\",\\\"message\\\":\\\"2025-12-06T05:43:56+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_16cfcfaf-0a8d-404d-bd9a-d650725684e5\\\\n2025-12-06T05:43:56+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_16cfcfaf-0a8d-404d-bd9a-d650725684e5 to /host/opt/cni/bin/\\\\n2025-12-06T05:43:56Z [verbose] multus-daemon started\\\\n2025-12-06T05:43:56Z [verbose] 
Readiness Indicator file check\\\\n2025-12-06T05:44:41Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbfjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-684r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:56Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.593712 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77ebef5bd728c37a6b74ab523c480048959280fdfc9afd8c60b2aca9cd05336d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61a23652af66be599ba9357cb31709e7b4a3f0e4
767c758617e6cc5cd9b43941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g7qjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:56Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.606202 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"171aa174-9338-4421-8393-9e23fbab7f1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a697c5a28f2c415b6f133c1c3bdaff0915418e3fcf0c889af0a822e1bdcbcc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://532faf6ec4021a35746a236a1ded78eccc9d71728c149f73c4263068b6951490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://456b5bd863b30c044246c6c8fe15ee7344ad053861724b5c42b88479578b9adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77216800c2b9bc04724591a5d5c5d4c9ddb9a75fcbc198c60800199a92db6f45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d985f342be7dff38ee8a2264a8dae534857e6cb0e7d0cf79b137d2ed6289bf80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88a99335c4d7fca93428173f7e0e096e418e0599ab030dfda10d8da0a5dc17a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d6353ff5837029f85cdae65e1200483030eeb8cb05c63bd255a459d79a91ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d6353ff5837029f85cdae65e1200483030eeb8cb05c63bd255a459d79a91ef0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T05:44:45Z\\\",\\\"message\\\":\\\"-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.109:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{1f62a432-33b9-495d-83b2-d1dbe6961325}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1206 05:44:45.231733 6787 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-apiserver-operator/metrics]} name:Service_openshift-kube-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.109:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {1f62a432-33b9-495d-83b2-d1dbe6961325}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1206 05:44:45.231791 6787 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin networ\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:44:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2gb79_openshift-ovn-kubernetes(171aa174-9338-4421-8393-9e23fbab7f1e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9980ec9b2b1a751a691d1f657a2176d49a7583906d741adbe3754ec4c73b152c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://667a77abe2226e4438
47d05fc2475438095a30648777a424239d3f35c199b236\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2gb79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:56Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.615996 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c145932d-56db-49da-ab40-1f9faeaa004e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a89503b511d9f2da9fb5e41e1adb5f5c60e14909aebd4495baafc709177fa56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd2bcad3ce23a8998a578ecc373a4e8028eefab1e056cf1081eb2406ff9398f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382d71a067b68d67891c063f0a4c833b7433e15db0e05b36e46f24bbbb1626ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b838411bb65919138a421cd17775561b7764a006894daa8f2bed711287c1914\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:56Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.620788 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.620890 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.620946 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.620998 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.621055 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:56Z","lastTransitionTime":"2025-12-06T05:44:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.624443 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:56Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.634407 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5mf9m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d7ccbf-e88d-4045-8d89-633470de7aca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dffbae27a10ae2e00933637da0e30fc5b8574f2ee8edb5b4b09c37a2d05e980a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2609f7ad60b4f90d844d4f4d8573587826cbdf4c0b76f6b8a1b5cddec86ad7d0\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2609f7ad60b4f90d844d4f4d8573587826cbdf4c0b76f6b8a1b5cddec86ad7d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ef7c618da4d94a4956f082f96b9be994042458ff524e9e1172f526a4135e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93ef7c618da4d94a4956f082f96b9be994042458ff524e9e1172f526a4135e1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c91a8199b1f8ede480f2bd92335fe3c8dc0d0e11caa2cf3bd213c234d0779f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c91a8199b1f8ede480f2bd92335fe3c8dc0d0e11caa2cf3bd213c234d0779f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://047dc
4e7f8f30d1f9cf824ee4059c99c07cd9f29bd985e0e00ac22febb297f1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://047dc4e7f8f30d1f9cf824ee4059c99c07cd9f29bd985e0e00ac22febb297f1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cadff80e27f4e0103110c153c52936b931bfd70ca4363a3caa44ec4f746d01dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadff80e27f4e0103110c153c52936b931bfd70ca4363a3caa44ec4f746d01dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:59Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e4034c91b0b19898468eccdc22e059ad7e830ef9e4ff0bea88d447f6a09c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16e4034c91b0b19898468eccdc22e059ad7e830ef9e4ff0bea88d447f6a09c64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5mf9m\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:56Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.643917 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q2ktk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e24a9e84-0151-4204-9391-510da9049b58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aae69842996fcf4d62a14e1cc73b68f2326287d0fa75d4587acb47862b1d40bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh6vd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a1c3268a5ca5c4c35865c8ff8f700686db8f5c2889152aabe27a36b1ccd9082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh6vd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:44:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q2ktk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-06T05:44:56Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.722793 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.722828 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.722841 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.722857 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.722868 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:56Z","lastTransitionTime":"2025-12-06T05:44:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.824353 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.824385 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.824414 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.824430 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.824441 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:56Z","lastTransitionTime":"2025-12-06T05:44:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.926144 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.926499 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.926508 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.926521 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:56 crc kubenswrapper[4733]: I1206 05:44:56.926532 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:56Z","lastTransitionTime":"2025-12-06T05:44:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:57 crc kubenswrapper[4733]: I1206 05:44:57.028593 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:57 crc kubenswrapper[4733]: I1206 05:44:57.028651 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:57 crc kubenswrapper[4733]: I1206 05:44:57.028663 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:57 crc kubenswrapper[4733]: I1206 05:44:57.028681 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:57 crc kubenswrapper[4733]: I1206 05:44:57.028696 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:57Z","lastTransitionTime":"2025-12-06T05:44:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:57 crc kubenswrapper[4733]: I1206 05:44:57.133968 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:57 crc kubenswrapper[4733]: I1206 05:44:57.134007 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:57 crc kubenswrapper[4733]: I1206 05:44:57.134020 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:57 crc kubenswrapper[4733]: I1206 05:44:57.134037 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:57 crc kubenswrapper[4733]: I1206 05:44:57.134054 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:57Z","lastTransitionTime":"2025-12-06T05:44:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:57 crc kubenswrapper[4733]: I1206 05:44:57.236126 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:57 crc kubenswrapper[4733]: I1206 05:44:57.236174 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:57 crc kubenswrapper[4733]: I1206 05:44:57.236184 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:57 crc kubenswrapper[4733]: I1206 05:44:57.236201 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:57 crc kubenswrapper[4733]: I1206 05:44:57.236213 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:57Z","lastTransitionTime":"2025-12-06T05:44:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:57 crc kubenswrapper[4733]: I1206 05:44:57.338455 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:57 crc kubenswrapper[4733]: I1206 05:44:57.338518 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:57 crc kubenswrapper[4733]: I1206 05:44:57.338529 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:57 crc kubenswrapper[4733]: I1206 05:44:57.338554 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:57 crc kubenswrapper[4733]: I1206 05:44:57.338566 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:57Z","lastTransitionTime":"2025-12-06T05:44:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:57 crc kubenswrapper[4733]: I1206 05:44:57.440090 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:57 crc kubenswrapper[4733]: I1206 05:44:57.440151 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:57 crc kubenswrapper[4733]: I1206 05:44:57.440162 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:57 crc kubenswrapper[4733]: I1206 05:44:57.440181 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:57 crc kubenswrapper[4733]: I1206 05:44:57.440193 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:57Z","lastTransitionTime":"2025-12-06T05:44:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:57 crc kubenswrapper[4733]: I1206 05:44:57.542358 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:57 crc kubenswrapper[4733]: I1206 05:44:57.542393 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:57 crc kubenswrapper[4733]: I1206 05:44:57.542401 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:57 crc kubenswrapper[4733]: I1206 05:44:57.542412 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:57 crc kubenswrapper[4733]: I1206 05:44:57.542424 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:57Z","lastTransitionTime":"2025-12-06T05:44:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:57 crc kubenswrapper[4733]: I1206 05:44:57.644118 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:57 crc kubenswrapper[4733]: I1206 05:44:57.644150 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:57 crc kubenswrapper[4733]: I1206 05:44:57.644162 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:57 crc kubenswrapper[4733]: I1206 05:44:57.644175 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:57 crc kubenswrapper[4733]: I1206 05:44:57.644184 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:57Z","lastTransitionTime":"2025-12-06T05:44:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:57 crc kubenswrapper[4733]: I1206 05:44:57.746019 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:57 crc kubenswrapper[4733]: I1206 05:44:57.746052 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:57 crc kubenswrapper[4733]: I1206 05:44:57.746059 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:57 crc kubenswrapper[4733]: I1206 05:44:57.746068 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:57 crc kubenswrapper[4733]: I1206 05:44:57.746076 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:57Z","lastTransitionTime":"2025-12-06T05:44:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:57 crc kubenswrapper[4733]: I1206 05:44:57.847872 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:57 crc kubenswrapper[4733]: I1206 05:44:57.847902 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:57 crc kubenswrapper[4733]: I1206 05:44:57.847911 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:57 crc kubenswrapper[4733]: I1206 05:44:57.847925 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:57 crc kubenswrapper[4733]: I1206 05:44:57.847935 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:57Z","lastTransitionTime":"2025-12-06T05:44:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:57 crc kubenswrapper[4733]: I1206 05:44:57.949879 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:57 crc kubenswrapper[4733]: I1206 05:44:57.949934 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:57 crc kubenswrapper[4733]: I1206 05:44:57.949943 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:57 crc kubenswrapper[4733]: I1206 05:44:57.949953 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:57 crc kubenswrapper[4733]: I1206 05:44:57.949964 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:57Z","lastTransitionTime":"2025-12-06T05:44:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.051848 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.051874 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.051881 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.051892 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.051903 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:58Z","lastTransitionTime":"2025-12-06T05:44:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.153713 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.153754 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.153764 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.153781 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.153792 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:58Z","lastTransitionTime":"2025-12-06T05:44:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.256527 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.256563 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.256573 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.256589 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.256601 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:58Z","lastTransitionTime":"2025-12-06T05:44:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.333490 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.333628 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:44:58 crc kubenswrapper[4733]: E1206 05:44:58.333677 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 05:46:02.333655043 +0000 UTC m=+146.198866155 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.333716 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:44:58 crc kubenswrapper[4733]: E1206 05:44:58.333739 4733 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 05:44:58 crc kubenswrapper[4733]: E1206 05:44:58.333782 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 05:46:02.333768977 +0000 UTC m=+146.198980088 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 05:44:58 crc kubenswrapper[4733]: E1206 05:44:58.333799 4733 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 05:44:58 crc kubenswrapper[4733]: E1206 05:44:58.333831 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 05:46:02.333824572 +0000 UTC m=+146.199035682 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.358772 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.358801 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.358810 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.358827 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:58 crc 
kubenswrapper[4733]: I1206 05:44:58.358837 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:58Z","lastTransitionTime":"2025-12-06T05:44:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.434810 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.434847 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 05:44:58 crc kubenswrapper[4733]: E1206 05:44:58.434938 4733 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 05:44:58 crc kubenswrapper[4733]: E1206 05:44:58.434953 4733 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 05:44:58 crc kubenswrapper[4733]: E1206 05:44:58.434958 4733 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 05:44:58 crc kubenswrapper[4733]: E1206 05:44:58.434983 4733 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 05:44:58 crc kubenswrapper[4733]: E1206 05:44:58.434993 4733 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 05:44:58 crc kubenswrapper[4733]: E1206 05:44:58.435034 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-06 05:46:02.435020266 +0000 UTC m=+146.300231377 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 05:44:58 crc kubenswrapper[4733]: E1206 05:44:58.434964 4733 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 05:44:58 crc kubenswrapper[4733]: E1206 05:44:58.435072 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-06 05:46:02.435063748 +0000 UTC m=+146.300274859 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.460561 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.460588 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.460638 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.460656 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.460664 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:58Z","lastTransitionTime":"2025-12-06T05:44:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.484016 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.484060 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:44:58 crc kubenswrapper[4733]: E1206 05:44:58.484106 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.484117 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 05:44:58 crc kubenswrapper[4733]: E1206 05:44:58.484191 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.484219 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8fw28" Dec 06 05:44:58 crc kubenswrapper[4733]: E1206 05:44:58.484591 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8fw28" podUID="7e8909c1-5ab7-4c3f-aba1-436c64849e8a" Dec 06 05:44:58 crc kubenswrapper[4733]: E1206 05:44:58.484676 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.485046 4733 scope.go:117] "RemoveContainer" containerID="5d6353ff5837029f85cdae65e1200483030eeb8cb05c63bd255a459d79a91ef0" Dec 06 05:44:58 crc kubenswrapper[4733]: E1206 05:44:58.485345 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-2gb79_openshift-ovn-kubernetes(171aa174-9338-4421-8393-9e23fbab7f1e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" podUID="171aa174-9338-4421-8393-9e23fbab7f1e" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.562610 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.562635 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.562644 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.562659 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.562670 4733 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:58Z","lastTransitionTime":"2025-12-06T05:44:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.664916 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.664975 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.664986 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.664998 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.665006 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:58Z","lastTransitionTime":"2025-12-06T05:44:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.717922 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.717945 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.717953 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.717964 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.717972 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:58Z","lastTransitionTime":"2025-12-06T05:44:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:58 crc kubenswrapper[4733]: E1206 05:44:58.728355 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6951a1f4-5aff-463d-98ee-6da28494341b\\\",\\\"systemUUID\\\":\\\"4b0d62b0-e895-479e-b261-2bd12b349187\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:58Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.734651 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.734723 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.734749 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.734767 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.734786 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:58Z","lastTransitionTime":"2025-12-06T05:44:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:58 crc kubenswrapper[4733]: E1206 05:44:58.744508 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6951a1f4-5aff-463d-98ee-6da28494341b\\\",\\\"systemUUID\\\":\\\"4b0d62b0-e895-479e-b261-2bd12b349187\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:58Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.748340 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.748372 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.748381 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.748394 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.748403 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:58Z","lastTransitionTime":"2025-12-06T05:44:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:58 crc kubenswrapper[4733]: E1206 05:44:58.757412 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6951a1f4-5aff-463d-98ee-6da28494341b\\\",\\\"systemUUID\\\":\\\"4b0d62b0-e895-479e-b261-2bd12b349187\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:58Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.760374 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.760428 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.760441 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.760459 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.760471 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:58Z","lastTransitionTime":"2025-12-06T05:44:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:58 crc kubenswrapper[4733]: E1206 05:44:58.771512 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6951a1f4-5aff-463d-98ee-6da28494341b\\\",\\\"systemUUID\\\":\\\"4b0d62b0-e895-479e-b261-2bd12b349187\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:58Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.774844 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.774881 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.774890 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.774907 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.774918 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:58Z","lastTransitionTime":"2025-12-06T05:44:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:58 crc kubenswrapper[4733]: E1206 05:44:58.783146 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:44:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6951a1f4-5aff-463d-98ee-6da28494341b\\\",\\\"systemUUID\\\":\\\"4b0d62b0-e895-479e-b261-2bd12b349187\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:44:58Z is after 2025-08-24T17:21:41Z" Dec 06 05:44:58 crc kubenswrapper[4733]: E1206 05:44:58.783256 4733 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.784332 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.784360 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.784370 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.784382 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.784389 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:58Z","lastTransitionTime":"2025-12-06T05:44:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.886668 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.886731 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.886743 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.886755 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.886763 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:58Z","lastTransitionTime":"2025-12-06T05:44:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.988935 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.989433 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.989521 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.989591 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:58 crc kubenswrapper[4733]: I1206 05:44:58.989659 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:58Z","lastTransitionTime":"2025-12-06T05:44:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:59 crc kubenswrapper[4733]: I1206 05:44:59.091606 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:59 crc kubenswrapper[4733]: I1206 05:44:59.091643 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:59 crc kubenswrapper[4733]: I1206 05:44:59.091656 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:59 crc kubenswrapper[4733]: I1206 05:44:59.091672 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:59 crc kubenswrapper[4733]: I1206 05:44:59.091680 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:59Z","lastTransitionTime":"2025-12-06T05:44:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:59 crc kubenswrapper[4733]: I1206 05:44:59.193726 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:59 crc kubenswrapper[4733]: I1206 05:44:59.193764 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:59 crc kubenswrapper[4733]: I1206 05:44:59.193773 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:59 crc kubenswrapper[4733]: I1206 05:44:59.193802 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:59 crc kubenswrapper[4733]: I1206 05:44:59.193811 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:59Z","lastTransitionTime":"2025-12-06T05:44:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:59 crc kubenswrapper[4733]: I1206 05:44:59.295754 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:59 crc kubenswrapper[4733]: I1206 05:44:59.295797 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:59 crc kubenswrapper[4733]: I1206 05:44:59.295807 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:59 crc kubenswrapper[4733]: I1206 05:44:59.295822 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:59 crc kubenswrapper[4733]: I1206 05:44:59.295838 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:59Z","lastTransitionTime":"2025-12-06T05:44:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:59 crc kubenswrapper[4733]: I1206 05:44:59.397632 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:59 crc kubenswrapper[4733]: I1206 05:44:59.397668 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:59 crc kubenswrapper[4733]: I1206 05:44:59.397680 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:59 crc kubenswrapper[4733]: I1206 05:44:59.397692 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:59 crc kubenswrapper[4733]: I1206 05:44:59.397702 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:59Z","lastTransitionTime":"2025-12-06T05:44:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:59 crc kubenswrapper[4733]: I1206 05:44:59.499563 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:59 crc kubenswrapper[4733]: I1206 05:44:59.499584 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:59 crc kubenswrapper[4733]: I1206 05:44:59.499594 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:59 crc kubenswrapper[4733]: I1206 05:44:59.499606 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:59 crc kubenswrapper[4733]: I1206 05:44:59.499616 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:59Z","lastTransitionTime":"2025-12-06T05:44:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:59 crc kubenswrapper[4733]: I1206 05:44:59.601762 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:59 crc kubenswrapper[4733]: I1206 05:44:59.601916 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:59 crc kubenswrapper[4733]: I1206 05:44:59.601978 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:59 crc kubenswrapper[4733]: I1206 05:44:59.602049 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:59 crc kubenswrapper[4733]: I1206 05:44:59.602103 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:59Z","lastTransitionTime":"2025-12-06T05:44:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:59 crc kubenswrapper[4733]: I1206 05:44:59.703666 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:59 crc kubenswrapper[4733]: I1206 05:44:59.703718 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:59 crc kubenswrapper[4733]: I1206 05:44:59.703728 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:59 crc kubenswrapper[4733]: I1206 05:44:59.703752 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:59 crc kubenswrapper[4733]: I1206 05:44:59.703766 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:59Z","lastTransitionTime":"2025-12-06T05:44:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:59 crc kubenswrapper[4733]: I1206 05:44:59.806246 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:59 crc kubenswrapper[4733]: I1206 05:44:59.806322 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:59 crc kubenswrapper[4733]: I1206 05:44:59.806333 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:59 crc kubenswrapper[4733]: I1206 05:44:59.806348 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:59 crc kubenswrapper[4733]: I1206 05:44:59.806356 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:59Z","lastTransitionTime":"2025-12-06T05:44:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:44:59 crc kubenswrapper[4733]: I1206 05:44:59.907790 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:44:59 crc kubenswrapper[4733]: I1206 05:44:59.907842 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:44:59 crc kubenswrapper[4733]: I1206 05:44:59.907855 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:44:59 crc kubenswrapper[4733]: I1206 05:44:59.907875 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:44:59 crc kubenswrapper[4733]: I1206 05:44:59.907889 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:44:59Z","lastTransitionTime":"2025-12-06T05:44:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:00 crc kubenswrapper[4733]: I1206 05:45:00.009888 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:00 crc kubenswrapper[4733]: I1206 05:45:00.009925 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:00 crc kubenswrapper[4733]: I1206 05:45:00.009935 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:00 crc kubenswrapper[4733]: I1206 05:45:00.009947 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:00 crc kubenswrapper[4733]: I1206 05:45:00.009960 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:00Z","lastTransitionTime":"2025-12-06T05:45:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:00 crc kubenswrapper[4733]: I1206 05:45:00.112437 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:00 crc kubenswrapper[4733]: I1206 05:45:00.112469 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:00 crc kubenswrapper[4733]: I1206 05:45:00.112482 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:00 crc kubenswrapper[4733]: I1206 05:45:00.112493 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:00 crc kubenswrapper[4733]: I1206 05:45:00.112502 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:00Z","lastTransitionTime":"2025-12-06T05:45:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:00 crc kubenswrapper[4733]: I1206 05:45:00.214272 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:00 crc kubenswrapper[4733]: I1206 05:45:00.214339 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:00 crc kubenswrapper[4733]: I1206 05:45:00.214351 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:00 crc kubenswrapper[4733]: I1206 05:45:00.214364 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:00 crc kubenswrapper[4733]: I1206 05:45:00.214373 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:00Z","lastTransitionTime":"2025-12-06T05:45:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:00 crc kubenswrapper[4733]: I1206 05:45:00.316250 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:00 crc kubenswrapper[4733]: I1206 05:45:00.316293 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:00 crc kubenswrapper[4733]: I1206 05:45:00.316323 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:00 crc kubenswrapper[4733]: I1206 05:45:00.316335 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:00 crc kubenswrapper[4733]: I1206 05:45:00.316345 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:00Z","lastTransitionTime":"2025-12-06T05:45:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:00 crc kubenswrapper[4733]: I1206 05:45:00.418065 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:00 crc kubenswrapper[4733]: I1206 05:45:00.418119 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:00 crc kubenswrapper[4733]: I1206 05:45:00.418129 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:00 crc kubenswrapper[4733]: I1206 05:45:00.418149 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:00 crc kubenswrapper[4733]: I1206 05:45:00.418166 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:00Z","lastTransitionTime":"2025-12-06T05:45:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:45:00 crc kubenswrapper[4733]: I1206 05:45:00.484746 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8fw28" Dec 06 05:45:00 crc kubenswrapper[4733]: I1206 05:45:00.484800 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:45:00 crc kubenswrapper[4733]: E1206 05:45:00.484919 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8fw28" podUID="7e8909c1-5ab7-4c3f-aba1-436c64849e8a" Dec 06 05:45:00 crc kubenswrapper[4733]: I1206 05:45:00.484769 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:45:00 crc kubenswrapper[4733]: E1206 05:45:00.485045 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 05:45:00 crc kubenswrapper[4733]: I1206 05:45:00.485109 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 05:45:00 crc kubenswrapper[4733]: E1206 05:45:00.485152 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 05:45:00 crc kubenswrapper[4733]: E1206 05:45:00.485286 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 05:45:00 crc kubenswrapper[4733]: I1206 05:45:00.495738 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 06 05:45:00 crc kubenswrapper[4733]: I1206 05:45:00.519871 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:00 crc kubenswrapper[4733]: I1206 05:45:00.519910 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:00 crc kubenswrapper[4733]: I1206 05:45:00.519921 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:00 crc kubenswrapper[4733]: I1206 05:45:00.519934 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:00 crc kubenswrapper[4733]: I1206 05:45:00.519946 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:00Z","lastTransitionTime":"2025-12-06T05:45:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:00 crc kubenswrapper[4733]: I1206 05:45:00.624029 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:00 crc kubenswrapper[4733]: I1206 05:45:00.624072 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:00 crc kubenswrapper[4733]: I1206 05:45:00.624087 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:00 crc kubenswrapper[4733]: I1206 05:45:00.624113 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:00 crc kubenswrapper[4733]: I1206 05:45:00.624125 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:00Z","lastTransitionTime":"2025-12-06T05:45:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:00 crc kubenswrapper[4733]: I1206 05:45:00.726405 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:00 crc kubenswrapper[4733]: I1206 05:45:00.726445 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:00 crc kubenswrapper[4733]: I1206 05:45:00.726461 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:00 crc kubenswrapper[4733]: I1206 05:45:00.726479 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:00 crc kubenswrapper[4733]: I1206 05:45:00.726488 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:00Z","lastTransitionTime":"2025-12-06T05:45:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:00 crc kubenswrapper[4733]: I1206 05:45:00.829245 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:00 crc kubenswrapper[4733]: I1206 05:45:00.829296 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:00 crc kubenswrapper[4733]: I1206 05:45:00.829326 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:00 crc kubenswrapper[4733]: I1206 05:45:00.829340 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:00 crc kubenswrapper[4733]: I1206 05:45:00.829351 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:00Z","lastTransitionTime":"2025-12-06T05:45:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:00 crc kubenswrapper[4733]: I1206 05:45:00.931671 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:00 crc kubenswrapper[4733]: I1206 05:45:00.931730 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:00 crc kubenswrapper[4733]: I1206 05:45:00.931742 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:00 crc kubenswrapper[4733]: I1206 05:45:00.931761 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:00 crc kubenswrapper[4733]: I1206 05:45:00.931774 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:00Z","lastTransitionTime":"2025-12-06T05:45:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:01 crc kubenswrapper[4733]: I1206 05:45:01.034402 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:01 crc kubenswrapper[4733]: I1206 05:45:01.034474 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:01 crc kubenswrapper[4733]: I1206 05:45:01.034486 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:01 crc kubenswrapper[4733]: I1206 05:45:01.034504 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:01 crc kubenswrapper[4733]: I1206 05:45:01.034514 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:01Z","lastTransitionTime":"2025-12-06T05:45:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:01 crc kubenswrapper[4733]: I1206 05:45:01.136518 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:01 crc kubenswrapper[4733]: I1206 05:45:01.136548 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:01 crc kubenswrapper[4733]: I1206 05:45:01.136560 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:01 crc kubenswrapper[4733]: I1206 05:45:01.136575 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:01 crc kubenswrapper[4733]: I1206 05:45:01.136584 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:01Z","lastTransitionTime":"2025-12-06T05:45:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:01 crc kubenswrapper[4733]: I1206 05:45:01.238973 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:01 crc kubenswrapper[4733]: I1206 05:45:01.239018 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:01 crc kubenswrapper[4733]: I1206 05:45:01.239032 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:01 crc kubenswrapper[4733]: I1206 05:45:01.239049 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:01 crc kubenswrapper[4733]: I1206 05:45:01.239062 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:01Z","lastTransitionTime":"2025-12-06T05:45:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:01 crc kubenswrapper[4733]: I1206 05:45:01.341195 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:01 crc kubenswrapper[4733]: I1206 05:45:01.341299 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:01 crc kubenswrapper[4733]: I1206 05:45:01.341383 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:01 crc kubenswrapper[4733]: I1206 05:45:01.341440 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:01 crc kubenswrapper[4733]: I1206 05:45:01.341525 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:01Z","lastTransitionTime":"2025-12-06T05:45:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:01 crc kubenswrapper[4733]: I1206 05:45:01.443648 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:01 crc kubenswrapper[4733]: I1206 05:45:01.443693 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:01 crc kubenswrapper[4733]: I1206 05:45:01.443704 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:01 crc kubenswrapper[4733]: I1206 05:45:01.443717 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:01 crc kubenswrapper[4733]: I1206 05:45:01.443727 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:01Z","lastTransitionTime":"2025-12-06T05:45:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:01 crc kubenswrapper[4733]: I1206 05:45:01.545220 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:01 crc kubenswrapper[4733]: I1206 05:45:01.545256 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:01 crc kubenswrapper[4733]: I1206 05:45:01.545265 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:01 crc kubenswrapper[4733]: I1206 05:45:01.545283 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:01 crc kubenswrapper[4733]: I1206 05:45:01.545293 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:01Z","lastTransitionTime":"2025-12-06T05:45:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:01 crc kubenswrapper[4733]: I1206 05:45:01.647197 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:01 crc kubenswrapper[4733]: I1206 05:45:01.647229 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:01 crc kubenswrapper[4733]: I1206 05:45:01.647238 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:01 crc kubenswrapper[4733]: I1206 05:45:01.647251 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:01 crc kubenswrapper[4733]: I1206 05:45:01.647260 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:01Z","lastTransitionTime":"2025-12-06T05:45:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:01 crc kubenswrapper[4733]: I1206 05:45:01.749251 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:01 crc kubenswrapper[4733]: I1206 05:45:01.749335 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:01 crc kubenswrapper[4733]: I1206 05:45:01.749347 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:01 crc kubenswrapper[4733]: I1206 05:45:01.749361 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:01 crc kubenswrapper[4733]: I1206 05:45:01.749372 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:01Z","lastTransitionTime":"2025-12-06T05:45:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:01 crc kubenswrapper[4733]: I1206 05:45:01.851935 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:01 crc kubenswrapper[4733]: I1206 05:45:01.851980 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:01 crc kubenswrapper[4733]: I1206 05:45:01.851992 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:01 crc kubenswrapper[4733]: I1206 05:45:01.852008 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:01 crc kubenswrapper[4733]: I1206 05:45:01.852019 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:01Z","lastTransitionTime":"2025-12-06T05:45:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:01 crc kubenswrapper[4733]: I1206 05:45:01.953624 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:01 crc kubenswrapper[4733]: I1206 05:45:01.953693 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:01 crc kubenswrapper[4733]: I1206 05:45:01.953724 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:01 crc kubenswrapper[4733]: I1206 05:45:01.953750 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:01 crc kubenswrapper[4733]: I1206 05:45:01.953768 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:01Z","lastTransitionTime":"2025-12-06T05:45:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:02 crc kubenswrapper[4733]: I1206 05:45:02.055573 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:02 crc kubenswrapper[4733]: I1206 05:45:02.055616 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:02 crc kubenswrapper[4733]: I1206 05:45:02.055637 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:02 crc kubenswrapper[4733]: I1206 05:45:02.055655 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:02 crc kubenswrapper[4733]: I1206 05:45:02.055670 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:02Z","lastTransitionTime":"2025-12-06T05:45:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:02 crc kubenswrapper[4733]: I1206 05:45:02.157444 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:02 crc kubenswrapper[4733]: I1206 05:45:02.157482 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:02 crc kubenswrapper[4733]: I1206 05:45:02.157491 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:02 crc kubenswrapper[4733]: I1206 05:45:02.157504 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:02 crc kubenswrapper[4733]: I1206 05:45:02.157515 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:02Z","lastTransitionTime":"2025-12-06T05:45:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:02 crc kubenswrapper[4733]: I1206 05:45:02.259650 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:02 crc kubenswrapper[4733]: I1206 05:45:02.259701 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:02 crc kubenswrapper[4733]: I1206 05:45:02.259718 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:02 crc kubenswrapper[4733]: I1206 05:45:02.259741 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:02 crc kubenswrapper[4733]: I1206 05:45:02.259755 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:02Z","lastTransitionTime":"2025-12-06T05:45:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:02 crc kubenswrapper[4733]: I1206 05:45:02.361742 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:02 crc kubenswrapper[4733]: I1206 05:45:02.361797 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:02 crc kubenswrapper[4733]: I1206 05:45:02.361810 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:02 crc kubenswrapper[4733]: I1206 05:45:02.361830 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:02 crc kubenswrapper[4733]: I1206 05:45:02.361839 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:02Z","lastTransitionTime":"2025-12-06T05:45:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:02 crc kubenswrapper[4733]: I1206 05:45:02.463692 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:02 crc kubenswrapper[4733]: I1206 05:45:02.463755 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:02 crc kubenswrapper[4733]: I1206 05:45:02.463765 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:02 crc kubenswrapper[4733]: I1206 05:45:02.463796 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:02 crc kubenswrapper[4733]: I1206 05:45:02.463804 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:02Z","lastTransitionTime":"2025-12-06T05:45:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:45:02 crc kubenswrapper[4733]: I1206 05:45:02.484518 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 05:45:02 crc kubenswrapper[4733]: I1206 05:45:02.484536 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8fw28" Dec 06 05:45:02 crc kubenswrapper[4733]: E1206 05:45:02.484631 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 05:45:02 crc kubenswrapper[4733]: I1206 05:45:02.484645 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:45:02 crc kubenswrapper[4733]: I1206 05:45:02.484675 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:45:02 crc kubenswrapper[4733]: E1206 05:45:02.484741 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 05:45:02 crc kubenswrapper[4733]: E1206 05:45:02.484799 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 05:45:02 crc kubenswrapper[4733]: E1206 05:45:02.484860 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8fw28" podUID="7e8909c1-5ab7-4c3f-aba1-436c64849e8a" Dec 06 05:45:02 crc kubenswrapper[4733]: I1206 05:45:02.565698 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:02 crc kubenswrapper[4733]: I1206 05:45:02.565722 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:02 crc kubenswrapper[4733]: I1206 05:45:02.565731 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:02 crc kubenswrapper[4733]: I1206 05:45:02.565750 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:02 crc kubenswrapper[4733]: I1206 05:45:02.565760 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:02Z","lastTransitionTime":"2025-12-06T05:45:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:02 crc kubenswrapper[4733]: I1206 05:45:02.668050 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:02 crc kubenswrapper[4733]: I1206 05:45:02.668073 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:02 crc kubenswrapper[4733]: I1206 05:45:02.668083 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:02 crc kubenswrapper[4733]: I1206 05:45:02.668093 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:02 crc kubenswrapper[4733]: I1206 05:45:02.668101 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:02Z","lastTransitionTime":"2025-12-06T05:45:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:02 crc kubenswrapper[4733]: I1206 05:45:02.770331 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:02 crc kubenswrapper[4733]: I1206 05:45:02.770358 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:02 crc kubenswrapper[4733]: I1206 05:45:02.770368 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:02 crc kubenswrapper[4733]: I1206 05:45:02.770379 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:02 crc kubenswrapper[4733]: I1206 05:45:02.770387 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:02Z","lastTransitionTime":"2025-12-06T05:45:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:02 crc kubenswrapper[4733]: I1206 05:45:02.871938 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:02 crc kubenswrapper[4733]: I1206 05:45:02.871985 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:02 crc kubenswrapper[4733]: I1206 05:45:02.871995 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:02 crc kubenswrapper[4733]: I1206 05:45:02.872011 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:02 crc kubenswrapper[4733]: I1206 05:45:02.872022 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:02Z","lastTransitionTime":"2025-12-06T05:45:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:02 crc kubenswrapper[4733]: I1206 05:45:02.973591 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:02 crc kubenswrapper[4733]: I1206 05:45:02.973642 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:02 crc kubenswrapper[4733]: I1206 05:45:02.973653 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:02 crc kubenswrapper[4733]: I1206 05:45:02.973665 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:02 crc kubenswrapper[4733]: I1206 05:45:02.973673 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:02Z","lastTransitionTime":"2025-12-06T05:45:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:03 crc kubenswrapper[4733]: I1206 05:45:03.075569 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:03 crc kubenswrapper[4733]: I1206 05:45:03.075617 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:03 crc kubenswrapper[4733]: I1206 05:45:03.075628 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:03 crc kubenswrapper[4733]: I1206 05:45:03.075644 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:03 crc kubenswrapper[4733]: I1206 05:45:03.075656 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:03Z","lastTransitionTime":"2025-12-06T05:45:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:03 crc kubenswrapper[4733]: I1206 05:45:03.177793 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:03 crc kubenswrapper[4733]: I1206 05:45:03.177826 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:03 crc kubenswrapper[4733]: I1206 05:45:03.177837 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:03 crc kubenswrapper[4733]: I1206 05:45:03.177849 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:03 crc kubenswrapper[4733]: I1206 05:45:03.177860 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:03Z","lastTransitionTime":"2025-12-06T05:45:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:03 crc kubenswrapper[4733]: I1206 05:45:03.279514 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:03 crc kubenswrapper[4733]: I1206 05:45:03.279546 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:03 crc kubenswrapper[4733]: I1206 05:45:03.279556 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:03 crc kubenswrapper[4733]: I1206 05:45:03.279568 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:03 crc kubenswrapper[4733]: I1206 05:45:03.279575 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:03Z","lastTransitionTime":"2025-12-06T05:45:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:03 crc kubenswrapper[4733]: I1206 05:45:03.381560 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:03 crc kubenswrapper[4733]: I1206 05:45:03.381606 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:03 crc kubenswrapper[4733]: I1206 05:45:03.381619 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:03 crc kubenswrapper[4733]: I1206 05:45:03.381634 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:03 crc kubenswrapper[4733]: I1206 05:45:03.381646 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:03Z","lastTransitionTime":"2025-12-06T05:45:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:03 crc kubenswrapper[4733]: I1206 05:45:03.483155 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:03 crc kubenswrapper[4733]: I1206 05:45:03.483183 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:03 crc kubenswrapper[4733]: I1206 05:45:03.483192 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:03 crc kubenswrapper[4733]: I1206 05:45:03.483202 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:03 crc kubenswrapper[4733]: I1206 05:45:03.483210 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:03Z","lastTransitionTime":"2025-12-06T05:45:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:03 crc kubenswrapper[4733]: I1206 05:45:03.585191 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:03 crc kubenswrapper[4733]: I1206 05:45:03.585227 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:03 crc kubenswrapper[4733]: I1206 05:45:03.585238 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:03 crc kubenswrapper[4733]: I1206 05:45:03.585253 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:03 crc kubenswrapper[4733]: I1206 05:45:03.585271 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:03Z","lastTransitionTime":"2025-12-06T05:45:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:03 crc kubenswrapper[4733]: I1206 05:45:03.687194 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:03 crc kubenswrapper[4733]: I1206 05:45:03.687254 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:03 crc kubenswrapper[4733]: I1206 05:45:03.687273 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:03 crc kubenswrapper[4733]: I1206 05:45:03.687289 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:03 crc kubenswrapper[4733]: I1206 05:45:03.687298 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:03Z","lastTransitionTime":"2025-12-06T05:45:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:03 crc kubenswrapper[4733]: I1206 05:45:03.789102 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:03 crc kubenswrapper[4733]: I1206 05:45:03.789139 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:03 crc kubenswrapper[4733]: I1206 05:45:03.789150 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:03 crc kubenswrapper[4733]: I1206 05:45:03.789164 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:03 crc kubenswrapper[4733]: I1206 05:45:03.789175 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:03Z","lastTransitionTime":"2025-12-06T05:45:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:03 crc kubenswrapper[4733]: I1206 05:45:03.891195 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:03 crc kubenswrapper[4733]: I1206 05:45:03.891221 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:03 crc kubenswrapper[4733]: I1206 05:45:03.891229 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:03 crc kubenswrapper[4733]: I1206 05:45:03.891240 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:03 crc kubenswrapper[4733]: I1206 05:45:03.891249 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:03Z","lastTransitionTime":"2025-12-06T05:45:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:03 crc kubenswrapper[4733]: I1206 05:45:03.993530 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:03 crc kubenswrapper[4733]: I1206 05:45:03.993566 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:03 crc kubenswrapper[4733]: I1206 05:45:03.993574 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:03 crc kubenswrapper[4733]: I1206 05:45:03.993590 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:03 crc kubenswrapper[4733]: I1206 05:45:03.993600 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:03Z","lastTransitionTime":"2025-12-06T05:45:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:04 crc kubenswrapper[4733]: I1206 05:45:04.095423 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:04 crc kubenswrapper[4733]: I1206 05:45:04.095445 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:04 crc kubenswrapper[4733]: I1206 05:45:04.095452 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:04 crc kubenswrapper[4733]: I1206 05:45:04.095461 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:04 crc kubenswrapper[4733]: I1206 05:45:04.095471 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:04Z","lastTransitionTime":"2025-12-06T05:45:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:04 crc kubenswrapper[4733]: I1206 05:45:04.197266 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:04 crc kubenswrapper[4733]: I1206 05:45:04.197290 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:04 crc kubenswrapper[4733]: I1206 05:45:04.197299 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:04 crc kubenswrapper[4733]: I1206 05:45:04.197325 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:04 crc kubenswrapper[4733]: I1206 05:45:04.197334 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:04Z","lastTransitionTime":"2025-12-06T05:45:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:04 crc kubenswrapper[4733]: I1206 05:45:04.299561 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:04 crc kubenswrapper[4733]: I1206 05:45:04.299601 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:04 crc kubenswrapper[4733]: I1206 05:45:04.299610 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:04 crc kubenswrapper[4733]: I1206 05:45:04.299634 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:04 crc kubenswrapper[4733]: I1206 05:45:04.299644 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:04Z","lastTransitionTime":"2025-12-06T05:45:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:04 crc kubenswrapper[4733]: I1206 05:45:04.401341 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:04 crc kubenswrapper[4733]: I1206 05:45:04.401414 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:04 crc kubenswrapper[4733]: I1206 05:45:04.401426 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:04 crc kubenswrapper[4733]: I1206 05:45:04.401438 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:04 crc kubenswrapper[4733]: I1206 05:45:04.401447 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:04Z","lastTransitionTime":"2025-12-06T05:45:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:45:04 crc kubenswrapper[4733]: I1206 05:45:04.484287 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:45:04 crc kubenswrapper[4733]: I1206 05:45:04.484351 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:45:04 crc kubenswrapper[4733]: E1206 05:45:04.484421 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 05:45:04 crc kubenswrapper[4733]: I1206 05:45:04.484299 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 05:45:04 crc kubenswrapper[4733]: I1206 05:45:04.484501 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8fw28" Dec 06 05:45:04 crc kubenswrapper[4733]: E1206 05:45:04.484538 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 05:45:04 crc kubenswrapper[4733]: E1206 05:45:04.484579 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 05:45:04 crc kubenswrapper[4733]: E1206 05:45:04.484666 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8fw28" podUID="7e8909c1-5ab7-4c3f-aba1-436c64849e8a" Dec 06 05:45:04 crc kubenswrapper[4733]: I1206 05:45:04.503448 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:04 crc kubenswrapper[4733]: I1206 05:45:04.503472 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:04 crc kubenswrapper[4733]: I1206 05:45:04.503482 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:04 crc kubenswrapper[4733]: I1206 05:45:04.503492 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:04 crc kubenswrapper[4733]: I1206 05:45:04.503502 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:04Z","lastTransitionTime":"2025-12-06T05:45:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:04 crc kubenswrapper[4733]: I1206 05:45:04.605452 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:04 crc kubenswrapper[4733]: I1206 05:45:04.605482 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:04 crc kubenswrapper[4733]: I1206 05:45:04.605492 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:04 crc kubenswrapper[4733]: I1206 05:45:04.605502 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:04 crc kubenswrapper[4733]: I1206 05:45:04.605511 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:04Z","lastTransitionTime":"2025-12-06T05:45:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:04 crc kubenswrapper[4733]: I1206 05:45:04.707199 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:04 crc kubenswrapper[4733]: I1206 05:45:04.707267 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:04 crc kubenswrapper[4733]: I1206 05:45:04.707280 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:04 crc kubenswrapper[4733]: I1206 05:45:04.707294 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:04 crc kubenswrapper[4733]: I1206 05:45:04.707322 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:04Z","lastTransitionTime":"2025-12-06T05:45:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:04 crc kubenswrapper[4733]: I1206 05:45:04.809278 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:04 crc kubenswrapper[4733]: I1206 05:45:04.809337 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:04 crc kubenswrapper[4733]: I1206 05:45:04.809349 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:04 crc kubenswrapper[4733]: I1206 05:45:04.809363 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:04 crc kubenswrapper[4733]: I1206 05:45:04.809375 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:04Z","lastTransitionTime":"2025-12-06T05:45:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:04 crc kubenswrapper[4733]: I1206 05:45:04.911244 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:04 crc kubenswrapper[4733]: I1206 05:45:04.911286 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:04 crc kubenswrapper[4733]: I1206 05:45:04.911296 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:04 crc kubenswrapper[4733]: I1206 05:45:04.911336 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:04 crc kubenswrapper[4733]: I1206 05:45:04.911346 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:04Z","lastTransitionTime":"2025-12-06T05:45:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:05 crc kubenswrapper[4733]: I1206 05:45:05.013460 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:05 crc kubenswrapper[4733]: I1206 05:45:05.013490 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:05 crc kubenswrapper[4733]: I1206 05:45:05.013502 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:05 crc kubenswrapper[4733]: I1206 05:45:05.013523 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:05 crc kubenswrapper[4733]: I1206 05:45:05.013533 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:05Z","lastTransitionTime":"2025-12-06T05:45:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:05 crc kubenswrapper[4733]: I1206 05:45:05.115132 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:05 crc kubenswrapper[4733]: I1206 05:45:05.115167 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:05 crc kubenswrapper[4733]: I1206 05:45:05.115176 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:05 crc kubenswrapper[4733]: I1206 05:45:05.115190 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:05 crc kubenswrapper[4733]: I1206 05:45:05.115200 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:05Z","lastTransitionTime":"2025-12-06T05:45:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:05 crc kubenswrapper[4733]: I1206 05:45:05.216496 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:05 crc kubenswrapper[4733]: I1206 05:45:05.216523 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:05 crc kubenswrapper[4733]: I1206 05:45:05.216532 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:05 crc kubenswrapper[4733]: I1206 05:45:05.216542 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:05 crc kubenswrapper[4733]: I1206 05:45:05.216549 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:05Z","lastTransitionTime":"2025-12-06T05:45:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:05 crc kubenswrapper[4733]: I1206 05:45:05.318409 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:05 crc kubenswrapper[4733]: I1206 05:45:05.318443 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:05 crc kubenswrapper[4733]: I1206 05:45:05.318454 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:05 crc kubenswrapper[4733]: I1206 05:45:05.318466 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:05 crc kubenswrapper[4733]: I1206 05:45:05.318475 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:05Z","lastTransitionTime":"2025-12-06T05:45:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:05 crc kubenswrapper[4733]: I1206 05:45:05.420179 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:05 crc kubenswrapper[4733]: I1206 05:45:05.420214 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:05 crc kubenswrapper[4733]: I1206 05:45:05.420226 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:05 crc kubenswrapper[4733]: I1206 05:45:05.420243 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:05 crc kubenswrapper[4733]: I1206 05:45:05.420270 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:05Z","lastTransitionTime":"2025-12-06T05:45:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:05 crc kubenswrapper[4733]: I1206 05:45:05.521573 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:05 crc kubenswrapper[4733]: I1206 05:45:05.521597 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:05 crc kubenswrapper[4733]: I1206 05:45:05.521605 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:05 crc kubenswrapper[4733]: I1206 05:45:05.521616 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:05 crc kubenswrapper[4733]: I1206 05:45:05.521626 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:05Z","lastTransitionTime":"2025-12-06T05:45:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:05 crc kubenswrapper[4733]: I1206 05:45:05.623523 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:05 crc kubenswrapper[4733]: I1206 05:45:05.623557 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:05 crc kubenswrapper[4733]: I1206 05:45:05.623565 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:05 crc kubenswrapper[4733]: I1206 05:45:05.623576 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:05 crc kubenswrapper[4733]: I1206 05:45:05.623584 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:05Z","lastTransitionTime":"2025-12-06T05:45:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:05 crc kubenswrapper[4733]: I1206 05:45:05.725648 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:05 crc kubenswrapper[4733]: I1206 05:45:05.725684 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:05 crc kubenswrapper[4733]: I1206 05:45:05.725694 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:05 crc kubenswrapper[4733]: I1206 05:45:05.725704 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:05 crc kubenswrapper[4733]: I1206 05:45:05.725711 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:05Z","lastTransitionTime":"2025-12-06T05:45:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:05 crc kubenswrapper[4733]: I1206 05:45:05.827451 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:05 crc kubenswrapper[4733]: I1206 05:45:05.827484 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:05 crc kubenswrapper[4733]: I1206 05:45:05.827493 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:05 crc kubenswrapper[4733]: I1206 05:45:05.827503 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:05 crc kubenswrapper[4733]: I1206 05:45:05.827512 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:05Z","lastTransitionTime":"2025-12-06T05:45:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:05 crc kubenswrapper[4733]: I1206 05:45:05.929447 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:05 crc kubenswrapper[4733]: I1206 05:45:05.929493 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:05 crc kubenswrapper[4733]: I1206 05:45:05.929504 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:05 crc kubenswrapper[4733]: I1206 05:45:05.929515 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:05 crc kubenswrapper[4733]: I1206 05:45:05.929524 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:05Z","lastTransitionTime":"2025-12-06T05:45:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.031406 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.031437 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.031447 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.031460 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.031469 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:06Z","lastTransitionTime":"2025-12-06T05:45:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.133664 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.133716 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.133727 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.133739 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.133748 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:06Z","lastTransitionTime":"2025-12-06T05:45:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.235710 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.235745 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.235753 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.235766 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.235776 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:06Z","lastTransitionTime":"2025-12-06T05:45:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.337208 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.337231 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.337240 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.337263 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.337270 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:06Z","lastTransitionTime":"2025-12-06T05:45:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.439231 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.439265 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.439274 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.439287 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.439295 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:06Z","lastTransitionTime":"2025-12-06T05:45:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.483989 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.484025 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 05:45:06 crc kubenswrapper[4733]: E1206 05:45:06.484112 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.484230 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:45:06 crc kubenswrapper[4733]: E1206 05:45:06.484342 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 05:45:06 crc kubenswrapper[4733]: E1206 05:45:06.484465 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.484778 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8fw28" Dec 06 05:45:06 crc kubenswrapper[4733]: E1206 05:45:06.485127 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8fw28" podUID="7e8909c1-5ab7-4c3f-aba1-436c64849e8a" Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.497794 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:45:06Z is after 2025-08-24T17:21:41Z" Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.509350 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5mf9m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d7ccbf-e88d-4045-8d89-633470de7aca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dffbae27a10ae2e00933637da0e30fc5b8574f2ee8edb5b4b09c37a2d05e980a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2609f7ad60b4f90d844d4f4d8573587826cbdf4c0b76f6b8a1b5cddec86ad7d0\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2609f7ad60b4f90d844d4f4d8573587826cbdf4c0b76f6b8a1b5cddec86ad7d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ef7c618da4d94a4956f082f96b9be994042458ff524e9e1172f526a4135e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93ef7c618da4d94a4956f082f96b9be994042458ff524e9e1172f526a4135e1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c91a8199b1f8ede480f2bd92335fe3c8dc0d0e11caa2cf3bd213c234d0779f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c91a8199b1f8ede480f2bd92335fe3c8dc0d0e11caa2cf3bd213c234d0779f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://047dc
4e7f8f30d1f9cf824ee4059c99c07cd9f29bd985e0e00ac22febb297f1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://047dc4e7f8f30d1f9cf824ee4059c99c07cd9f29bd985e0e00ac22febb297f1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cadff80e27f4e0103110c153c52936b931bfd70ca4363a3caa44ec4f746d01dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadff80e27f4e0103110c153c52936b931bfd70ca4363a3caa44ec4f746d01dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:59Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e4034c91b0b19898468eccdc22e059ad7e830ef9e4ff0bea88d447f6a09c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16e4034c91b0b19898468eccdc22e059ad7e830ef9e4ff0bea88d447f6a09c64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5mf9m\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:45:06Z is after 2025-08-24T17:21:41Z" Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.519172 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q2ktk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e24a9e84-0151-4204-9391-510da9049b58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aae69842996fcf4d62a14e1cc73b68f2326287d0fa75d4587acb47862b1d40bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh6vd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a1c3268a5ca5c4c35865c8ff8f700686db8f5c2889152aabe27a36b1ccd9082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sh6vd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:44:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q2ktk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-06T05:45:06Z is after 2025-08-24T17:21:41Z" Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.528728 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c145932d-56db-49da-ab40-1f9faeaa004e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a89503b511d9f2da9fb5e41e1adb5f5c60e14909aebd4495baafc709177fa56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\
"cri-o://6cd2bcad3ce23a8998a578ecc373a4e8028eefab1e056cf1081eb2406ff9398f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382d71a067b68d67891c063f0a4c833b7433e15db0e05b36e46f24bbbb1626ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b838411bb65919138a421cd17775561b7764a006894daa8f2bed711287c1914\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager
-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:45:06Z is after 2025-08-24T17:21:41Z" Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.537101 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fedfcd9f-fa28-4efb-9677-e24a6dae9c04\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7a98fd30a5052ebe2872dd5e1c7f44e9ed9019ad8662a687a9a9a39acce3627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e214c308f89a818305483c9dc2980b09c41c963bd5df5c91d56a1f8e47dd8ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1a658a854294c1c7b43ab8c1bd56969065a6c630a68b2c39366fd243ebd7af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7acc4267cfa0a489d59bdc4c37f12356e6a053e6cd477a87a38816bf71539ce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://7acc4267cfa0a489d59bdc4c37f12356e6a053e6cd477a87a38816bf71539ce1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:45:06Z is after 2025-08-24T17:21:41Z" Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.541569 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.541623 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.541633 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.541646 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.541656 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:06Z","lastTransitionTime":"2025-12-06T05:45:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.548676 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8e8d7d591deb47598776511be462724fabc5543e82b6a74edfc29fb01ccb977\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:45:06Z is after 2025-08-24T17:21:41Z" Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.557968 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77d63bf154094eece4d04d42186bc7f957f0b1ab0315c496bb8a785269184ed0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdccf2a58baf2a39276908ed60c86219657d8780a50630c20be6f8bc4c256fbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:45:06Z is after 2025-08-24T17:21:41Z" Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.567555 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faa17b3f3dd91488b73e0e7f3101c5e9932dd0c1573946bbd91819f1ec51202e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T05:45:06Z is after 2025-08-24T17:21:41Z" Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.574343 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqsfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25abcf60-fe34-446b-9df8-1ed8e5102975\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://163c90ba7e6470fb31049cd650d1384d35d87b94a9193184bfe3ea16feddf307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gb5ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqsfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:45:06Z is after 2025-08-24T17:21:41Z" Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.581385 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnxdh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d5c4ca7-33ee-4858-948f-631753eb056e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f4a50e7cb4197e088c193a3bedc8acb2720a885e588e56051fbfa1e102099e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrbr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnxdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:45:06Z is after 2025-08-24T17:21:41Z" Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.589991 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"783a23b4-976c-4a47-9551-f5f6e8e28c4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d78f49702fa45a467671f74e635a61e2d56cd857c2844f685e12bb1e00e70a97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac
-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d06b0816bbcb1c63a0598cd8cc1175582cb8072f620edac05a2f115fd7f75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef5d06b0816bbcb1c63a0598cd8cc1175582cb8072f620edac05a2f115fd7f75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:45:06Z is after 2025-08-24T17:21:41Z" Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.597399 4733 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8fw28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e8909c1-5ab7-4c3f-aba1-436c64849e8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47p7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47p7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:44:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8fw28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:45:06Z is after 2025-08-24T17:21:41Z" Dec 06 05:45:06 crc 
kubenswrapper[4733]: I1206 05:45:06.605888 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:45:06Z is after 2025-08-24T17:21:41Z" Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.615459 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0700e329-54b6-4cfe-b2de-5cee58cf1aa5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32c4d87738481c8df3d76e820a98f3dacfbc11edc26fab1dfe51b56d207168d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57cbb938bc4ae9b8a71a1e2369a50a243964fc8c683d2d1840f1f3e199f1b923\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eeebbb46cf11d2306ad457106c3b2179039986bfdd412c4bb64791d86edb4e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://801ea1b9ed221d20f0d729436b8f5f1946df6e66f06aa86db5764f18da3f0b1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe65f4b55b8e8ed93d424276f1fc06f31770302538e5122a5b09da36734d86dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T05:43:54Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 05:43:48.722254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 05:43:48.730728 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3849141372/tls.crt::/tmp/serving-cert-3849141372/tls.key\\\\\\\"\\\\nI1206 05:43:54.083506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 05:43:54.085960 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 05:43:54.085979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 05:43:54.086001 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 05:43:54.086006 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 05:43:54.089093 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 05:43:54.089162 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089190 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 05:43:54.089211 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 05:43:54.089229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 05:43:54.089245 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 05:43:54.089261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 05:43:54.089103 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1206 05:43:54.090706 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8edc1fd8220a58b6a3f6d08d6d003c6d350fa69588866d84de63f95ecd4367f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9addcd70430289d4b9e51cbab421c76f6
2dfbc60934130c77b42a3a442adc33f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:45:06Z is after 2025-08-24T17:21:41Z" Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.624382 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-684r5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc59542d-ee4a-414d-b096-86716cb56db5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://238d1b3c645ca54e851f02ddb12c90bfcd039e6973993a7693cc9520d5268496\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7128ab1b2f48b8ce3ecf3a2154cb1b1dc93a58cdfed2c11e7724201a5675ea3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T05:44:41Z\\\",\\\"message\\\":\\\"2025-12-06T05:43:56+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_16cfcfaf-0a8d-404d-bd9a-d650725684e5\\\\n2025-12-06T05:43:56+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_16cfcfaf-0a8d-404d-bd9a-d650725684e5 to /host/opt/cni/bin/\\\\n2025-12-06T05:43:56Z [verbose] multus-daemon started\\\\n2025-12-06T05:43:56Z [verbose] 
Readiness Indicator file check\\\\n2025-12-06T05:44:41Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbfjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-684r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:45:06Z is after 2025-08-24T17:21:41Z" Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.633203 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77ebef5bd728c37a6b74ab523c480048959280fdfc9afd8c60b2aca9cd05336d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61a23652af66be599ba9357cb31709e7b4a3f0e4
767c758617e6cc5cd9b43941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq86l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g7qjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:45:06Z is after 2025-08-24T17:21:41Z" Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.643603 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.643643 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.643656 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:06 crc 
kubenswrapper[4733]: I1206 05:45:06.643677 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.643687 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:06Z","lastTransitionTime":"2025-12-06T05:45:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.646036 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"171aa174-9338-4421-8393-9e23fbab7f1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a697c5a28f2c415b6f133c1c3bdaff0915418e3fcf0c889af0a822e1bdcbcc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://532faf6ec4021a35746a236a1ded78eccc9d71728c149f73c4263068b6951490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://456b5bd863b30c044246c6c8fe15ee7344ad053861724b5c42b88479578b9adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77216800c2b9bc04724591a5d5c5d4c9ddb9a75fcbc198c60800199a92db6f45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d985f342be7dff38ee8a2264a8dae534857e6cb0e7d0cf79b137d2ed6289bf80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88a99335c4d7fca93428173f7e0e096e418e0599ab030dfda10d8da0a5dc17a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d6353ff5837029f85cdae65e1200483030eeb8cb05c63bd255a459d79a91ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d6353ff5837029f85cdae65e1200483030eeb8cb05c63bd255a459d79a91ef0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T05:44:45Z\\\",\\\"message\\\":\\\"-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.109:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{1f62a432-33b9-495d-83b2-d1dbe6961325}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1206 05:44:45.231733 6787 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-apiserver-operator/metrics]} name:Service_openshift-kube-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.109:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {1f62a432-33b9-495d-83b2-d1dbe6961325}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1206 05:44:45.231791 6787 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin networ\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T05:44:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2gb79_openshift-ovn-kubernetes(171aa174-9338-4421-8393-9e23fbab7f1e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9980ec9b2b1a751a691d1f657a2176d49a7583906d741adbe3754ec4c73b152c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T05:43:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://667a77abe2226e4438
47d05fc2475438095a30648777a424239d3f35c199b236\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T05:43:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T05:43:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c5fg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T05:43:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2gb79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:45:06Z is after 2025-08-24T17:21:41Z" Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.655472 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T05:43:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:45:06Z is after 2025-08-24T17:21:41Z" Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.745737 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.745772 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.745782 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 
05:45:06.745794 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.745804 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:06Z","lastTransitionTime":"2025-12-06T05:45:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.847702 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.847737 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.847747 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.847763 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.847776 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:06Z","lastTransitionTime":"2025-12-06T05:45:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.949332 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.949452 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.949523 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.949589 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:06 crc kubenswrapper[4733]: I1206 05:45:06.949660 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:06Z","lastTransitionTime":"2025-12-06T05:45:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:07 crc kubenswrapper[4733]: I1206 05:45:07.051814 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:07 crc kubenswrapper[4733]: I1206 05:45:07.051925 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:07 crc kubenswrapper[4733]: I1206 05:45:07.051995 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:07 crc kubenswrapper[4733]: I1206 05:45:07.052059 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:07 crc kubenswrapper[4733]: I1206 05:45:07.052124 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:07Z","lastTransitionTime":"2025-12-06T05:45:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:07 crc kubenswrapper[4733]: I1206 05:45:07.154671 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:07 crc kubenswrapper[4733]: I1206 05:45:07.154698 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:07 crc kubenswrapper[4733]: I1206 05:45:07.154707 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:07 crc kubenswrapper[4733]: I1206 05:45:07.154737 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:07 crc kubenswrapper[4733]: I1206 05:45:07.154748 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:07Z","lastTransitionTime":"2025-12-06T05:45:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:07 crc kubenswrapper[4733]: I1206 05:45:07.256750 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:07 crc kubenswrapper[4733]: I1206 05:45:07.256775 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:07 crc kubenswrapper[4733]: I1206 05:45:07.256784 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:07 crc kubenswrapper[4733]: I1206 05:45:07.256801 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:07 crc kubenswrapper[4733]: I1206 05:45:07.256810 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:07Z","lastTransitionTime":"2025-12-06T05:45:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:07 crc kubenswrapper[4733]: I1206 05:45:07.359010 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:07 crc kubenswrapper[4733]: I1206 05:45:07.359040 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:07 crc kubenswrapper[4733]: I1206 05:45:07.359049 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:07 crc kubenswrapper[4733]: I1206 05:45:07.359060 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:07 crc kubenswrapper[4733]: I1206 05:45:07.359069 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:07Z","lastTransitionTime":"2025-12-06T05:45:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:07 crc kubenswrapper[4733]: I1206 05:45:07.461292 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:07 crc kubenswrapper[4733]: I1206 05:45:07.461342 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:07 crc kubenswrapper[4733]: I1206 05:45:07.461351 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:07 crc kubenswrapper[4733]: I1206 05:45:07.461361 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:07 crc kubenswrapper[4733]: I1206 05:45:07.461370 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:07Z","lastTransitionTime":"2025-12-06T05:45:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:07 crc kubenswrapper[4733]: I1206 05:45:07.562852 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:07 crc kubenswrapper[4733]: I1206 05:45:07.562873 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:07 crc kubenswrapper[4733]: I1206 05:45:07.562882 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:07 crc kubenswrapper[4733]: I1206 05:45:07.562891 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:07 crc kubenswrapper[4733]: I1206 05:45:07.562899 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:07Z","lastTransitionTime":"2025-12-06T05:45:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:07 crc kubenswrapper[4733]: I1206 05:45:07.665273 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:07 crc kubenswrapper[4733]: I1206 05:45:07.665297 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:07 crc kubenswrapper[4733]: I1206 05:45:07.665342 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:07 crc kubenswrapper[4733]: I1206 05:45:07.665353 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:07 crc kubenswrapper[4733]: I1206 05:45:07.665363 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:07Z","lastTransitionTime":"2025-12-06T05:45:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:07 crc kubenswrapper[4733]: I1206 05:45:07.766652 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:07 crc kubenswrapper[4733]: I1206 05:45:07.766687 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:07 crc kubenswrapper[4733]: I1206 05:45:07.766699 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:07 crc kubenswrapper[4733]: I1206 05:45:07.766710 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:07 crc kubenswrapper[4733]: I1206 05:45:07.766719 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:07Z","lastTransitionTime":"2025-12-06T05:45:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:07 crc kubenswrapper[4733]: I1206 05:45:07.868222 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:07 crc kubenswrapper[4733]: I1206 05:45:07.868259 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:07 crc kubenswrapper[4733]: I1206 05:45:07.868268 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:07 crc kubenswrapper[4733]: I1206 05:45:07.868280 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:07 crc kubenswrapper[4733]: I1206 05:45:07.868290 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:07Z","lastTransitionTime":"2025-12-06T05:45:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:07 crc kubenswrapper[4733]: I1206 05:45:07.969832 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:07 crc kubenswrapper[4733]: I1206 05:45:07.969860 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:07 crc kubenswrapper[4733]: I1206 05:45:07.969869 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:07 crc kubenswrapper[4733]: I1206 05:45:07.969883 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:07 crc kubenswrapper[4733]: I1206 05:45:07.969895 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:07Z","lastTransitionTime":"2025-12-06T05:45:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:08 crc kubenswrapper[4733]: I1206 05:45:08.071515 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:08 crc kubenswrapper[4733]: I1206 05:45:08.071544 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:08 crc kubenswrapper[4733]: I1206 05:45:08.071554 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:08 crc kubenswrapper[4733]: I1206 05:45:08.071567 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:08 crc kubenswrapper[4733]: I1206 05:45:08.071575 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:08Z","lastTransitionTime":"2025-12-06T05:45:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:08 crc kubenswrapper[4733]: I1206 05:45:08.172851 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:08 crc kubenswrapper[4733]: I1206 05:45:08.172897 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:08 crc kubenswrapper[4733]: I1206 05:45:08.172907 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:08 crc kubenswrapper[4733]: I1206 05:45:08.172918 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:08 crc kubenswrapper[4733]: I1206 05:45:08.172927 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:08Z","lastTransitionTime":"2025-12-06T05:45:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:08 crc kubenswrapper[4733]: I1206 05:45:08.274745 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:08 crc kubenswrapper[4733]: I1206 05:45:08.274775 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:08 crc kubenswrapper[4733]: I1206 05:45:08.274785 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:08 crc kubenswrapper[4733]: I1206 05:45:08.274797 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:08 crc kubenswrapper[4733]: I1206 05:45:08.274806 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:08Z","lastTransitionTime":"2025-12-06T05:45:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:08 crc kubenswrapper[4733]: I1206 05:45:08.376655 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:08 crc kubenswrapper[4733]: I1206 05:45:08.376689 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:08 crc kubenswrapper[4733]: I1206 05:45:08.376699 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:08 crc kubenswrapper[4733]: I1206 05:45:08.376710 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:08 crc kubenswrapper[4733]: I1206 05:45:08.376717 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:08Z","lastTransitionTime":"2025-12-06T05:45:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:08 crc kubenswrapper[4733]: I1206 05:45:08.478689 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:08 crc kubenswrapper[4733]: I1206 05:45:08.478734 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:08 crc kubenswrapper[4733]: I1206 05:45:08.478745 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:08 crc kubenswrapper[4733]: I1206 05:45:08.478756 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:08 crc kubenswrapper[4733]: I1206 05:45:08.478764 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:08Z","lastTransitionTime":"2025-12-06T05:45:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:45:08 crc kubenswrapper[4733]: I1206 05:45:08.483966 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8fw28" Dec 06 05:45:08 crc kubenswrapper[4733]: I1206 05:45:08.483976 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 05:45:08 crc kubenswrapper[4733]: I1206 05:45:08.484083 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:45:08 crc kubenswrapper[4733]: I1206 05:45:08.484230 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:45:08 crc kubenswrapper[4733]: E1206 05:45:08.484329 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 05:45:08 crc kubenswrapper[4733]: E1206 05:45:08.484455 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8fw28" podUID="7e8909c1-5ab7-4c3f-aba1-436c64849e8a" Dec 06 05:45:08 crc kubenswrapper[4733]: E1206 05:45:08.484480 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 05:45:08 crc kubenswrapper[4733]: E1206 05:45:08.484570 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 05:45:08 crc kubenswrapper[4733]: I1206 05:45:08.580880 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:08 crc kubenswrapper[4733]: I1206 05:45:08.580908 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:08 crc kubenswrapper[4733]: I1206 05:45:08.580918 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:08 crc kubenswrapper[4733]: I1206 05:45:08.580946 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:08 crc kubenswrapper[4733]: I1206 05:45:08.580956 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:08Z","lastTransitionTime":"2025-12-06T05:45:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:08 crc kubenswrapper[4733]: I1206 05:45:08.683281 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:08 crc kubenswrapper[4733]: I1206 05:45:08.683670 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:08 crc kubenswrapper[4733]: I1206 05:45:08.683778 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:08 crc kubenswrapper[4733]: I1206 05:45:08.683850 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:08 crc kubenswrapper[4733]: I1206 05:45:08.683937 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:08Z","lastTransitionTime":"2025-12-06T05:45:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:08 crc kubenswrapper[4733]: I1206 05:45:08.786753 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:08 crc kubenswrapper[4733]: I1206 05:45:08.786850 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:08 crc kubenswrapper[4733]: I1206 05:45:08.786914 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:08 crc kubenswrapper[4733]: I1206 05:45:08.786975 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:08 crc kubenswrapper[4733]: I1206 05:45:08.787033 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:08Z","lastTransitionTime":"2025-12-06T05:45:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:08 crc kubenswrapper[4733]: I1206 05:45:08.888842 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:08 crc kubenswrapper[4733]: I1206 05:45:08.888987 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:08 crc kubenswrapper[4733]: I1206 05:45:08.889074 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:08 crc kubenswrapper[4733]: I1206 05:45:08.889150 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:08 crc kubenswrapper[4733]: I1206 05:45:08.889246 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:08Z","lastTransitionTime":"2025-12-06T05:45:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:08 crc kubenswrapper[4733]: I1206 05:45:08.990765 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:08 crc kubenswrapper[4733]: I1206 05:45:08.990792 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:08 crc kubenswrapper[4733]: I1206 05:45:08.990802 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:08 crc kubenswrapper[4733]: I1206 05:45:08.990813 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:08 crc kubenswrapper[4733]: I1206 05:45:08.990821 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:08Z","lastTransitionTime":"2025-12-06T05:45:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.023597 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.023621 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.023630 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.023640 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.023648 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:09Z","lastTransitionTime":"2025-12-06T05:45:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:09 crc kubenswrapper[4733]: E1206 05:45:09.034073 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:45:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:45:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:45:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:45:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:45:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:45:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:45:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:45:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6951a1f4-5aff-463d-98ee-6da28494341b\\\",\\\"systemUUID\\\":\\\"4b0d62b0-e895-479e-b261-2bd12b349187\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:45:09Z is after 2025-08-24T17:21:41Z" Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.037181 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.037296 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.037382 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.037454 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.037516 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:09Z","lastTransitionTime":"2025-12-06T05:45:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.051637 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.051724 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.051740 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.051765 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.051790 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:09Z","lastTransitionTime":"2025-12-06T05:45:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6951a1f4-5aff-463d-98ee-6da28494341b\\\",\\\"systemUUID\\\":\\\"4b0d62b0-e895-479e-b261-2bd12b349187\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:45:09Z is after 2025-08-24T17:21:41Z" Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.065849 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.065951 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.066027 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.066090 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.066154 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:09Z","lastTransitionTime":"2025-12-06T05:45:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:09 crc kubenswrapper[4733]: E1206 05:45:09.075285 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:45:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:45:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:45:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:45:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:45:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:45:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:45:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:45:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6951a1f4-5aff-463d-98ee-6da28494341b\\\",\\\"systemUUID\\\":\\\"4b0d62b0-e895-479e-b261-2bd12b349187\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:45:09Z is after 2025-08-24T17:21:41Z" Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.078071 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.078123 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.078134 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.078148 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.078158 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:09Z","lastTransitionTime":"2025-12-06T05:45:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:09 crc kubenswrapper[4733]: E1206 05:45:09.086933 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:45:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:45:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:45:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:45:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:45:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:45:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T05:45:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T05:45:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6951a1f4-5aff-463d-98ee-6da28494341b\\\",\\\"systemUUID\\\":\\\"4b0d62b0-e895-479e-b261-2bd12b349187\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T05:45:09Z is after 2025-08-24T17:21:41Z" Dec 06 05:45:09 crc kubenswrapper[4733]: E1206 05:45:09.087066 4733 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.092187 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.092292 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.092387 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.092453 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.092517 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:09Z","lastTransitionTime":"2025-12-06T05:45:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.194702 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.194753 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.194766 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.194784 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.194797 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:09Z","lastTransitionTime":"2025-12-06T05:45:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.296470 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.296509 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.296540 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.296554 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.296562 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:09Z","lastTransitionTime":"2025-12-06T05:45:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.399187 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.399227 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.399245 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.399263 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.399278 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:09Z","lastTransitionTime":"2025-12-06T05:45:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.500673 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.500714 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.500722 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.500734 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.500743 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:09Z","lastTransitionTime":"2025-12-06T05:45:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.602938 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.602969 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.602978 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.603004 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.603011 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:09Z","lastTransitionTime":"2025-12-06T05:45:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.704752 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.704789 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.704801 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.704813 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.704822 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:09Z","lastTransitionTime":"2025-12-06T05:45:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.809204 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.809239 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.809248 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.809261 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.809270 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:09Z","lastTransitionTime":"2025-12-06T05:45:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.911674 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.911696 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.911708 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.911719 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:09 crc kubenswrapper[4733]: I1206 05:45:09.911729 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:09Z","lastTransitionTime":"2025-12-06T05:45:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:10 crc kubenswrapper[4733]: I1206 05:45:10.013916 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:10 crc kubenswrapper[4733]: I1206 05:45:10.013954 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:10 crc kubenswrapper[4733]: I1206 05:45:10.013965 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:10 crc kubenswrapper[4733]: I1206 05:45:10.013979 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:10 crc kubenswrapper[4733]: I1206 05:45:10.013988 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:10Z","lastTransitionTime":"2025-12-06T05:45:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:10 crc kubenswrapper[4733]: I1206 05:45:10.116209 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:10 crc kubenswrapper[4733]: I1206 05:45:10.116276 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:10 crc kubenswrapper[4733]: I1206 05:45:10.116287 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:10 crc kubenswrapper[4733]: I1206 05:45:10.116339 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:10 crc kubenswrapper[4733]: I1206 05:45:10.116357 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:10Z","lastTransitionTime":"2025-12-06T05:45:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:10 crc kubenswrapper[4733]: I1206 05:45:10.219333 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:10 crc kubenswrapper[4733]: I1206 05:45:10.219371 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:10 crc kubenswrapper[4733]: I1206 05:45:10.219381 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:10 crc kubenswrapper[4733]: I1206 05:45:10.219398 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:10 crc kubenswrapper[4733]: I1206 05:45:10.219408 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:10Z","lastTransitionTime":"2025-12-06T05:45:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:10 crc kubenswrapper[4733]: I1206 05:45:10.321425 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:10 crc kubenswrapper[4733]: I1206 05:45:10.321461 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:10 crc kubenswrapper[4733]: I1206 05:45:10.321472 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:10 crc kubenswrapper[4733]: I1206 05:45:10.321487 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:10 crc kubenswrapper[4733]: I1206 05:45:10.321499 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:10Z","lastTransitionTime":"2025-12-06T05:45:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:10 crc kubenswrapper[4733]: I1206 05:45:10.423385 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:10 crc kubenswrapper[4733]: I1206 05:45:10.423426 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:10 crc kubenswrapper[4733]: I1206 05:45:10.423438 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:10 crc kubenswrapper[4733]: I1206 05:45:10.423451 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:10 crc kubenswrapper[4733]: I1206 05:45:10.423460 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:10Z","lastTransitionTime":"2025-12-06T05:45:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:45:10 crc kubenswrapper[4733]: I1206 05:45:10.484400 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 05:45:10 crc kubenswrapper[4733]: I1206 05:45:10.484544 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:45:10 crc kubenswrapper[4733]: I1206 05:45:10.484564 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8fw28" Dec 06 05:45:10 crc kubenswrapper[4733]: E1206 05:45:10.484632 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 05:45:10 crc kubenswrapper[4733]: I1206 05:45:10.484661 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:45:10 crc kubenswrapper[4733]: E1206 05:45:10.484721 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 05:45:10 crc kubenswrapper[4733]: E1206 05:45:10.484774 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 05:45:10 crc kubenswrapper[4733]: E1206 05:45:10.484827 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8fw28" podUID="7e8909c1-5ab7-4c3f-aba1-436c64849e8a" Dec 06 05:45:10 crc kubenswrapper[4733]: I1206 05:45:10.527136 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:10 crc kubenswrapper[4733]: I1206 05:45:10.527267 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:10 crc kubenswrapper[4733]: I1206 05:45:10.527356 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:10 crc kubenswrapper[4733]: I1206 05:45:10.527428 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:10 crc kubenswrapper[4733]: I1206 05:45:10.527484 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:10Z","lastTransitionTime":"2025-12-06T05:45:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:10 crc kubenswrapper[4733]: I1206 05:45:10.630101 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:10 crc kubenswrapper[4733]: I1206 05:45:10.630210 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:10 crc kubenswrapper[4733]: I1206 05:45:10.630325 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:10 crc kubenswrapper[4733]: I1206 05:45:10.630400 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:10 crc kubenswrapper[4733]: I1206 05:45:10.630469 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:10Z","lastTransitionTime":"2025-12-06T05:45:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:10 crc kubenswrapper[4733]: I1206 05:45:10.732528 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:10 crc kubenswrapper[4733]: I1206 05:45:10.732559 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:10 crc kubenswrapper[4733]: I1206 05:45:10.732568 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:10 crc kubenswrapper[4733]: I1206 05:45:10.732581 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:10 crc kubenswrapper[4733]: I1206 05:45:10.732591 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:10Z","lastTransitionTime":"2025-12-06T05:45:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:10 crc kubenswrapper[4733]: I1206 05:45:10.834719 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:10 crc kubenswrapper[4733]: I1206 05:45:10.835000 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:10 crc kubenswrapper[4733]: I1206 05:45:10.835070 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:10 crc kubenswrapper[4733]: I1206 05:45:10.835135 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:10 crc kubenswrapper[4733]: I1206 05:45:10.835205 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:10Z","lastTransitionTime":"2025-12-06T05:45:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:10 crc kubenswrapper[4733]: I1206 05:45:10.937008 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:10 crc kubenswrapper[4733]: I1206 05:45:10.937042 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:10 crc kubenswrapper[4733]: I1206 05:45:10.937050 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:10 crc kubenswrapper[4733]: I1206 05:45:10.937061 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:10 crc kubenswrapper[4733]: I1206 05:45:10.937069 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:10Z","lastTransitionTime":"2025-12-06T05:45:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:11 crc kubenswrapper[4733]: I1206 05:45:11.038732 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:11 crc kubenswrapper[4733]: I1206 05:45:11.038779 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:11 crc kubenswrapper[4733]: I1206 05:45:11.038789 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:11 crc kubenswrapper[4733]: I1206 05:45:11.038799 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:11 crc kubenswrapper[4733]: I1206 05:45:11.038808 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:11Z","lastTransitionTime":"2025-12-06T05:45:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:11 crc kubenswrapper[4733]: I1206 05:45:11.140393 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:11 crc kubenswrapper[4733]: I1206 05:45:11.140443 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:11 crc kubenswrapper[4733]: I1206 05:45:11.140451 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:11 crc kubenswrapper[4733]: I1206 05:45:11.140461 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:11 crc kubenswrapper[4733]: I1206 05:45:11.140467 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:11Z","lastTransitionTime":"2025-12-06T05:45:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:11 crc kubenswrapper[4733]: I1206 05:45:11.242138 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:11 crc kubenswrapper[4733]: I1206 05:45:11.242167 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:11 crc kubenswrapper[4733]: I1206 05:45:11.242175 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:11 crc kubenswrapper[4733]: I1206 05:45:11.242202 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:11 crc kubenswrapper[4733]: I1206 05:45:11.242209 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:11Z","lastTransitionTime":"2025-12-06T05:45:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:11 crc kubenswrapper[4733]: I1206 05:45:11.343711 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:11 crc kubenswrapper[4733]: I1206 05:45:11.343752 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:11 crc kubenswrapper[4733]: I1206 05:45:11.343763 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:11 crc kubenswrapper[4733]: I1206 05:45:11.343779 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:11 crc kubenswrapper[4733]: I1206 05:45:11.343792 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:11Z","lastTransitionTime":"2025-12-06T05:45:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:11 crc kubenswrapper[4733]: I1206 05:45:11.446164 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:11 crc kubenswrapper[4733]: I1206 05:45:11.446190 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:11 crc kubenswrapper[4733]: I1206 05:45:11.446198 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:11 crc kubenswrapper[4733]: I1206 05:45:11.446210 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:11 crc kubenswrapper[4733]: I1206 05:45:11.446220 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:11Z","lastTransitionTime":"2025-12-06T05:45:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:11 crc kubenswrapper[4733]: I1206 05:45:11.485175 4733 scope.go:117] "RemoveContainer" containerID="5d6353ff5837029f85cdae65e1200483030eeb8cb05c63bd255a459d79a91ef0" Dec 06 05:45:11 crc kubenswrapper[4733]: E1206 05:45:11.485344 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-2gb79_openshift-ovn-kubernetes(171aa174-9338-4421-8393-9e23fbab7f1e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" podUID="171aa174-9338-4421-8393-9e23fbab7f1e" Dec 06 05:45:11 crc kubenswrapper[4733]: I1206 05:45:11.548156 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:11 crc kubenswrapper[4733]: I1206 05:45:11.548204 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:11 crc kubenswrapper[4733]: I1206 05:45:11.548214 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:11 crc kubenswrapper[4733]: I1206 05:45:11.548239 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:11 crc kubenswrapper[4733]: I1206 05:45:11.548248 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:11Z","lastTransitionTime":"2025-12-06T05:45:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:11 crc kubenswrapper[4733]: I1206 05:45:11.649570 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:11 crc kubenswrapper[4733]: I1206 05:45:11.649602 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:11 crc kubenswrapper[4733]: I1206 05:45:11.649612 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:11 crc kubenswrapper[4733]: I1206 05:45:11.649622 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:11 crc kubenswrapper[4733]: I1206 05:45:11.649632 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:11Z","lastTransitionTime":"2025-12-06T05:45:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:11 crc kubenswrapper[4733]: I1206 05:45:11.751329 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:11 crc kubenswrapper[4733]: I1206 05:45:11.751356 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:11 crc kubenswrapper[4733]: I1206 05:45:11.751364 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:11 crc kubenswrapper[4733]: I1206 05:45:11.751374 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:11 crc kubenswrapper[4733]: I1206 05:45:11.751382 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:11Z","lastTransitionTime":"2025-12-06T05:45:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:11 crc kubenswrapper[4733]: I1206 05:45:11.853004 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:11 crc kubenswrapper[4733]: I1206 05:45:11.853044 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:11 crc kubenswrapper[4733]: I1206 05:45:11.853053 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:11 crc kubenswrapper[4733]: I1206 05:45:11.853065 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:11 crc kubenswrapper[4733]: I1206 05:45:11.853074 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:11Z","lastTransitionTime":"2025-12-06T05:45:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:11 crc kubenswrapper[4733]: I1206 05:45:11.954951 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:11 crc kubenswrapper[4733]: I1206 05:45:11.954978 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:11 crc kubenswrapper[4733]: I1206 05:45:11.954986 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:11 crc kubenswrapper[4733]: I1206 05:45:11.954995 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:11 crc kubenswrapper[4733]: I1206 05:45:11.955004 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:11Z","lastTransitionTime":"2025-12-06T05:45:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:12 crc kubenswrapper[4733]: I1206 05:45:12.056744 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:12 crc kubenswrapper[4733]: I1206 05:45:12.056790 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:12 crc kubenswrapper[4733]: I1206 05:45:12.056802 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:12 crc kubenswrapper[4733]: I1206 05:45:12.056821 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:12 crc kubenswrapper[4733]: I1206 05:45:12.056836 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:12Z","lastTransitionTime":"2025-12-06T05:45:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:12 crc kubenswrapper[4733]: I1206 05:45:12.158996 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:12 crc kubenswrapper[4733]: I1206 05:45:12.159053 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:12 crc kubenswrapper[4733]: I1206 05:45:12.159062 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:12 crc kubenswrapper[4733]: I1206 05:45:12.159082 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:12 crc kubenswrapper[4733]: I1206 05:45:12.159092 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:12Z","lastTransitionTime":"2025-12-06T05:45:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:12 crc kubenswrapper[4733]: I1206 05:45:12.260659 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:12 crc kubenswrapper[4733]: I1206 05:45:12.260694 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:12 crc kubenswrapper[4733]: I1206 05:45:12.260703 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:12 crc kubenswrapper[4733]: I1206 05:45:12.260716 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:12 crc kubenswrapper[4733]: I1206 05:45:12.260725 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:12Z","lastTransitionTime":"2025-12-06T05:45:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:12 crc kubenswrapper[4733]: I1206 05:45:12.362211 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:12 crc kubenswrapper[4733]: I1206 05:45:12.362246 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:12 crc kubenswrapper[4733]: I1206 05:45:12.362257 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:12 crc kubenswrapper[4733]: I1206 05:45:12.362267 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:12 crc kubenswrapper[4733]: I1206 05:45:12.362274 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:12Z","lastTransitionTime":"2025-12-06T05:45:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:12 crc kubenswrapper[4733]: I1206 05:45:12.464538 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:12 crc kubenswrapper[4733]: I1206 05:45:12.464570 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:12 crc kubenswrapper[4733]: I1206 05:45:12.464578 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:12 crc kubenswrapper[4733]: I1206 05:45:12.464590 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:12 crc kubenswrapper[4733]: I1206 05:45:12.464598 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:12Z","lastTransitionTime":"2025-12-06T05:45:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:45:12 crc kubenswrapper[4733]: I1206 05:45:12.483987 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 05:45:12 crc kubenswrapper[4733]: I1206 05:45:12.484031 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8fw28" Dec 06 05:45:12 crc kubenswrapper[4733]: I1206 05:45:12.484106 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:45:12 crc kubenswrapper[4733]: E1206 05:45:12.484102 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 05:45:12 crc kubenswrapper[4733]: E1206 05:45:12.484216 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 05:45:12 crc kubenswrapper[4733]: I1206 05:45:12.484262 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:45:12 crc kubenswrapper[4733]: E1206 05:45:12.484388 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8fw28" podUID="7e8909c1-5ab7-4c3f-aba1-436c64849e8a" Dec 06 05:45:12 crc kubenswrapper[4733]: E1206 05:45:12.484462 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 05:45:12 crc kubenswrapper[4733]: I1206 05:45:12.556544 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7e8909c1-5ab7-4c3f-aba1-436c64849e8a-metrics-certs\") pod \"network-metrics-daemon-8fw28\" (UID: \"7e8909c1-5ab7-4c3f-aba1-436c64849e8a\") " pod="openshift-multus/network-metrics-daemon-8fw28" Dec 06 05:45:12 crc kubenswrapper[4733]: E1206 05:45:12.556635 4733 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 05:45:12 crc kubenswrapper[4733]: E1206 05:45:12.556678 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e8909c1-5ab7-4c3f-aba1-436c64849e8a-metrics-certs podName:7e8909c1-5ab7-4c3f-aba1-436c64849e8a nodeName:}" failed. No retries permitted until 2025-12-06 05:46:16.556663751 +0000 UTC m=+160.421874862 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7e8909c1-5ab7-4c3f-aba1-436c64849e8a-metrics-certs") pod "network-metrics-daemon-8fw28" (UID: "7e8909c1-5ab7-4c3f-aba1-436c64849e8a") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 05:45:12 crc kubenswrapper[4733]: I1206 05:45:12.566481 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:12 crc kubenswrapper[4733]: I1206 05:45:12.566510 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:12 crc kubenswrapper[4733]: I1206 05:45:12.566518 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:12 crc kubenswrapper[4733]: I1206 05:45:12.566528 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:12 crc kubenswrapper[4733]: I1206 05:45:12.566535 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:12Z","lastTransitionTime":"2025-12-06T05:45:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:12 crc kubenswrapper[4733]: I1206 05:45:12.667891 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:12 crc kubenswrapper[4733]: I1206 05:45:12.667915 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:12 crc kubenswrapper[4733]: I1206 05:45:12.667924 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:12 crc kubenswrapper[4733]: I1206 05:45:12.667941 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:12 crc kubenswrapper[4733]: I1206 05:45:12.667949 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:12Z","lastTransitionTime":"2025-12-06T05:45:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:12 crc kubenswrapper[4733]: I1206 05:45:12.769771 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:12 crc kubenswrapper[4733]: I1206 05:45:12.769801 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:12 crc kubenswrapper[4733]: I1206 05:45:12.769810 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:12 crc kubenswrapper[4733]: I1206 05:45:12.769820 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:12 crc kubenswrapper[4733]: I1206 05:45:12.769829 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:12Z","lastTransitionTime":"2025-12-06T05:45:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:12 crc kubenswrapper[4733]: I1206 05:45:12.871791 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:12 crc kubenswrapper[4733]: I1206 05:45:12.871823 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:12 crc kubenswrapper[4733]: I1206 05:45:12.871832 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:12 crc kubenswrapper[4733]: I1206 05:45:12.871843 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:12 crc kubenswrapper[4733]: I1206 05:45:12.871850 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:12Z","lastTransitionTime":"2025-12-06T05:45:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:12 crc kubenswrapper[4733]: I1206 05:45:12.973657 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:12 crc kubenswrapper[4733]: I1206 05:45:12.973685 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:12 crc kubenswrapper[4733]: I1206 05:45:12.973692 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:12 crc kubenswrapper[4733]: I1206 05:45:12.973701 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:12 crc kubenswrapper[4733]: I1206 05:45:12.973707 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:12Z","lastTransitionTime":"2025-12-06T05:45:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:13 crc kubenswrapper[4733]: I1206 05:45:13.075720 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:13 crc kubenswrapper[4733]: I1206 05:45:13.075752 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:13 crc kubenswrapper[4733]: I1206 05:45:13.075759 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:13 crc kubenswrapper[4733]: I1206 05:45:13.075771 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:13 crc kubenswrapper[4733]: I1206 05:45:13.075779 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:13Z","lastTransitionTime":"2025-12-06T05:45:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:13 crc kubenswrapper[4733]: I1206 05:45:13.177927 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:13 crc kubenswrapper[4733]: I1206 05:45:13.177958 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:13 crc kubenswrapper[4733]: I1206 05:45:13.177968 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:13 crc kubenswrapper[4733]: I1206 05:45:13.177978 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:13 crc kubenswrapper[4733]: I1206 05:45:13.177986 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:13Z","lastTransitionTime":"2025-12-06T05:45:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:13 crc kubenswrapper[4733]: I1206 05:45:13.279761 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:13 crc kubenswrapper[4733]: I1206 05:45:13.279801 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:13 crc kubenswrapper[4733]: I1206 05:45:13.279811 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:13 crc kubenswrapper[4733]: I1206 05:45:13.279825 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:13 crc kubenswrapper[4733]: I1206 05:45:13.279833 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:13Z","lastTransitionTime":"2025-12-06T05:45:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:13 crc kubenswrapper[4733]: I1206 05:45:13.381562 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:13 crc kubenswrapper[4733]: I1206 05:45:13.381596 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:13 crc kubenswrapper[4733]: I1206 05:45:13.381606 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:13 crc kubenswrapper[4733]: I1206 05:45:13.381617 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:13 crc kubenswrapper[4733]: I1206 05:45:13.381625 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:13Z","lastTransitionTime":"2025-12-06T05:45:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:13 crc kubenswrapper[4733]: I1206 05:45:13.483641 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:13 crc kubenswrapper[4733]: I1206 05:45:13.483702 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:13 crc kubenswrapper[4733]: I1206 05:45:13.483713 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:13 crc kubenswrapper[4733]: I1206 05:45:13.483728 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:13 crc kubenswrapper[4733]: I1206 05:45:13.483738 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:13Z","lastTransitionTime":"2025-12-06T05:45:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:13 crc kubenswrapper[4733]: I1206 05:45:13.585826 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:13 crc kubenswrapper[4733]: I1206 05:45:13.585881 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:13 crc kubenswrapper[4733]: I1206 05:45:13.585895 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:13 crc kubenswrapper[4733]: I1206 05:45:13.585912 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:13 crc kubenswrapper[4733]: I1206 05:45:13.585924 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:13Z","lastTransitionTime":"2025-12-06T05:45:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:13 crc kubenswrapper[4733]: I1206 05:45:13.687878 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:13 crc kubenswrapper[4733]: I1206 05:45:13.687927 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:13 crc kubenswrapper[4733]: I1206 05:45:13.687937 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:13 crc kubenswrapper[4733]: I1206 05:45:13.687950 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:13 crc kubenswrapper[4733]: I1206 05:45:13.687958 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:13Z","lastTransitionTime":"2025-12-06T05:45:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:13 crc kubenswrapper[4733]: I1206 05:45:13.789687 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:13 crc kubenswrapper[4733]: I1206 05:45:13.789718 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:13 crc kubenswrapper[4733]: I1206 05:45:13.789728 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:13 crc kubenswrapper[4733]: I1206 05:45:13.789739 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:13 crc kubenswrapper[4733]: I1206 05:45:13.789748 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:13Z","lastTransitionTime":"2025-12-06T05:45:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:13 crc kubenswrapper[4733]: I1206 05:45:13.891975 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:13 crc kubenswrapper[4733]: I1206 05:45:13.892011 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:13 crc kubenswrapper[4733]: I1206 05:45:13.892019 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:13 crc kubenswrapper[4733]: I1206 05:45:13.892031 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:13 crc kubenswrapper[4733]: I1206 05:45:13.892039 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:13Z","lastTransitionTime":"2025-12-06T05:45:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:13 crc kubenswrapper[4733]: I1206 05:45:13.994297 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:13 crc kubenswrapper[4733]: I1206 05:45:13.994347 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:13 crc kubenswrapper[4733]: I1206 05:45:13.994356 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:13 crc kubenswrapper[4733]: I1206 05:45:13.994370 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:13 crc kubenswrapper[4733]: I1206 05:45:13.994379 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:13Z","lastTransitionTime":"2025-12-06T05:45:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:14 crc kubenswrapper[4733]: I1206 05:45:14.096741 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:14 crc kubenswrapper[4733]: I1206 05:45:14.096775 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:14 crc kubenswrapper[4733]: I1206 05:45:14.096782 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:14 crc kubenswrapper[4733]: I1206 05:45:14.096797 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:14 crc kubenswrapper[4733]: I1206 05:45:14.096806 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:14Z","lastTransitionTime":"2025-12-06T05:45:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:14 crc kubenswrapper[4733]: I1206 05:45:14.199141 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:14 crc kubenswrapper[4733]: I1206 05:45:14.199172 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:14 crc kubenswrapper[4733]: I1206 05:45:14.199180 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:14 crc kubenswrapper[4733]: I1206 05:45:14.199193 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:14 crc kubenswrapper[4733]: I1206 05:45:14.199201 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:14Z","lastTransitionTime":"2025-12-06T05:45:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:14 crc kubenswrapper[4733]: I1206 05:45:14.300322 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:14 crc kubenswrapper[4733]: I1206 05:45:14.300362 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:14 crc kubenswrapper[4733]: I1206 05:45:14.300372 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:14 crc kubenswrapper[4733]: I1206 05:45:14.300388 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:14 crc kubenswrapper[4733]: I1206 05:45:14.300397 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:14Z","lastTransitionTime":"2025-12-06T05:45:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:14 crc kubenswrapper[4733]: I1206 05:45:14.402116 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:14 crc kubenswrapper[4733]: I1206 05:45:14.402157 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:14 crc kubenswrapper[4733]: I1206 05:45:14.402166 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:14 crc kubenswrapper[4733]: I1206 05:45:14.402181 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:14 crc kubenswrapper[4733]: I1206 05:45:14.402189 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:14Z","lastTransitionTime":"2025-12-06T05:45:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:45:14 crc kubenswrapper[4733]: I1206 05:45:14.484097 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 05:45:14 crc kubenswrapper[4733]: I1206 05:45:14.484140 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8fw28" Dec 06 05:45:14 crc kubenswrapper[4733]: I1206 05:45:14.484149 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:45:14 crc kubenswrapper[4733]: E1206 05:45:14.484201 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 05:45:14 crc kubenswrapper[4733]: I1206 05:45:14.484223 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:45:14 crc kubenswrapper[4733]: E1206 05:45:14.484295 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 05:45:14 crc kubenswrapper[4733]: E1206 05:45:14.484383 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 05:45:14 crc kubenswrapper[4733]: E1206 05:45:14.484458 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8fw28" podUID="7e8909c1-5ab7-4c3f-aba1-436c64849e8a" Dec 06 05:45:14 crc kubenswrapper[4733]: I1206 05:45:14.503922 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:14 crc kubenswrapper[4733]: I1206 05:45:14.503942 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:14 crc kubenswrapper[4733]: I1206 05:45:14.503950 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:14 crc kubenswrapper[4733]: I1206 05:45:14.503959 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:14 crc kubenswrapper[4733]: I1206 05:45:14.503968 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:14Z","lastTransitionTime":"2025-12-06T05:45:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:14 crc kubenswrapper[4733]: I1206 05:45:14.605110 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:14 crc kubenswrapper[4733]: I1206 05:45:14.605135 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:14 crc kubenswrapper[4733]: I1206 05:45:14.605144 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:14 crc kubenswrapper[4733]: I1206 05:45:14.605160 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:14 crc kubenswrapper[4733]: I1206 05:45:14.605170 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:14Z","lastTransitionTime":"2025-12-06T05:45:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:14 crc kubenswrapper[4733]: I1206 05:45:14.706833 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:14 crc kubenswrapper[4733]: I1206 05:45:14.706886 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:14 crc kubenswrapper[4733]: I1206 05:45:14.706896 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:14 crc kubenswrapper[4733]: I1206 05:45:14.706909 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:14 crc kubenswrapper[4733]: I1206 05:45:14.706918 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:14Z","lastTransitionTime":"2025-12-06T05:45:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:14 crc kubenswrapper[4733]: I1206 05:45:14.808252 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:14 crc kubenswrapper[4733]: I1206 05:45:14.808279 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:14 crc kubenswrapper[4733]: I1206 05:45:14.808287 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:14 crc kubenswrapper[4733]: I1206 05:45:14.808296 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:14 crc kubenswrapper[4733]: I1206 05:45:14.808325 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:14Z","lastTransitionTime":"2025-12-06T05:45:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:14 crc kubenswrapper[4733]: I1206 05:45:14.909414 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:14 crc kubenswrapper[4733]: I1206 05:45:14.909456 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:14 crc kubenswrapper[4733]: I1206 05:45:14.909464 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:14 crc kubenswrapper[4733]: I1206 05:45:14.909475 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:14 crc kubenswrapper[4733]: I1206 05:45:14.909483 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:14Z","lastTransitionTime":"2025-12-06T05:45:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:15 crc kubenswrapper[4733]: I1206 05:45:15.011162 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:15 crc kubenswrapper[4733]: I1206 05:45:15.011185 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:15 crc kubenswrapper[4733]: I1206 05:45:15.011194 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:15 crc kubenswrapper[4733]: I1206 05:45:15.011203 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:15 crc kubenswrapper[4733]: I1206 05:45:15.011219 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:15Z","lastTransitionTime":"2025-12-06T05:45:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:15 crc kubenswrapper[4733]: I1206 05:45:15.112833 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:15 crc kubenswrapper[4733]: I1206 05:45:15.112860 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:15 crc kubenswrapper[4733]: I1206 05:45:15.112870 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:15 crc kubenswrapper[4733]: I1206 05:45:15.112880 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:15 crc kubenswrapper[4733]: I1206 05:45:15.112888 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:15Z","lastTransitionTime":"2025-12-06T05:45:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:15 crc kubenswrapper[4733]: I1206 05:45:15.214944 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:15 crc kubenswrapper[4733]: I1206 05:45:15.214963 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:15 crc kubenswrapper[4733]: I1206 05:45:15.214971 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:15 crc kubenswrapper[4733]: I1206 05:45:15.214978 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:15 crc kubenswrapper[4733]: I1206 05:45:15.214984 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:15Z","lastTransitionTime":"2025-12-06T05:45:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:15 crc kubenswrapper[4733]: I1206 05:45:15.316734 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:15 crc kubenswrapper[4733]: I1206 05:45:15.316768 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:15 crc kubenswrapper[4733]: I1206 05:45:15.316776 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:15 crc kubenswrapper[4733]: I1206 05:45:15.316784 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:15 crc kubenswrapper[4733]: I1206 05:45:15.316791 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:15Z","lastTransitionTime":"2025-12-06T05:45:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:15 crc kubenswrapper[4733]: I1206 05:45:15.418110 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:15 crc kubenswrapper[4733]: I1206 05:45:15.418179 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:15 crc kubenswrapper[4733]: I1206 05:45:15.418188 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:15 crc kubenswrapper[4733]: I1206 05:45:15.418197 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:15 crc kubenswrapper[4733]: I1206 05:45:15.418214 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:15Z","lastTransitionTime":"2025-12-06T05:45:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:15 crc kubenswrapper[4733]: I1206 05:45:15.520028 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:15 crc kubenswrapper[4733]: I1206 05:45:15.520064 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:15 crc kubenswrapper[4733]: I1206 05:45:15.520075 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:15 crc kubenswrapper[4733]: I1206 05:45:15.520085 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:15 crc kubenswrapper[4733]: I1206 05:45:15.520097 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:15Z","lastTransitionTime":"2025-12-06T05:45:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:15 crc kubenswrapper[4733]: I1206 05:45:15.622457 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:15 crc kubenswrapper[4733]: I1206 05:45:15.622487 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:15 crc kubenswrapper[4733]: I1206 05:45:15.622499 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:15 crc kubenswrapper[4733]: I1206 05:45:15.622510 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:15 crc kubenswrapper[4733]: I1206 05:45:15.622519 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:15Z","lastTransitionTime":"2025-12-06T05:45:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:15 crc kubenswrapper[4733]: I1206 05:45:15.724575 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:15 crc kubenswrapper[4733]: I1206 05:45:15.724631 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:15 crc kubenswrapper[4733]: I1206 05:45:15.724641 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:15 crc kubenswrapper[4733]: I1206 05:45:15.724660 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:15 crc kubenswrapper[4733]: I1206 05:45:15.724673 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:15Z","lastTransitionTime":"2025-12-06T05:45:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:15 crc kubenswrapper[4733]: I1206 05:45:15.826331 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:15 crc kubenswrapper[4733]: I1206 05:45:15.826365 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:15 crc kubenswrapper[4733]: I1206 05:45:15.826373 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:15 crc kubenswrapper[4733]: I1206 05:45:15.826387 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:15 crc kubenswrapper[4733]: I1206 05:45:15.826399 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:15Z","lastTransitionTime":"2025-12-06T05:45:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:15 crc kubenswrapper[4733]: I1206 05:45:15.928811 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:15 crc kubenswrapper[4733]: I1206 05:45:15.928846 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:15 crc kubenswrapper[4733]: I1206 05:45:15.928855 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:15 crc kubenswrapper[4733]: I1206 05:45:15.928868 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:15 crc kubenswrapper[4733]: I1206 05:45:15.928875 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:15Z","lastTransitionTime":"2025-12-06T05:45:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:16 crc kubenswrapper[4733]: I1206 05:45:16.032095 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:16 crc kubenswrapper[4733]: I1206 05:45:16.032126 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:16 crc kubenswrapper[4733]: I1206 05:45:16.032194 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:16 crc kubenswrapper[4733]: I1206 05:45:16.032211 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:16 crc kubenswrapper[4733]: I1206 05:45:16.032219 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:16Z","lastTransitionTime":"2025-12-06T05:45:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:16 crc kubenswrapper[4733]: I1206 05:45:16.134526 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:16 crc kubenswrapper[4733]: I1206 05:45:16.134587 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:16 crc kubenswrapper[4733]: I1206 05:45:16.134598 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:16 crc kubenswrapper[4733]: I1206 05:45:16.134609 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:16 crc kubenswrapper[4733]: I1206 05:45:16.134619 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:16Z","lastTransitionTime":"2025-12-06T05:45:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:16 crc kubenswrapper[4733]: I1206 05:45:16.236746 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:16 crc kubenswrapper[4733]: I1206 05:45:16.236776 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:16 crc kubenswrapper[4733]: I1206 05:45:16.236788 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:16 crc kubenswrapper[4733]: I1206 05:45:16.236800 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:16 crc kubenswrapper[4733]: I1206 05:45:16.236811 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:16Z","lastTransitionTime":"2025-12-06T05:45:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:16 crc kubenswrapper[4733]: I1206 05:45:16.340510 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:16 crc kubenswrapper[4733]: I1206 05:45:16.340541 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:16 crc kubenswrapper[4733]: I1206 05:45:16.340550 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:16 crc kubenswrapper[4733]: I1206 05:45:16.340569 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:16 crc kubenswrapper[4733]: I1206 05:45:16.340579 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:16Z","lastTransitionTime":"2025-12-06T05:45:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:16 crc kubenswrapper[4733]: I1206 05:45:16.442135 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:16 crc kubenswrapper[4733]: I1206 05:45:16.442158 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:16 crc kubenswrapper[4733]: I1206 05:45:16.442166 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:16 crc kubenswrapper[4733]: I1206 05:45:16.442179 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:16 crc kubenswrapper[4733]: I1206 05:45:16.442187 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:16Z","lastTransitionTime":"2025-12-06T05:45:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:45:16 crc kubenswrapper[4733]: I1206 05:45:16.483743 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:45:16 crc kubenswrapper[4733]: I1206 05:45:16.483771 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 05:45:16 crc kubenswrapper[4733]: E1206 05:45:16.483840 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 05:45:16 crc kubenswrapper[4733]: I1206 05:45:16.483909 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:45:16 crc kubenswrapper[4733]: I1206 05:45:16.483947 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8fw28" Dec 06 05:45:16 crc kubenswrapper[4733]: E1206 05:45:16.484024 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 05:45:16 crc kubenswrapper[4733]: E1206 05:45:16.484397 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8fw28" podUID="7e8909c1-5ab7-4c3f-aba1-436c64849e8a" Dec 06 05:45:16 crc kubenswrapper[4733]: E1206 05:45:16.484467 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 05:45:16 crc kubenswrapper[4733]: I1206 05:45:16.495586 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Dec 06 05:45:16 crc kubenswrapper[4733]: I1206 05:45:16.508650 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=54.508641671 podStartE2EDuration="54.508641671s" podCreationTimestamp="2025-12-06 05:44:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:45:16.508057185 +0000 UTC m=+100.373268295" watchObservedRunningTime="2025-12-06 05:45:16.508641671 +0000 UTC m=+100.373852782" Dec 06 05:45:16 crc kubenswrapper[4733]: I1206 05:45:16.508780 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=16.508776764 podStartE2EDuration="16.508776764s" podCreationTimestamp="2025-12-06 05:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:45:16.498661729 +0000 UTC m=+100.363872841" watchObservedRunningTime="2025-12-06 05:45:16.508776764 +0000 UTC m=+100.373987875" Dec 06 05:45:16 crc kubenswrapper[4733]: I1206 05:45:16.544431 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:16 crc kubenswrapper[4733]: I1206 05:45:16.544466 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:16 crc kubenswrapper[4733]: I1206 05:45:16.544477 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:16 crc kubenswrapper[4733]: 
I1206 05:45:16.544491 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:16 crc kubenswrapper[4733]: I1206 05:45:16.544501 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:16Z","lastTransitionTime":"2025-12-06T05:45:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:45:16 crc kubenswrapper[4733]: I1206 05:45:16.544690 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-pqsfd" podStartSLOduration=82.544672912 podStartE2EDuration="1m22.544672912s" podCreationTimestamp="2025-12-06 05:43:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:45:16.544509646 +0000 UTC m=+100.409720756" watchObservedRunningTime="2025-12-06 05:45:16.544672912 +0000 UTC m=+100.409884023" Dec 06 05:45:16 crc kubenswrapper[4733]: I1206 05:45:16.551147 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-cnxdh" podStartSLOduration=82.551134213 podStartE2EDuration="1m22.551134213s" podCreationTimestamp="2025-12-06 05:43:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:45:16.550956069 +0000 UTC m=+100.416167180" watchObservedRunningTime="2025-12-06 05:45:16.551134213 +0000 UTC m=+100.416345324" Dec 06 05:45:16 crc kubenswrapper[4733]: I1206 05:45:16.586115 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=82.58609709 
podStartE2EDuration="1m22.58609709s" podCreationTimestamp="2025-12-06 05:43:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:45:16.576926026 +0000 UTC m=+100.442137137" watchObservedRunningTime="2025-12-06 05:45:16.58609709 +0000 UTC m=+100.451308201" Dec 06 05:45:16 crc kubenswrapper[4733]: I1206 05:45:16.604241 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-684r5" podStartSLOduration=82.604224366 podStartE2EDuration="1m22.604224366s" podCreationTimestamp="2025-12-06 05:43:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:45:16.603894196 +0000 UTC m=+100.469105307" watchObservedRunningTime="2025-12-06 05:45:16.604224366 +0000 UTC m=+100.469435477" Dec 06 05:45:16 crc kubenswrapper[4733]: I1206 05:45:16.613444 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podStartSLOduration=82.613430706 podStartE2EDuration="1m22.613430706s" podCreationTimestamp="2025-12-06 05:43:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:45:16.613318736 +0000 UTC m=+100.478529847" watchObservedRunningTime="2025-12-06 05:45:16.613430706 +0000 UTC m=+100.478641817" Dec 06 05:45:16 crc kubenswrapper[4733]: I1206 05:45:16.641663 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=83.641646246 podStartE2EDuration="1m23.641646246s" podCreationTimestamp="2025-12-06 05:43:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:45:16.641405514 
+0000 UTC m=+100.506616625" watchObservedRunningTime="2025-12-06 05:45:16.641646246 +0000 UTC m=+100.506857357" Dec 06 05:45:16 crc kubenswrapper[4733]: I1206 05:45:16.646673 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:16 crc kubenswrapper[4733]: I1206 05:45:16.646825 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:16 crc kubenswrapper[4733]: I1206 05:45:16.646907 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:16 crc kubenswrapper[4733]: I1206 05:45:16.646973 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:16 crc kubenswrapper[4733]: I1206 05:45:16.647039 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:16Z","lastTransitionTime":"2025-12-06T05:45:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:16 crc kubenswrapper[4733]: I1206 05:45:16.666699 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-5mf9m" podStartSLOduration=82.66667077 podStartE2EDuration="1m22.66667077s" podCreationTimestamp="2025-12-06 05:43:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:45:16.665386861 +0000 UTC m=+100.530597982" watchObservedRunningTime="2025-12-06 05:45:16.66667077 +0000 UTC m=+100.531881881" Dec 06 05:45:16 crc kubenswrapper[4733]: I1206 05:45:16.674690 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q2ktk" podStartSLOduration=81.674667801 podStartE2EDuration="1m21.674667801s" podCreationTimestamp="2025-12-06 05:43:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:45:16.673926351 +0000 UTC m=+100.539137462" watchObservedRunningTime="2025-12-06 05:45:16.674667801 +0000 UTC m=+100.539878912" Dec 06 05:45:16 crc kubenswrapper[4733]: I1206 05:45:16.749082 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:16 crc kubenswrapper[4733]: I1206 05:45:16.749118 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:16 crc kubenswrapper[4733]: I1206 05:45:16.749131 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:16 crc kubenswrapper[4733]: I1206 05:45:16.749146 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:16 crc kubenswrapper[4733]: I1206 05:45:16.749157 4733 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:16Z","lastTransitionTime":"2025-12-06T05:45:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:45:16 crc kubenswrapper[4733]: I1206 05:45:16.850993 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:16 crc kubenswrapper[4733]: I1206 05:45:16.851041 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:16 crc kubenswrapper[4733]: I1206 05:45:16.851054 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:16 crc kubenswrapper[4733]: I1206 05:45:16.851069 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:16 crc kubenswrapper[4733]: I1206 05:45:16.851078 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:16Z","lastTransitionTime":"2025-12-06T05:45:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:16 crc kubenswrapper[4733]: I1206 05:45:16.952919 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:16 crc kubenswrapper[4733]: I1206 05:45:16.952953 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:16 crc kubenswrapper[4733]: I1206 05:45:16.952963 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:16 crc kubenswrapper[4733]: I1206 05:45:16.952975 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:16 crc kubenswrapper[4733]: I1206 05:45:16.952983 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:16Z","lastTransitionTime":"2025-12-06T05:45:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:17 crc kubenswrapper[4733]: I1206 05:45:17.054741 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:17 crc kubenswrapper[4733]: I1206 05:45:17.054772 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:17 crc kubenswrapper[4733]: I1206 05:45:17.054781 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:17 crc kubenswrapper[4733]: I1206 05:45:17.054792 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:17 crc kubenswrapper[4733]: I1206 05:45:17.054801 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:17Z","lastTransitionTime":"2025-12-06T05:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:17 crc kubenswrapper[4733]: I1206 05:45:17.156846 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:17 crc kubenswrapper[4733]: I1206 05:45:17.156885 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:17 crc kubenswrapper[4733]: I1206 05:45:17.156894 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:17 crc kubenswrapper[4733]: I1206 05:45:17.156908 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:17 crc kubenswrapper[4733]: I1206 05:45:17.156916 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:17Z","lastTransitionTime":"2025-12-06T05:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:17 crc kubenswrapper[4733]: I1206 05:45:17.258674 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:17 crc kubenswrapper[4733]: I1206 05:45:17.258712 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:17 crc kubenswrapper[4733]: I1206 05:45:17.258723 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:17 crc kubenswrapper[4733]: I1206 05:45:17.258737 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:17 crc kubenswrapper[4733]: I1206 05:45:17.258747 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:17Z","lastTransitionTime":"2025-12-06T05:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:17 crc kubenswrapper[4733]: I1206 05:45:17.360614 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:17 crc kubenswrapper[4733]: I1206 05:45:17.360651 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:17 crc kubenswrapper[4733]: I1206 05:45:17.360660 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:17 crc kubenswrapper[4733]: I1206 05:45:17.360675 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:17 crc kubenswrapper[4733]: I1206 05:45:17.360685 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:17Z","lastTransitionTime":"2025-12-06T05:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:17 crc kubenswrapper[4733]: I1206 05:45:17.463030 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:17 crc kubenswrapper[4733]: I1206 05:45:17.463084 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:17 crc kubenswrapper[4733]: I1206 05:45:17.463095 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:17 crc kubenswrapper[4733]: I1206 05:45:17.463116 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:17 crc kubenswrapper[4733]: I1206 05:45:17.463125 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:17Z","lastTransitionTime":"2025-12-06T05:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:17 crc kubenswrapper[4733]: I1206 05:45:17.565010 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:17 crc kubenswrapper[4733]: I1206 05:45:17.565042 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:17 crc kubenswrapper[4733]: I1206 05:45:17.565051 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:17 crc kubenswrapper[4733]: I1206 05:45:17.565063 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:17 crc kubenswrapper[4733]: I1206 05:45:17.565091 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:17Z","lastTransitionTime":"2025-12-06T05:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:17 crc kubenswrapper[4733]: I1206 05:45:17.666975 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:17 crc kubenswrapper[4733]: I1206 05:45:17.667006 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:17 crc kubenswrapper[4733]: I1206 05:45:17.667035 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:17 crc kubenswrapper[4733]: I1206 05:45:17.667046 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:17 crc kubenswrapper[4733]: I1206 05:45:17.667053 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:17Z","lastTransitionTime":"2025-12-06T05:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:17 crc kubenswrapper[4733]: I1206 05:45:17.768898 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:17 crc kubenswrapper[4733]: I1206 05:45:17.768940 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:17 crc kubenswrapper[4733]: I1206 05:45:17.768952 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:17 crc kubenswrapper[4733]: I1206 05:45:17.768967 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:17 crc kubenswrapper[4733]: I1206 05:45:17.768976 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:17Z","lastTransitionTime":"2025-12-06T05:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:17 crc kubenswrapper[4733]: I1206 05:45:17.871165 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:17 crc kubenswrapper[4733]: I1206 05:45:17.871210 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:17 crc kubenswrapper[4733]: I1206 05:45:17.871220 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:17 crc kubenswrapper[4733]: I1206 05:45:17.871234 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:17 crc kubenswrapper[4733]: I1206 05:45:17.871243 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:17Z","lastTransitionTime":"2025-12-06T05:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:17 crc kubenswrapper[4733]: I1206 05:45:17.972663 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:17 crc kubenswrapper[4733]: I1206 05:45:17.972722 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:17 crc kubenswrapper[4733]: I1206 05:45:17.972732 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:17 crc kubenswrapper[4733]: I1206 05:45:17.972754 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:17 crc kubenswrapper[4733]: I1206 05:45:17.972768 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:17Z","lastTransitionTime":"2025-12-06T05:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:18 crc kubenswrapper[4733]: I1206 05:45:18.074921 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:18 crc kubenswrapper[4733]: I1206 05:45:18.074944 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:18 crc kubenswrapper[4733]: I1206 05:45:18.074953 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:18 crc kubenswrapper[4733]: I1206 05:45:18.074965 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:18 crc kubenswrapper[4733]: I1206 05:45:18.074972 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:18Z","lastTransitionTime":"2025-12-06T05:45:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:18 crc kubenswrapper[4733]: I1206 05:45:18.176991 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:18 crc kubenswrapper[4733]: I1206 05:45:18.177037 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:18 crc kubenswrapper[4733]: I1206 05:45:18.177048 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:18 crc kubenswrapper[4733]: I1206 05:45:18.177066 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:18 crc kubenswrapper[4733]: I1206 05:45:18.177078 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:18Z","lastTransitionTime":"2025-12-06T05:45:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:18 crc kubenswrapper[4733]: I1206 05:45:18.279106 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:18 crc kubenswrapper[4733]: I1206 05:45:18.279163 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:18 crc kubenswrapper[4733]: I1206 05:45:18.279173 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:18 crc kubenswrapper[4733]: I1206 05:45:18.279188 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:18 crc kubenswrapper[4733]: I1206 05:45:18.279217 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:18Z","lastTransitionTime":"2025-12-06T05:45:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:18 crc kubenswrapper[4733]: I1206 05:45:18.381616 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:18 crc kubenswrapper[4733]: I1206 05:45:18.381647 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:18 crc kubenswrapper[4733]: I1206 05:45:18.381657 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:18 crc kubenswrapper[4733]: I1206 05:45:18.381671 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:18 crc kubenswrapper[4733]: I1206 05:45:18.381681 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:18Z","lastTransitionTime":"2025-12-06T05:45:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:45:18 crc kubenswrapper[4733]: I1206 05:45:18.484060 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8fw28" Dec 06 05:45:18 crc kubenswrapper[4733]: I1206 05:45:18.484172 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:18 crc kubenswrapper[4733]: I1206 05:45:18.484201 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:18 crc kubenswrapper[4733]: I1206 05:45:18.484211 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:18 crc kubenswrapper[4733]: I1206 05:45:18.484224 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:18 crc kubenswrapper[4733]: I1206 05:45:18.484060 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 05:45:18 crc kubenswrapper[4733]: I1206 05:45:18.484233 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:18Z","lastTransitionTime":"2025-12-06T05:45:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:45:18 crc kubenswrapper[4733]: I1206 05:45:18.484358 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:45:18 crc kubenswrapper[4733]: E1206 05:45:18.484270 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8fw28" podUID="7e8909c1-5ab7-4c3f-aba1-436c64849e8a" Dec 06 05:45:18 crc kubenswrapper[4733]: E1206 05:45:18.484368 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 05:45:18 crc kubenswrapper[4733]: E1206 05:45:18.484512 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 05:45:18 crc kubenswrapper[4733]: I1206 05:45:18.485025 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:45:18 crc kubenswrapper[4733]: E1206 05:45:18.485104 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 05:45:18 crc kubenswrapper[4733]: I1206 05:45:18.586432 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:18 crc kubenswrapper[4733]: I1206 05:45:18.586483 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:18 crc kubenswrapper[4733]: I1206 05:45:18.586492 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:18 crc kubenswrapper[4733]: I1206 05:45:18.586511 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:18 crc kubenswrapper[4733]: I1206 05:45:18.586555 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:18Z","lastTransitionTime":"2025-12-06T05:45:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:18 crc kubenswrapper[4733]: I1206 05:45:18.688622 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:18 crc kubenswrapper[4733]: I1206 05:45:18.688646 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:18 crc kubenswrapper[4733]: I1206 05:45:18.688654 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:18 crc kubenswrapper[4733]: I1206 05:45:18.688665 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:18 crc kubenswrapper[4733]: I1206 05:45:18.688672 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:18Z","lastTransitionTime":"2025-12-06T05:45:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:18 crc kubenswrapper[4733]: I1206 05:45:18.790881 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:18 crc kubenswrapper[4733]: I1206 05:45:18.790915 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:18 crc kubenswrapper[4733]: I1206 05:45:18.790924 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:18 crc kubenswrapper[4733]: I1206 05:45:18.790934 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:18 crc kubenswrapper[4733]: I1206 05:45:18.790941 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:18Z","lastTransitionTime":"2025-12-06T05:45:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:18 crc kubenswrapper[4733]: I1206 05:45:18.892611 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:18 crc kubenswrapper[4733]: I1206 05:45:18.892637 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:18 crc kubenswrapper[4733]: I1206 05:45:18.892647 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:18 crc kubenswrapper[4733]: I1206 05:45:18.892658 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:18 crc kubenswrapper[4733]: I1206 05:45:18.892666 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:18Z","lastTransitionTime":"2025-12-06T05:45:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:18 crc kubenswrapper[4733]: I1206 05:45:18.994225 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:18 crc kubenswrapper[4733]: I1206 05:45:18.994267 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:18 crc kubenswrapper[4733]: I1206 05:45:18.994277 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:18 crc kubenswrapper[4733]: I1206 05:45:18.994293 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:18 crc kubenswrapper[4733]: I1206 05:45:18.994321 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:18Z","lastTransitionTime":"2025-12-06T05:45:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:19 crc kubenswrapper[4733]: I1206 05:45:19.096438 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:19 crc kubenswrapper[4733]: I1206 05:45:19.096467 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:19 crc kubenswrapper[4733]: I1206 05:45:19.096476 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:19 crc kubenswrapper[4733]: I1206 05:45:19.096488 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:19 crc kubenswrapper[4733]: I1206 05:45:19.096496 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:19Z","lastTransitionTime":"2025-12-06T05:45:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:19 crc kubenswrapper[4733]: I1206 05:45:19.198014 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:19 crc kubenswrapper[4733]: I1206 05:45:19.198049 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:19 crc kubenswrapper[4733]: I1206 05:45:19.198059 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:19 crc kubenswrapper[4733]: I1206 05:45:19.198070 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:19 crc kubenswrapper[4733]: I1206 05:45:19.198079 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:19Z","lastTransitionTime":"2025-12-06T05:45:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:19 crc kubenswrapper[4733]: I1206 05:45:19.299776 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:19 crc kubenswrapper[4733]: I1206 05:45:19.299807 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:19 crc kubenswrapper[4733]: I1206 05:45:19.299815 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:19 crc kubenswrapper[4733]: I1206 05:45:19.299824 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:19 crc kubenswrapper[4733]: I1206 05:45:19.299831 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:19Z","lastTransitionTime":"2025-12-06T05:45:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 05:45:19 crc kubenswrapper[4733]: I1206 05:45:19.319742 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 05:45:19 crc kubenswrapper[4733]: I1206 05:45:19.319777 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 05:45:19 crc kubenswrapper[4733]: I1206 05:45:19.319787 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 05:45:19 crc kubenswrapper[4733]: I1206 05:45:19.319798 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 05:45:19 crc kubenswrapper[4733]: I1206 05:45:19.319808 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T05:45:19Z","lastTransitionTime":"2025-12-06T05:45:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 05:45:19 crc kubenswrapper[4733]: I1206 05:45:19.351364 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-m6th5"] Dec 06 05:45:19 crc kubenswrapper[4733]: I1206 05:45:19.351732 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m6th5" Dec 06 05:45:19 crc kubenswrapper[4733]: I1206 05:45:19.353283 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 06 05:45:19 crc kubenswrapper[4733]: I1206 05:45:19.353553 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 06 05:45:19 crc kubenswrapper[4733]: I1206 05:45:19.353686 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 06 05:45:19 crc kubenswrapper[4733]: I1206 05:45:19.353838 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 06 05:45:19 crc kubenswrapper[4733]: I1206 05:45:19.385904 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=3.385886845 podStartE2EDuration="3.385886845s" podCreationTimestamp="2025-12-06 05:45:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:45:19.384934609 +0000 UTC m=+103.250145730" watchObservedRunningTime="2025-12-06 05:45:19.385886845 +0000 UTC m=+103.251097956" Dec 06 05:45:19 crc kubenswrapper[4733]: I1206 05:45:19.411180 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9f44958d-342c-46af-aa03-7a19678b5b46-service-ca\") pod \"cluster-version-operator-5c965bbfc6-m6th5\" (UID: \"9f44958d-342c-46af-aa03-7a19678b5b46\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m6th5" Dec 06 05:45:19 crc kubenswrapper[4733]: I1206 05:45:19.411246 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/9f44958d-342c-46af-aa03-7a19678b5b46-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-m6th5\" (UID: \"9f44958d-342c-46af-aa03-7a19678b5b46\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m6th5" Dec 06 05:45:19 crc kubenswrapper[4733]: I1206 05:45:19.411269 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f44958d-342c-46af-aa03-7a19678b5b46-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-m6th5\" (UID: \"9f44958d-342c-46af-aa03-7a19678b5b46\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m6th5" Dec 06 05:45:19 crc kubenswrapper[4733]: I1206 05:45:19.411299 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9f44958d-342c-46af-aa03-7a19678b5b46-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-m6th5\" (UID: \"9f44958d-342c-46af-aa03-7a19678b5b46\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m6th5" Dec 06 05:45:19 crc kubenswrapper[4733]: I1206 05:45:19.411498 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/9f44958d-342c-46af-aa03-7a19678b5b46-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-m6th5\" (UID: \"9f44958d-342c-46af-aa03-7a19678b5b46\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m6th5" Dec 06 05:45:19 crc kubenswrapper[4733]: I1206 05:45:19.512613 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/9f44958d-342c-46af-aa03-7a19678b5b46-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-m6th5\" (UID: 
\"9f44958d-342c-46af-aa03-7a19678b5b46\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m6th5" Dec 06 05:45:19 crc kubenswrapper[4733]: I1206 05:45:19.512680 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9f44958d-342c-46af-aa03-7a19678b5b46-service-ca\") pod \"cluster-version-operator-5c965bbfc6-m6th5\" (UID: \"9f44958d-342c-46af-aa03-7a19678b5b46\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m6th5" Dec 06 05:45:19 crc kubenswrapper[4733]: I1206 05:45:19.512716 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/9f44958d-342c-46af-aa03-7a19678b5b46-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-m6th5\" (UID: \"9f44958d-342c-46af-aa03-7a19678b5b46\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m6th5" Dec 06 05:45:19 crc kubenswrapper[4733]: I1206 05:45:19.512740 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f44958d-342c-46af-aa03-7a19678b5b46-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-m6th5\" (UID: \"9f44958d-342c-46af-aa03-7a19678b5b46\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m6th5" Dec 06 05:45:19 crc kubenswrapper[4733]: I1206 05:45:19.512766 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9f44958d-342c-46af-aa03-7a19678b5b46-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-m6th5\" (UID: \"9f44958d-342c-46af-aa03-7a19678b5b46\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m6th5" Dec 06 05:45:19 crc kubenswrapper[4733]: I1206 05:45:19.512784 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" 
(UniqueName: \"kubernetes.io/host-path/9f44958d-342c-46af-aa03-7a19678b5b46-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-m6th5\" (UID: \"9f44958d-342c-46af-aa03-7a19678b5b46\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m6th5" Dec 06 05:45:19 crc kubenswrapper[4733]: I1206 05:45:19.512833 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/9f44958d-342c-46af-aa03-7a19678b5b46-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-m6th5\" (UID: \"9f44958d-342c-46af-aa03-7a19678b5b46\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m6th5" Dec 06 05:45:19 crc kubenswrapper[4733]: I1206 05:45:19.513563 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9f44958d-342c-46af-aa03-7a19678b5b46-service-ca\") pod \"cluster-version-operator-5c965bbfc6-m6th5\" (UID: \"9f44958d-342c-46af-aa03-7a19678b5b46\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m6th5" Dec 06 05:45:19 crc kubenswrapper[4733]: I1206 05:45:19.517449 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f44958d-342c-46af-aa03-7a19678b5b46-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-m6th5\" (UID: \"9f44958d-342c-46af-aa03-7a19678b5b46\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m6th5" Dec 06 05:45:19 crc kubenswrapper[4733]: I1206 05:45:19.528177 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9f44958d-342c-46af-aa03-7a19678b5b46-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-m6th5\" (UID: \"9f44958d-342c-46af-aa03-7a19678b5b46\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m6th5" Dec 06 05:45:19 crc 
kubenswrapper[4733]: I1206 05:45:19.665657 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m6th5" Dec 06 05:45:19 crc kubenswrapper[4733]: I1206 05:45:19.914450 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m6th5" event={"ID":"9f44958d-342c-46af-aa03-7a19678b5b46","Type":"ContainerStarted","Data":"8cc73cf813acd118495c50e18f0cfeeab03cded56d1cb085d11fe7adba2705cf"} Dec 06 05:45:19 crc kubenswrapper[4733]: I1206 05:45:19.914502 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m6th5" event={"ID":"9f44958d-342c-46af-aa03-7a19678b5b46","Type":"ContainerStarted","Data":"9bfb3b8022c76d05e442713269294a0206e9cd848a3208690165b1b8b4b4ce2b"} Dec 06 05:45:19 crc kubenswrapper[4733]: I1206 05:45:19.924537 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m6th5" podStartSLOduration=85.924520125 podStartE2EDuration="1m25.924520125s" podCreationTimestamp="2025-12-06 05:43:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:45:19.923718801 +0000 UTC m=+103.788929912" watchObservedRunningTime="2025-12-06 05:45:19.924520125 +0000 UTC m=+103.789731236" Dec 06 05:45:20 crc kubenswrapper[4733]: I1206 05:45:20.483993 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8fw28" Dec 06 05:45:20 crc kubenswrapper[4733]: I1206 05:45:20.483993 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 05:45:20 crc kubenswrapper[4733]: I1206 05:45:20.484127 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:45:20 crc kubenswrapper[4733]: I1206 05:45:20.484249 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:45:20 crc kubenswrapper[4733]: E1206 05:45:20.484285 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8fw28" podUID="7e8909c1-5ab7-4c3f-aba1-436c64849e8a" Dec 06 05:45:20 crc kubenswrapper[4733]: E1206 05:45:20.484470 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 05:45:20 crc kubenswrapper[4733]: E1206 05:45:20.484562 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 05:45:20 crc kubenswrapper[4733]: E1206 05:45:20.484645 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 05:45:22 crc kubenswrapper[4733]: I1206 05:45:22.484153 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:45:22 crc kubenswrapper[4733]: I1206 05:45:22.484212 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8fw28" Dec 06 05:45:22 crc kubenswrapper[4733]: I1206 05:45:22.484152 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:45:22 crc kubenswrapper[4733]: E1206 05:45:22.484281 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 05:45:22 crc kubenswrapper[4733]: I1206 05:45:22.484153 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 05:45:22 crc kubenswrapper[4733]: E1206 05:45:22.484374 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8fw28" podUID="7e8909c1-5ab7-4c3f-aba1-436c64849e8a" Dec 06 05:45:22 crc kubenswrapper[4733]: E1206 05:45:22.484417 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 05:45:22 crc kubenswrapper[4733]: E1206 05:45:22.484453 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 05:45:23 crc kubenswrapper[4733]: I1206 05:45:23.484568 4733 scope.go:117] "RemoveContainer" containerID="5d6353ff5837029f85cdae65e1200483030eeb8cb05c63bd255a459d79a91ef0" Dec 06 05:45:23 crc kubenswrapper[4733]: E1206 05:45:23.484711 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-2gb79_openshift-ovn-kubernetes(171aa174-9338-4421-8393-9e23fbab7f1e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" podUID="171aa174-9338-4421-8393-9e23fbab7f1e" Dec 06 05:45:24 crc kubenswrapper[4733]: I1206 05:45:24.484270 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:45:24 crc kubenswrapper[4733]: I1206 05:45:24.484327 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:45:24 crc kubenswrapper[4733]: I1206 05:45:24.484390 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 05:45:24 crc kubenswrapper[4733]: E1206 05:45:24.484390 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 05:45:24 crc kubenswrapper[4733]: I1206 05:45:24.484404 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8fw28" Dec 06 05:45:24 crc kubenswrapper[4733]: E1206 05:45:24.484453 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 05:45:24 crc kubenswrapper[4733]: E1206 05:45:24.484579 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8fw28" podUID="7e8909c1-5ab7-4c3f-aba1-436c64849e8a" Dec 06 05:45:24 crc kubenswrapper[4733]: E1206 05:45:24.484605 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 05:45:26 crc kubenswrapper[4733]: I1206 05:45:26.484193 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8fw28" Dec 06 05:45:26 crc kubenswrapper[4733]: I1206 05:45:26.484283 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 05:45:26 crc kubenswrapper[4733]: I1206 05:45:26.484355 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:45:26 crc kubenswrapper[4733]: I1206 05:45:26.484368 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:45:26 crc kubenswrapper[4733]: E1206 05:45:26.485127 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8fw28" podUID="7e8909c1-5ab7-4c3f-aba1-436c64849e8a" Dec 06 05:45:26 crc kubenswrapper[4733]: E1206 05:45:26.485278 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 05:45:26 crc kubenswrapper[4733]: E1206 05:45:26.485356 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 05:45:26 crc kubenswrapper[4733]: E1206 05:45:26.485407 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 05:45:27 crc kubenswrapper[4733]: I1206 05:45:27.933938 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-684r5_cc59542d-ee4a-414d-b096-86716cb56db5/kube-multus/1.log" Dec 06 05:45:27 crc kubenswrapper[4733]: I1206 05:45:27.934485 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-684r5_cc59542d-ee4a-414d-b096-86716cb56db5/kube-multus/0.log" Dec 06 05:45:27 crc kubenswrapper[4733]: I1206 05:45:27.934521 4733 generic.go:334] "Generic (PLEG): container finished" podID="cc59542d-ee4a-414d-b096-86716cb56db5" containerID="238d1b3c645ca54e851f02ddb12c90bfcd039e6973993a7693cc9520d5268496" exitCode=1 Dec 06 05:45:27 crc kubenswrapper[4733]: I1206 05:45:27.934549 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-684r5" event={"ID":"cc59542d-ee4a-414d-b096-86716cb56db5","Type":"ContainerDied","Data":"238d1b3c645ca54e851f02ddb12c90bfcd039e6973993a7693cc9520d5268496"} Dec 06 05:45:27 crc kubenswrapper[4733]: I1206 05:45:27.934579 4733 scope.go:117] "RemoveContainer" containerID="d7128ab1b2f48b8ce3ecf3a2154cb1b1dc93a58cdfed2c11e7724201a5675ea3" Dec 06 05:45:27 crc kubenswrapper[4733]: I1206 05:45:27.934889 4733 scope.go:117] "RemoveContainer" containerID="238d1b3c645ca54e851f02ddb12c90bfcd039e6973993a7693cc9520d5268496" Dec 06 05:45:27 crc kubenswrapper[4733]: E1206 
05:45:27.935023 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-684r5_openshift-multus(cc59542d-ee4a-414d-b096-86716cb56db5)\"" pod="openshift-multus/multus-684r5" podUID="cc59542d-ee4a-414d-b096-86716cb56db5" Dec 06 05:45:28 crc kubenswrapper[4733]: I1206 05:45:28.484783 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:45:28 crc kubenswrapper[4733]: I1206 05:45:28.484819 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8fw28" Dec 06 05:45:28 crc kubenswrapper[4733]: I1206 05:45:28.484832 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:45:28 crc kubenswrapper[4733]: E1206 05:45:28.484932 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 05:45:28 crc kubenswrapper[4733]: I1206 05:45:28.485013 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 05:45:28 crc kubenswrapper[4733]: E1206 05:45:28.485067 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 05:45:28 crc kubenswrapper[4733]: E1206 05:45:28.485126 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 05:45:28 crc kubenswrapper[4733]: E1206 05:45:28.485270 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8fw28" podUID="7e8909c1-5ab7-4c3f-aba1-436c64849e8a" Dec 06 05:45:28 crc kubenswrapper[4733]: I1206 05:45:28.937921 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-684r5_cc59542d-ee4a-414d-b096-86716cb56db5/kube-multus/1.log" Dec 06 05:45:30 crc kubenswrapper[4733]: I1206 05:45:30.484595 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 05:45:30 crc kubenswrapper[4733]: I1206 05:45:30.484651 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:45:30 crc kubenswrapper[4733]: I1206 05:45:30.484749 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8fw28" Dec 06 05:45:30 crc kubenswrapper[4733]: E1206 05:45:30.484748 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 05:45:30 crc kubenswrapper[4733]: E1206 05:45:30.484858 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8fw28" podUID="7e8909c1-5ab7-4c3f-aba1-436c64849e8a" Dec 06 05:45:30 crc kubenswrapper[4733]: E1206 05:45:30.484938 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 05:45:30 crc kubenswrapper[4733]: I1206 05:45:30.485083 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:45:30 crc kubenswrapper[4733]: E1206 05:45:30.485138 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 05:45:32 crc kubenswrapper[4733]: I1206 05:45:32.484111 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 05:45:32 crc kubenswrapper[4733]: I1206 05:45:32.484203 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:45:32 crc kubenswrapper[4733]: I1206 05:45:32.484268 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:45:32 crc kubenswrapper[4733]: I1206 05:45:32.484283 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8fw28" Dec 06 05:45:32 crc kubenswrapper[4733]: E1206 05:45:32.484386 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 05:45:32 crc kubenswrapper[4733]: E1206 05:45:32.484503 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 05:45:32 crc kubenswrapper[4733]: E1206 05:45:32.484555 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 05:45:32 crc kubenswrapper[4733]: E1206 05:45:32.484606 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8fw28" podUID="7e8909c1-5ab7-4c3f-aba1-436c64849e8a" Dec 06 05:45:34 crc kubenswrapper[4733]: I1206 05:45:34.484586 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 05:45:34 crc kubenswrapper[4733]: I1206 05:45:34.484624 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8fw28" Dec 06 05:45:34 crc kubenswrapper[4733]: E1206 05:45:34.484699 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 05:45:34 crc kubenswrapper[4733]: I1206 05:45:34.484720 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:45:34 crc kubenswrapper[4733]: E1206 05:45:34.484841 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8fw28" podUID="7e8909c1-5ab7-4c3f-aba1-436c64849e8a" Dec 06 05:45:34 crc kubenswrapper[4733]: E1206 05:45:34.484897 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 05:45:34 crc kubenswrapper[4733]: I1206 05:45:34.485167 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:45:34 crc kubenswrapper[4733]: E1206 05:45:34.485258 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 05:45:36 crc kubenswrapper[4733]: E1206 05:45:36.444933 4733 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Dec 06 05:45:36 crc kubenswrapper[4733]: I1206 05:45:36.484021 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:45:36 crc kubenswrapper[4733]: I1206 05:45:36.484051 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:45:36 crc kubenswrapper[4733]: I1206 05:45:36.484100 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 05:45:36 crc kubenswrapper[4733]: I1206 05:45:36.484798 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8fw28" Dec 06 05:45:36 crc kubenswrapper[4733]: E1206 05:45:36.484794 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 05:45:36 crc kubenswrapper[4733]: E1206 05:45:36.484848 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 05:45:36 crc kubenswrapper[4733]: I1206 05:45:36.485293 4733 scope.go:117] "RemoveContainer" containerID="5d6353ff5837029f85cdae65e1200483030eeb8cb05c63bd255a459d79a91ef0" Dec 06 05:45:36 crc kubenswrapper[4733]: E1206 05:45:36.485276 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8fw28" podUID="7e8909c1-5ab7-4c3f-aba1-436c64849e8a" Dec 06 05:45:36 crc kubenswrapper[4733]: E1206 05:45:36.485327 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 05:45:36 crc kubenswrapper[4733]: E1206 05:45:36.556890 4733 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Dec 06 05:45:36 crc kubenswrapper[4733]: I1206 05:45:36.963652 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2gb79_171aa174-9338-4421-8393-9e23fbab7f1e/ovnkube-controller/3.log" Dec 06 05:45:36 crc kubenswrapper[4733]: I1206 05:45:36.966382 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" event={"ID":"171aa174-9338-4421-8393-9e23fbab7f1e","Type":"ContainerStarted","Data":"d78589d529f70a3ceb873f223ab2f95170d1ec069647c1cbdd258ed48da5d87d"} Dec 06 05:45:36 crc kubenswrapper[4733]: I1206 05:45:36.967289 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:45:37 crc kubenswrapper[4733]: I1206 05:45:37.163747 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" podStartSLOduration=103.163724387 podStartE2EDuration="1m43.163724387s" podCreationTimestamp="2025-12-06 05:43:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:45:36.995768637 +0000 UTC m=+120.860979748" watchObservedRunningTime="2025-12-06 05:45:37.163724387 +0000 UTC m=+121.028935498" Dec 06 05:45:37 crc kubenswrapper[4733]: I1206 05:45:37.164615 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8fw28"] Dec 06 05:45:37 crc kubenswrapper[4733]: I1206 05:45:37.164708 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8fw28" Dec 06 05:45:37 crc kubenswrapper[4733]: E1206 05:45:37.164790 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8fw28" podUID="7e8909c1-5ab7-4c3f-aba1-436c64849e8a" Dec 06 05:45:38 crc kubenswrapper[4733]: I1206 05:45:38.484478 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 05:45:38 crc kubenswrapper[4733]: I1206 05:45:38.484479 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:45:38 crc kubenswrapper[4733]: E1206 05:45:38.484598 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 05:45:38 crc kubenswrapper[4733]: E1206 05:45:38.484717 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 05:45:38 crc kubenswrapper[4733]: I1206 05:45:38.484850 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:45:38 crc kubenswrapper[4733]: E1206 05:45:38.484915 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 05:45:39 crc kubenswrapper[4733]: I1206 05:45:39.484338 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8fw28" Dec 06 05:45:39 crc kubenswrapper[4733]: E1206 05:45:39.484456 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8fw28" podUID="7e8909c1-5ab7-4c3f-aba1-436c64849e8a" Dec 06 05:45:40 crc kubenswrapper[4733]: I1206 05:45:40.484215 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:45:40 crc kubenswrapper[4733]: E1206 05:45:40.484433 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 05:45:40 crc kubenswrapper[4733]: I1206 05:45:40.484242 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:45:40 crc kubenswrapper[4733]: I1206 05:45:40.484636 4733 scope.go:117] "RemoveContainer" containerID="238d1b3c645ca54e851f02ddb12c90bfcd039e6973993a7693cc9520d5268496" Dec 06 05:45:40 crc kubenswrapper[4733]: I1206 05:45:40.484731 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 05:45:40 crc kubenswrapper[4733]: E1206 05:45:40.485013 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 05:45:40 crc kubenswrapper[4733]: E1206 05:45:40.485062 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 05:45:40 crc kubenswrapper[4733]: I1206 05:45:40.980695 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-684r5_cc59542d-ee4a-414d-b096-86716cb56db5/kube-multus/1.log" Dec 06 05:45:40 crc kubenswrapper[4733]: I1206 05:45:40.981025 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-684r5" event={"ID":"cc59542d-ee4a-414d-b096-86716cb56db5","Type":"ContainerStarted","Data":"3e3a4017a1965fad5e1ee690625749a1c72a2b0c524e4286a0f34a7ec6c233f6"} Dec 06 05:45:41 crc kubenswrapper[4733]: I1206 05:45:41.484020 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8fw28" Dec 06 05:45:41 crc kubenswrapper[4733]: E1206 05:45:41.484145 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8fw28" podUID="7e8909c1-5ab7-4c3f-aba1-436c64849e8a" Dec 06 05:45:41 crc kubenswrapper[4733]: E1206 05:45:41.558134 4733 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 06 05:45:42 crc kubenswrapper[4733]: I1206 05:45:42.484141 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 05:45:42 crc kubenswrapper[4733]: I1206 05:45:42.484189 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:45:42 crc kubenswrapper[4733]: E1206 05:45:42.484273 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 05:45:42 crc kubenswrapper[4733]: I1206 05:45:42.484141 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:45:42 crc kubenswrapper[4733]: E1206 05:45:42.484409 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 05:45:42 crc kubenswrapper[4733]: E1206 05:45:42.484640 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 05:45:43 crc kubenswrapper[4733]: I1206 05:45:43.484776 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8fw28" Dec 06 05:45:43 crc kubenswrapper[4733]: E1206 05:45:43.484927 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8fw28" podUID="7e8909c1-5ab7-4c3f-aba1-436c64849e8a" Dec 06 05:45:44 crc kubenswrapper[4733]: I1206 05:45:44.483831 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:45:44 crc kubenswrapper[4733]: I1206 05:45:44.483900 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 05:45:44 crc kubenswrapper[4733]: I1206 05:45:44.483923 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:45:44 crc kubenswrapper[4733]: E1206 05:45:44.484008 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 05:45:44 crc kubenswrapper[4733]: E1206 05:45:44.484157 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 05:45:44 crc kubenswrapper[4733]: E1206 05:45:44.484345 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 05:45:45 crc kubenswrapper[4733]: I1206 05:45:45.484577 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8fw28" Dec 06 05:45:45 crc kubenswrapper[4733]: E1206 05:45:45.484674 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8fw28" podUID="7e8909c1-5ab7-4c3f-aba1-436c64849e8a" Dec 06 05:45:46 crc kubenswrapper[4733]: I1206 05:45:46.283613 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:45:46 crc kubenswrapper[4733]: I1206 05:45:46.484741 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 05:45:46 crc kubenswrapper[4733]: I1206 05:45:46.484802 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:45:46 crc kubenswrapper[4733]: I1206 05:45:46.484808 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:45:46 crc kubenswrapper[4733]: E1206 05:45:46.484950 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 05:45:46 crc kubenswrapper[4733]: E1206 05:45:46.485038 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 05:45:46 crc kubenswrapper[4733]: E1206 05:45:46.485134 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 05:45:47 crc kubenswrapper[4733]: I1206 05:45:47.484153 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8fw28"
Dec 06 05:45:47 crc kubenswrapper[4733]: I1206 05:45:47.487380 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Dec 06 05:45:47 crc kubenswrapper[4733]: I1206 05:45:47.488277 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Dec 06 05:45:48 crc kubenswrapper[4733]: I1206 05:45:48.484479 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 06 05:45:48 crc kubenswrapper[4733]: I1206 05:45:48.484479 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 06 05:45:48 crc kubenswrapper[4733]: I1206 05:45:48.484492 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 06 05:45:48 crc kubenswrapper[4733]: I1206 05:45:48.487742 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Dec 06 05:45:48 crc kubenswrapper[4733]: I1206 05:45:48.487934 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Dec 06 05:45:48 crc kubenswrapper[4733]: I1206 05:45:48.488506 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Dec 06 05:45:48 crc kubenswrapper[4733]: I1206 05:45:48.488556 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.084291 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.111562 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-hxt8v"]
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.112048 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-hxt8v"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.112210 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-5pgn9"]
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.115954 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-jpw8l"]
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.116740 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-jpw8l"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.117026 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5pgn9"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.124690 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.124845 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.125257 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.125391 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.125499 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-6l9dt"]
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.125610 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.125656 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.125829 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.126033 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.126085 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-6l9dt"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.126706 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-97hmm"]
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.127019 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-7tsr2"]
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.127269 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.127377 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-7tsr2"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.127464 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-97hmm"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.127484 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.127773 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.127825 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.128485 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.128594 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-dk7bw"]
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.131915 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.132125 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4799x"]
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.132220 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.132386 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.132423 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.132449 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4799x"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.132553 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.132664 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.132391 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-fx652"]
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.132674 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dk7bw"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.132971 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.133112 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.133167 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-b8mkn"]
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.133200 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.133279 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.133377 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.133448 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.133538 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-b8mkn"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.133757 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fx652"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.134546 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.134631 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.136650 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.138149 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8mlc5"]
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.138477 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8mlc5"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.138895 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-tbqkj"]
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.139265 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-tbqkj"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.139810 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-8fm22"]
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.140139 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-8fm22"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.140421 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-l8cj4"]
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.140687 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-l8cj4"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.141498 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-5rtwt"]
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.141841 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kj8gq"]
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.143158 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-5rtwt"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.143781 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-drkhf"]
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.143874 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kj8gq"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.144251 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-mgwdw"]
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.144468 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-drkhf"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.144660 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-ntght"]
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.144797 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mgwdw"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.145053 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4cpb6"]
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.145208 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-ntght"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.145570 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4cpb6"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.145750 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-m2w78"]
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.146095 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-m2w78"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.153836 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-jfdvl"]
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.155238 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-7rlhm"]
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.155970 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-t668l"]
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.156084 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jfdvl"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.156180 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.162518 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-t668l"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.168245 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.168530 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7njc6"]
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.177968 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-m8ljh"]
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.178244 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7njc6"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.178961 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-m8ljh"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.179585 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.179968 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.180072 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.180318 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.183316 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v42cq"]
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.183795 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-x4xrx"]
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.184265 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-kb6cg"]
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.184666 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v42cq"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.184786 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-x4xrx"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.184674 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-kb6cg"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.192954 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.193113 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.193498 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.194141 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.194466 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-t8xrv"]
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.194530 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.194648 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.194999 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-t8xrv"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.202205 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.202452 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.202626 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.202772 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.202883 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.202980 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.203168 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.203277 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.203392 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.203481 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.203589 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.203682 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.203769 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.203841 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.205381 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kr5j"]
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.205612 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.205971 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.206121 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kr5j"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.206455 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.213394 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416665-87xkl"]
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.214125 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-jpw8l"]
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.214240 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416665-87xkl"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.214254 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-fbnvh"]
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.214999 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-fbnvh"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.215910 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dlzz7"]
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.216481 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dlzz7"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.216656 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xqz8g"]
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.217282 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xqz8g"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.218271 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-tdjqn"]
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.218877 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-tdjqn"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.218995 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-bbq6p"]
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.219808 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-bbq6p"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.221463 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-7tsr2"]
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.228568 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.228802 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.228937 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.228453 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-dk7bw"]
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.233397 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4799x"]
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.233412 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-52jkz"]
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.234022 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mtlfg"]
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.234712 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mtlfg"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.234942 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-52jkz"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.239273 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-5pgn9"]
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.240190 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-6l9dt"]
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.254761 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.255547 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.255676 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.257154 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.258804 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.259176 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.259590 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-hxt8v"]
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.260188 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.261467 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.263547 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.263674 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.263803 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.263963 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.264002 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.264122 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.264169 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.264348 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.264487 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.263548 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.266391 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.267006 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.267800 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.267812 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.268166 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.268345 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.268467 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.269726 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.270081 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.270155 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.270087 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.279415 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.279480 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.279522 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.279574 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.280391 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.280444 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kj8gq"]
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.280497 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.281363 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-mgwdw"]
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.281573 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.282174 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.282659 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-8fm22"]
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.282695 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.282974 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.283817 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8mlc5"]
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.284936 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.285094 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gswd9\" (UniqueName: \"kubernetes.io/projected/549f5614-6b98-454e-970a-e623fd4ec9a8-kube-api-access-gswd9\") pod \"openshift-controller-manager-operator-756b6f6bc6-kj8gq\" (UID: \"549f5614-6b98-454e-970a-e623fd4ec9a8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kj8gq"
Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.285131 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/549f5614-6b98-454e-970a-e623fd4ec9a8-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-kj8gq\" (UID: \"549f5614-6b98-454e-970a-e623fd4ec9a8\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kj8gq" Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.285176 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/549f5614-6b98-454e-970a-e623fd4ec9a8-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-kj8gq\" (UID: \"549f5614-6b98-454e-970a-e623fd4ec9a8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kj8gq" Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.285652 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-drkhf"] Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.287279 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-jfdvl"] Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.288892 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.292449 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.292536 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.292624 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.292697 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.295583 4733 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.297110 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4cpb6"] Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.297931 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-m8ljh"] Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.298077 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.299843 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.301155 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7njc6"] Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.305698 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-ntght"] Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.306263 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-5rtwt"] Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.308072 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-fx652"] Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.309099 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-m2w78"] Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.310140 4733 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-t8xrv"] Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.310979 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-kb6cg"] Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.311848 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-l8cj4"] Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.313628 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.318035 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-rlgkt"] Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.319463 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-rlgkt" Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.323373 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-cdfjx"] Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.324199 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-cdfjx" Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.326164 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-tbqkj"] Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.327493 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v42cq"] Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.328636 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-b8mkn"] Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.329550 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416665-87xkl"] Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.331909 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kr5j"] Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.333787 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-x4xrx"] Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.333968 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.335604 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-52jkz"] Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.336695 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xqz8g"] Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.340628 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-service-ca/service-ca-9c57cc56f-tdjqn"] Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.342416 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-7rlhm"] Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.343644 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dlzz7"] Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.344434 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-bbq6p"] Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.345681 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mtlfg"] Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.346983 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-fbnvh"] Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.348540 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-rlgkt"] Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.349502 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-mjw9v"] Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.350212 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-mjw9v" Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.350520 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-mjw9v"] Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.354041 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.374355 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.385734 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/549f5614-6b98-454e-970a-e623fd4ec9a8-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-kj8gq\" (UID: \"549f5614-6b98-454e-970a-e623fd4ec9a8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kj8gq" Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.385800 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gswd9\" (UniqueName: \"kubernetes.io/projected/549f5614-6b98-454e-970a-e623fd4ec9a8-kube-api-access-gswd9\") pod \"openshift-controller-manager-operator-756b6f6bc6-kj8gq\" (UID: \"549f5614-6b98-454e-970a-e623fd4ec9a8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kj8gq" Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.385846 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/549f5614-6b98-454e-970a-e623fd4ec9a8-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-kj8gq\" (UID: \"549f5614-6b98-454e-970a-e623fd4ec9a8\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kj8gq" Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.386688 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/549f5614-6b98-454e-970a-e623fd4ec9a8-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-kj8gq\" (UID: \"549f5614-6b98-454e-970a-e623fd4ec9a8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kj8gq" Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.390707 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/549f5614-6b98-454e-970a-e623fd4ec9a8-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-kj8gq\" (UID: \"549f5614-6b98-454e-970a-e623fd4ec9a8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kj8gq" Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.400608 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.414107 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.434370 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.454886 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.474336 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.494363 4733 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.514600 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.534295 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.554409 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.574609 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.594274 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.614163 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.634632 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.653961 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.674352 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.694703 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 06 05:45:50 crc kubenswrapper[4733]: 
I1206 05:45:50.713900 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.734869 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.754598 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.774349 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.793936 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.814392 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.833993 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.853842 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.874268 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.893884 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.915774 4733 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.934611 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.954517 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.973797 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 06 05:45:50 crc kubenswrapper[4733]: I1206 05:45:50.994803 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 06 05:45:51 crc kubenswrapper[4733]: I1206 05:45:51.014297 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 06 05:45:51 crc kubenswrapper[4733]: I1206 05:45:51.033961 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 06 05:45:51 crc kubenswrapper[4733]: I1206 05:45:51.054460 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 06 05:45:51 crc kubenswrapper[4733]: I1206 05:45:51.074004 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 06 05:45:51 crc kubenswrapper[4733]: I1206 05:45:51.094869 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 06 05:45:51 crc kubenswrapper[4733]: I1206 05:45:51.114176 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 06 05:45:51 crc 
kubenswrapper[4733]: I1206 05:45:51.134435 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 06 05:45:51 crc kubenswrapper[4733]: I1206 05:45:51.154589 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 06 05:45:51 crc kubenswrapper[4733]: I1206 05:45:51.173903 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 06 05:45:51 crc kubenswrapper[4733]: I1206 05:45:51.193380 4733 request.go:700] Waited for 1.008282294s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-service-ca-operator/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0 Dec 06 05:45:51 crc kubenswrapper[4733]: I1206 05:45:51.194452 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 06 05:45:51 crc kubenswrapper[4733]: I1206 05:45:51.214116 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 06 05:45:51 crc kubenswrapper[4733]: I1206 05:45:51.234654 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 06 05:45:51 crc kubenswrapper[4733]: I1206 05:45:51.253849 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 06 05:45:51 crc kubenswrapper[4733]: I1206 05:45:51.275533 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 06 05:45:51 crc kubenswrapper[4733]: I1206 05:45:51.294027 4733 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 06 05:45:51 crc kubenswrapper[4733]: I1206 05:45:51.314101 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 06 05:45:51 crc kubenswrapper[4733]: I1206 05:45:51.334408 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 06 05:45:51 crc kubenswrapper[4733]: I1206 05:45:51.374552 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 06 05:45:51 crc kubenswrapper[4733]: I1206 05:45:51.394481 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 06 05:45:51 crc kubenswrapper[4733]: I1206 05:45:51.413995 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 06 05:45:51 crc kubenswrapper[4733]: I1206 05:45:51.433971 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 06 05:45:51 crc kubenswrapper[4733]: I1206 05:45:51.453687 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 06 05:45:51 crc kubenswrapper[4733]: I1206 05:45:51.474919 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 06 05:45:51 crc kubenswrapper[4733]: I1206 05:45:51.494621 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 06 05:45:51 crc kubenswrapper[4733]: I1206 05:45:51.514089 4733 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-marketplace"/"kube-root-ca.crt" Dec 06 05:45:51 crc kubenswrapper[4733]: I1206 05:45:51.534548 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 06 05:45:51 crc kubenswrapper[4733]: I1206 05:45:51.554047 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 06 05:45:51 crc kubenswrapper[4733]: I1206 05:45:51.574155 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 06 05:45:51 crc kubenswrapper[4733]: I1206 05:45:51.593889 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 06 05:45:51 crc kubenswrapper[4733]: I1206 05:45:51.614385 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 06 05:45:51 crc kubenswrapper[4733]: I1206 05:45:51.640484 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 06 05:45:51 crc kubenswrapper[4733]: I1206 05:45:51.654299 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 06 05:45:51 crc kubenswrapper[4733]: I1206 05:45:51.674941 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 06 05:45:51 crc kubenswrapper[4733]: I1206 05:45:51.694743 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 06 05:45:51 crc kubenswrapper[4733]: I1206 05:45:51.714152 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 06 
05:45:51 crc kubenswrapper[4733]: I1206 05:45:51.734462 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 06 05:45:51 crc kubenswrapper[4733]: I1206 05:45:51.754837 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 06 05:45:51 crc kubenswrapper[4733]: I1206 05:45:51.774356 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 06 05:45:51 crc kubenswrapper[4733]: I1206 05:45:51.793895 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 06 05:45:51 crc kubenswrapper[4733]: I1206 05:45:51.814664 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 06 05:45:51 crc kubenswrapper[4733]: I1206 05:45:51.834521 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 06 05:45:51 crc kubenswrapper[4733]: I1206 05:45:51.853921 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 06 05:45:51 crc kubenswrapper[4733]: I1206 05:45:51.874373 4733 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 06 05:45:51 crc kubenswrapper[4733]: I1206 05:45:51.894149 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 06 05:45:51 crc kubenswrapper[4733]: I1206 05:45:51.914952 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 06 05:45:51 crc kubenswrapper[4733]: I1206 05:45:51.934206 4733 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 06 05:45:51 crc kubenswrapper[4733]: I1206 05:45:51.974045 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 06 05:45:51 crc kubenswrapper[4733]: I1206 05:45:51.993939 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.014376 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.034152 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.054638 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.073835 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.094656 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.114325 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.134041 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.154482 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.188424 4733 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gswd9\" (UniqueName: \"kubernetes.io/projected/549f5614-6b98-454e-970a-e623fd4ec9a8-kube-api-access-gswd9\") pod \"openshift-controller-manager-operator-756b6f6bc6-kj8gq\" (UID: \"549f5614-6b98-454e-970a-e623fd4ec9a8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kj8gq" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.199516 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/197ee618-405f-4f94-a618-da74488f0d23-serving-cert\") pod \"route-controller-manager-6576b87f9c-fx652\" (UID: \"197ee618-405f-4f94-a618-da74488f0d23\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fx652" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.199562 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/612ba8e8-eee6-4b30-967b-d838fb05147e-config\") pod \"kube-controller-manager-operator-78b949d7b-drkhf\" (UID: \"612ba8e8-eee6-4b30-967b-d838fb05147e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-drkhf" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.199609 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1646b426-e8f8-48bd-83b1-919eb5c8466f-config\") pod \"machine-approver-56656f9798-97hmm\" (UID: \"1646b426-e8f8-48bd-83b1-919eb5c8466f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-97hmm" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.199646 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/a420a769-7492-4f06-ad4a-f4126e155429-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-4799x\" (UID: \"a420a769-7492-4f06-ad4a-f4126e155429\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4799x" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.199703 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832-oauth-serving-cert\") pod \"console-f9d7485db-tbqkj\" (UID: \"17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832\") " pod="openshift-console/console-f9d7485db-tbqkj" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.199740 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1ba66d7a-1fb5-4149-883b-b19428e7c2cb-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-5pgn9\" (UID: \"1ba66d7a-1fb5-4149-883b-b19428e7c2cb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5pgn9" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.199764 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcktb\" (UniqueName: \"kubernetes.io/projected/eb1a5702-c3b2-4f2f-997d-585725f89e4a-kube-api-access-jcktb\") pod \"authentication-operator-69f744f599-8fm22\" (UID: \"eb1a5702-c3b2-4f2f-997d-585725f89e4a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8fm22" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.199798 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwzwd\" (UniqueName: \"kubernetes.io/projected/ba35ba23-fd5c-47ca-bf00-dd4dcede4997-kube-api-access-dwzwd\") pod \"ingress-operator-5b745b69d9-mgwdw\" (UID: \"ba35ba23-fd5c-47ca-bf00-dd4dcede4997\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mgwdw" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.199819 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/3641143a-a3c8-4ff3-8f5f-783d428411ae-etcd-service-ca\") pod \"etcd-operator-b45778765-m2w78\" (UID: \"3641143a-a3c8-4ff3-8f5f-783d428411ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-m2w78" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.199847 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24c32dd0-469b-4cd4-9468-92604fbec4a1-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-7njc6\" (UID: \"24c32dd0-469b-4cd4-9468-92604fbec4a1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7njc6" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.199878 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqmvh\" (UniqueName: \"kubernetes.io/projected/a420a769-7492-4f06-ad4a-f4126e155429-kube-api-access-zqmvh\") pod \"cluster-image-registry-operator-dc59b4c8b-4799x\" (UID: \"a420a769-7492-4f06-ad4a-f4126e155429\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4799x" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.199921 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832-console-serving-cert\") pod \"console-f9d7485db-tbqkj\" (UID: \"17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832\") " pod="openshift-console/console-f9d7485db-tbqkj" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.199956 4733 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca0da215-5c31-4c91-939c-77e95ab4a568-service-ca-bundle\") pod \"router-default-5444994796-t668l\" (UID: \"ca0da215-5c31-4c91-939c-77e95ab4a568\") " pod="openshift-ingress/router-default-5444994796-t668l" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.199992 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlhlb\" (UniqueName: \"kubernetes.io/projected/6b7ac5ac-4296-4eb6-8eeb-f5978c268f2d-kube-api-access-tlhlb\") pod \"downloads-7954f5f757-5rtwt\" (UID: \"6b7ac5ac-4296-4eb6-8eeb-f5978c268f2d\") " pod="openshift-console/downloads-7954f5f757-5rtwt" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.200014 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-l8cj4\" (UID: \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l8cj4" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.200038 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-l8cj4\" (UID: \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l8cj4" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.200069 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/92d3e212-f442-49de-af58-7a4efc70a68f-trusted-ca\") pod 
\"console-operator-58897d9998-7tsr2\" (UID: \"92d3e212-f442-49de-af58-7a4efc70a68f\") " pod="openshift-console-operator/console-operator-58897d9998-7tsr2" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.200090 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4895\" (UniqueName: \"kubernetes.io/projected/6e073151-939a-4209-8cd7-39116b0165f0-kube-api-access-l4895\") pod \"image-registry-697d97f7c8-7rlhm\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.200114 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb1a5702-c3b2-4f2f-997d-585725f89e4a-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-8fm22\" (UID: \"eb1a5702-c3b2-4f2f-997d-585725f89e4a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8fm22" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.200134 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ldgs\" (UniqueName: \"kubernetes.io/projected/1a84775b-e1f9-4699-af95-16a181527cf2-kube-api-access-8ldgs\") pod \"migrator-59844c95c7-jfdvl\" (UID: \"1a84775b-e1f9-4699-af95-16a181527cf2\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jfdvl" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.200152 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/612ba8e8-eee6-4b30-967b-d838fb05147e-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-drkhf\" (UID: \"612ba8e8-eee6-4b30-967b-d838fb05147e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-drkhf" Dec 06 
05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.200170 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1ba66d7a-1fb5-4149-883b-b19428e7c2cb-audit-policies\") pod \"apiserver-7bbb656c7d-5pgn9\" (UID: \"1ba66d7a-1fb5-4149-883b-b19428e7c2cb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5pgn9" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.200191 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-567vh\" (UniqueName: \"kubernetes.io/projected/1fbc70fe-f193-4a2e-9a9f-2981b6c72a56-kube-api-access-567vh\") pod \"machine-config-operator-74547568cd-x4xrx\" (UID: \"1fbc70fe-f193-4a2e-9a9f-2981b6c72a56\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-x4xrx" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.200207 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb0fb709-5a66-42a8-aad4-c405502ce542-serving-cert\") pod \"controller-manager-879f6c89f-jpw8l\" (UID: \"bb0fb709-5a66-42a8-aad4-c405502ce542\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jpw8l" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.200223 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4416c700-40b6-4e24-b003-6e503a8c8533-serving-cert\") pod \"apiserver-76f77b778f-hxt8v\" (UID: \"4416c700-40b6-4e24-b003-6e503a8c8533\") " pod="openshift-apiserver/apiserver-76f77b778f-hxt8v" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.200244 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxcfc\" (UniqueName: 
\"kubernetes.io/projected/a35279b0-48f4-49e5-af04-c471474695f1-kube-api-access-lxcfc\") pod \"openshift-config-operator-7777fb866f-dk7bw\" (UID: \"a35279b0-48f4-49e5-af04-c471474695f1\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dk7bw" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.200263 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ba35ba23-fd5c-47ca-bf00-dd4dcede4997-metrics-tls\") pod \"ingress-operator-5b745b69d9-mgwdw\" (UID: \"ba35ba23-fd5c-47ca-bf00-dd4dcede4997\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mgwdw" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.200284 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-l8cj4\" (UID: \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l8cj4" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.200319 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6e073151-939a-4209-8cd7-39116b0165f0-trusted-ca\") pod \"image-registry-697d97f7c8-7rlhm\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.200337 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1690f8e-c151-4c06-b52e-b51e769af54f-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-4cpb6\" (UID: \"d1690f8e-c151-4c06-b52e-b51e769af54f\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4cpb6" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.200368 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fg9bg\" (UniqueName: \"kubernetes.io/projected/497bc6aa-1f54-4fd3-b2af-eb564609b96e-kube-api-access-fg9bg\") pod \"service-ca-operator-777779d784-kb6cg\" (UID: \"497bc6aa-1f54-4fd3-b2af-eb564609b96e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-kb6cg" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.200387 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ba66d7a-1fb5-4149-883b-b19428e7c2cb-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-5pgn9\" (UID: \"1ba66d7a-1fb5-4149-883b-b19428e7c2cb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5pgn9" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.200412 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/71229745-aa94-4aa5-90c8-95d65fcca563-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-v42cq\" (UID: \"71229745-aa94-4aa5-90c8-95d65fcca563\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v42cq" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.200432 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/883d952f-b02d-4b4c-b686-f77b921c77ae-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-8mlc5\" (UID: \"883d952f-b02d-4b4c-b686-f77b921c77ae\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8mlc5" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.200592 4733 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24c32dd0-469b-4cd4-9468-92604fbec4a1-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-7njc6\" (UID: \"24c32dd0-469b-4cd4-9468-92604fbec4a1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7njc6" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.200640 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/d330f5cc-abab-4367-902f-97e41685007f-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-6l9dt\" (UID: \"d330f5cc-abab-4367-902f-97e41685007f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6l9dt" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.200667 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1ba66d7a-1fb5-4149-883b-b19428e7c2cb-encryption-config\") pod \"apiserver-7bbb656c7d-5pgn9\" (UID: \"1ba66d7a-1fb5-4149-883b-b19428e7c2cb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5pgn9" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.200708 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832-console-oauth-config\") pod \"console-f9d7485db-tbqkj\" (UID: \"17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832\") " pod="openshift-console/console-f9d7485db-tbqkj" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.200743 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/4416c700-40b6-4e24-b003-6e503a8c8533-audit\") pod 
\"apiserver-76f77b778f-hxt8v\" (UID: \"4416c700-40b6-4e24-b003-6e503a8c8533\") " pod="openshift-apiserver/apiserver-76f77b778f-hxt8v" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.200774 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1fbc70fe-f193-4a2e-9a9f-2981b6c72a56-images\") pod \"machine-config-operator-74547568cd-x4xrx\" (UID: \"1fbc70fe-f193-4a2e-9a9f-2981b6c72a56\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-x4xrx" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.200797 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/497bc6aa-1f54-4fd3-b2af-eb564609b96e-serving-cert\") pod \"service-ca-operator-777779d784-kb6cg\" (UID: \"497bc6aa-1f54-4fd3-b2af-eb564609b96e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-kb6cg" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.200820 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-l8cj4\" (UID: \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l8cj4" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.200847 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb1a5702-c3b2-4f2f-997d-585725f89e4a-serving-cert\") pod \"authentication-operator-69f744f599-8fm22\" (UID: \"eb1a5702-c3b2-4f2f-997d-585725f89e4a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8fm22" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 
05:45:52.200865 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb1a5702-c3b2-4f2f-997d-585725f89e4a-service-ca-bundle\") pod \"authentication-operator-69f744f599-8fm22\" (UID: \"eb1a5702-c3b2-4f2f-997d-585725f89e4a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8fm22" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.200882 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/612ba8e8-eee6-4b30-967b-d838fb05147e-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-drkhf\" (UID: \"612ba8e8-eee6-4b30-967b-d838fb05147e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-drkhf" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.200900 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3641143a-a3c8-4ff3-8f5f-783d428411ae-etcd-client\") pod \"etcd-operator-b45778765-m2w78\" (UID: \"3641143a-a3c8-4ff3-8f5f-783d428411ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-m2w78" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.200925 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a420a769-7492-4f06-ad4a-f4126e155429-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-4799x\" (UID: \"a420a769-7492-4f06-ad4a-f4126e155429\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4799x" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.200947 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8h884\" (UniqueName: 
\"kubernetes.io/projected/d330f5cc-abab-4367-902f-97e41685007f-kube-api-access-8h884\") pod \"machine-api-operator-5694c8668f-6l9dt\" (UID: \"d330f5cc-abab-4367-902f-97e41685007f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6l9dt" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.200968 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ba35ba23-fd5c-47ca-bf00-dd4dcede4997-bound-sa-token\") pod \"ingress-operator-5b745b69d9-mgwdw\" (UID: \"ba35ba23-fd5c-47ca-bf00-dd4dcede4997\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mgwdw" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.200986 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkvp8\" (UniqueName: \"kubernetes.io/projected/bb0fb709-5a66-42a8-aad4-c405502ce542-kube-api-access-qkvp8\") pod \"controller-manager-879f6c89f-jpw8l\" (UID: \"bb0fb709-5a66-42a8-aad4-c405502ce542\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jpw8l" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.201018 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d1690f8e-c151-4c06-b52e-b51e769af54f-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-4cpb6\" (UID: \"d1690f8e-c151-4c06-b52e-b51e769af54f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4cpb6" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.201041 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/883d952f-b02d-4b4c-b686-f77b921c77ae-config\") pod \"openshift-apiserver-operator-796bbdcf4f-8mlc5\" (UID: \"883d952f-b02d-4b4c-b686-f77b921c77ae\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8mlc5" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.201068 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4416c700-40b6-4e24-b003-6e503a8c8533-audit-dir\") pod \"apiserver-76f77b778f-hxt8v\" (UID: \"4416c700-40b6-4e24-b003-6e503a8c8533\") " pod="openshift-apiserver/apiserver-76f77b778f-hxt8v" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.201099 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb1a5702-c3b2-4f2f-997d-585725f89e4a-config\") pod \"authentication-operator-69f744f599-8fm22\" (UID: \"eb1a5702-c3b2-4f2f-997d-585725f89e4a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8fm22" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.201116 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6e073151-939a-4209-8cd7-39116b0165f0-installation-pull-secrets\") pod \"image-registry-697d97f7c8-7rlhm\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.201134 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wfxc\" (UniqueName: \"kubernetes.io/projected/68e5cd26-b9f4-48c0-a6e1-53d27816aa67-kube-api-access-9wfxc\") pod \"machine-config-controller-84d6567774-m8ljh\" (UID: \"68e5cd26-b9f4-48c0-a6e1-53d27816aa67\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-m8ljh" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.201189 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1ba66d7a-1fb5-4149-883b-b19428e7c2cb-audit-dir\") pod \"apiserver-7bbb656c7d-5pgn9\" (UID: \"1ba66d7a-1fb5-4149-883b-b19428e7c2cb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5pgn9" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.201220 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/a35279b0-48f4-49e5-af04-c471474695f1-available-featuregates\") pod \"openshift-config-operator-7777fb866f-dk7bw\" (UID: \"a35279b0-48f4-49e5-af04-c471474695f1\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dk7bw" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.201240 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdmrc\" (UniqueName: \"kubernetes.io/projected/ca0da215-5c31-4c91-939c-77e95ab4a568-kube-api-access-cdmrc\") pod \"router-default-5444994796-t668l\" (UID: \"ca0da215-5c31-4c91-939c-77e95ab4a568\") " pod="openshift-ingress/router-default-5444994796-t668l" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.201261 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb0fb709-5a66-42a8-aad4-c405502ce542-config\") pod \"controller-manager-879f6c89f-jpw8l\" (UID: \"bb0fb709-5a66-42a8-aad4-c405502ce542\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jpw8l" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.201282 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ba66d7a-1fb5-4149-883b-b19428e7c2cb-serving-cert\") pod \"apiserver-7bbb656c7d-5pgn9\" (UID: \"1ba66d7a-1fb5-4149-883b-b19428e7c2cb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5pgn9" 
Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.201322 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1646b426-e8f8-48bd-83b1-919eb5c8466f-machine-approver-tls\") pod \"machine-approver-56656f9798-97hmm\" (UID: \"1646b426-e8f8-48bd-83b1-919eb5c8466f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-97hmm" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.201346 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d330f5cc-abab-4367-902f-97e41685007f-images\") pod \"machine-api-operator-5694c8668f-6l9dt\" (UID: \"d330f5cc-abab-4367-902f-97e41685007f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6l9dt" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.201366 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832-trusted-ca-bundle\") pod \"console-f9d7485db-tbqkj\" (UID: \"17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832\") " pod="openshift-console/console-f9d7485db-tbqkj" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.201385 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1fbc70fe-f193-4a2e-9a9f-2981b6c72a56-proxy-tls\") pod \"machine-config-operator-74547568cd-x4xrx\" (UID: \"1fbc70fe-f193-4a2e-9a9f-2981b6c72a56\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-x4xrx" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.201430 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832-service-ca\") 
pod \"console-f9d7485db-tbqkj\" (UID: \"17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832\") " pod="openshift-console/console-f9d7485db-tbqkj" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.201458 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-l8cj4\" (UID: \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l8cj4" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.201492 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-l8cj4\" (UID: \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l8cj4" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.201515 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/4416c700-40b6-4e24-b003-6e503a8c8533-encryption-config\") pod \"apiserver-76f77b778f-hxt8v\" (UID: \"4416c700-40b6-4e24-b003-6e503a8c8533\") " pod="openshift-apiserver/apiserver-76f77b778f-hxt8v" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.201544 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-l8cj4\" (UID: \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l8cj4" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.201564 4733 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chm9m\" (UniqueName: \"kubernetes.io/projected/71229745-aa94-4aa5-90c8-95d65fcca563-kube-api-access-chm9m\") pod \"control-plane-machine-set-operator-78cbb6b69f-v42cq\" (UID: \"71229745-aa94-4aa5-90c8-95d65fcca563\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v42cq" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.201583 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ba35ba23-fd5c-47ca-bf00-dd4dcede4997-trusted-ca\") pod \"ingress-operator-5b745b69d9-mgwdw\" (UID: \"ba35ba23-fd5c-47ca-bf00-dd4dcede4997\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mgwdw" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.201616 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/3641143a-a3c8-4ff3-8f5f-783d428411ae-etcd-ca\") pod \"etcd-operator-b45778765-m2w78\" (UID: \"3641143a-a3c8-4ff3-8f5f-783d428411ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-m2w78" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.201658 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1646b426-e8f8-48bd-83b1-919eb5c8466f-auth-proxy-config\") pod \"machine-approver-56656f9798-97hmm\" (UID: \"1646b426-e8f8-48bd-83b1-919eb5c8466f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-97hmm" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.201679 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4km5q\" (UniqueName: \"kubernetes.io/projected/197ee618-405f-4f94-a618-da74488f0d23-kube-api-access-4km5q\") pod 
\"route-controller-manager-6576b87f9c-fx652\" (UID: \"197ee618-405f-4f94-a618-da74488f0d23\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fx652" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.201700 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b75a142e-dce5-4bf9-83da-25f46752b08f-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-b8mkn\" (UID: \"b75a142e-dce5-4bf9-83da-25f46752b08f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-b8mkn" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.201716 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6e073151-939a-4209-8cd7-39116b0165f0-registry-certificates\") pod \"image-registry-697d97f7c8-7rlhm\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.201736 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-audit-policies\") pod \"oauth-openshift-558db77b4-l8cj4\" (UID: \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l8cj4" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.201761 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a35279b0-48f4-49e5-af04-c471474695f1-serving-cert\") pod \"openshift-config-operator-7777fb866f-dk7bw\" (UID: \"a35279b0-48f4-49e5-af04-c471474695f1\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dk7bw" Dec 06 05:45:52 crc 
kubenswrapper[4733]: I1206 05:45:52.201787 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmzvh\" (UniqueName: \"kubernetes.io/projected/b75a142e-dce5-4bf9-83da-25f46752b08f-kube-api-access-pmzvh\") pod \"cluster-samples-operator-665b6dd947-b8mkn\" (UID: \"b75a142e-dce5-4bf9-83da-25f46752b08f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-b8mkn" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.201808 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6e073151-939a-4209-8cd7-39116b0165f0-ca-trust-extracted\") pod \"image-registry-697d97f7c8-7rlhm\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.201876 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/4416c700-40b6-4e24-b003-6e503a8c8533-image-import-ca\") pod \"apiserver-76f77b778f-hxt8v\" (UID: \"4416c700-40b6-4e24-b003-6e503a8c8533\") " pod="openshift-apiserver/apiserver-76f77b778f-hxt8v" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.201927 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-l8cj4\" (UID: \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l8cj4" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.201975 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/497bc6aa-1f54-4fd3-b2af-eb564609b96e-config\") pod \"service-ca-operator-777779d784-kb6cg\" (UID: \"497bc6aa-1f54-4fd3-b2af-eb564609b96e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-kb6cg" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.202008 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8v8l2\" (UniqueName: \"kubernetes.io/projected/1646b426-e8f8-48bd-83b1-919eb5c8466f-kube-api-access-8v8l2\") pod \"machine-approver-56656f9798-97hmm\" (UID: \"1646b426-e8f8-48bd-83b1-919eb5c8466f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-97hmm" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.202027 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3641143a-a3c8-4ff3-8f5f-783d428411ae-config\") pod \"etcd-operator-b45778765-m2w78\" (UID: \"3641143a-a3c8-4ff3-8f5f-783d428411ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-m2w78" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.202047 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832-console-config\") pod \"console-f9d7485db-tbqkj\" (UID: \"17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832\") " pod="openshift-console/console-f9d7485db-tbqkj" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.202079 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lw2wq\" (UniqueName: \"kubernetes.io/projected/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-kube-api-access-lw2wq\") pod \"oauth-openshift-558db77b4-l8cj4\" (UID: \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l8cj4" Dec 06 05:45:52 crc kubenswrapper[4733]: 
I1206 05:45:52.202100 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jk6h7\" (UniqueName: \"kubernetes.io/projected/92d3e212-f442-49de-af58-7a4efc70a68f-kube-api-access-jk6h7\") pod \"console-operator-58897d9998-7tsr2\" (UID: \"92d3e212-f442-49de-af58-7a4efc70a68f\") " pod="openshift-console-operator/console-operator-58897d9998-7tsr2" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.202121 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6e073151-939a-4209-8cd7-39116b0165f0-registry-tls\") pod \"image-registry-697d97f7c8-7rlhm\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.202140 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1690f8e-c151-4c06-b52e-b51e769af54f-config\") pod \"kube-apiserver-operator-766d6c64bb-4cpb6\" (UID: \"d1690f8e-c151-4c06-b52e-b51e769af54f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4cpb6" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.202158 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4416c700-40b6-4e24-b003-6e503a8c8533-etcd-client\") pod \"apiserver-76f77b778f-hxt8v\" (UID: \"4416c700-40b6-4e24-b003-6e503a8c8533\") " pod="openshift-apiserver/apiserver-76f77b778f-hxt8v" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.202182 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9bf9\" (UniqueName: \"kubernetes.io/projected/1ba66d7a-1fb5-4149-883b-b19428e7c2cb-kube-api-access-k9bf9\") pod 
\"apiserver-7bbb656c7d-5pgn9\" (UID: \"1ba66d7a-1fb5-4149-883b-b19428e7c2cb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5pgn9" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.202203 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgv8w\" (UniqueName: \"kubernetes.io/projected/3641143a-a3c8-4ff3-8f5f-783d428411ae-kube-api-access-jgv8w\") pod \"etcd-operator-b45778765-m2w78\" (UID: \"3641143a-a3c8-4ff3-8f5f-783d428411ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-m2w78" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.202224 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ca0da215-5c31-4c91-939c-77e95ab4a568-stats-auth\") pod \"router-default-5444994796-t668l\" (UID: \"ca0da215-5c31-4c91-939c-77e95ab4a568\") " pod="openshift-ingress/router-default-5444994796-t668l" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.202252 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfcgk\" (UniqueName: \"kubernetes.io/projected/24c32dd0-469b-4cd4-9468-92604fbec4a1-kube-api-access-vfcgk\") pod \"kube-storage-version-migrator-operator-b67b599dd-7njc6\" (UID: \"24c32dd0-469b-4cd4-9468-92604fbec4a1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7njc6" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.202275 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/68e5cd26-b9f4-48c0-a6e1-53d27816aa67-proxy-tls\") pod \"machine-config-controller-84d6567774-m8ljh\" (UID: \"68e5cd26-b9f4-48c0-a6e1-53d27816aa67\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-m8ljh" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 
05:45:52.202298 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-audit-dir\") pod \"oauth-openshift-558db77b4-l8cj4\" (UID: \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l8cj4" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.202358 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6e073151-939a-4209-8cd7-39116b0165f0-bound-sa-token\") pod \"image-registry-697d97f7c8-7rlhm\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.202387 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4416c700-40b6-4e24-b003-6e503a8c8533-etcd-serving-ca\") pod \"apiserver-76f77b778f-hxt8v\" (UID: \"4416c700-40b6-4e24-b003-6e503a8c8533\") " pod="openshift-apiserver/apiserver-76f77b778f-hxt8v" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.202409 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a420a769-7492-4f06-ad4a-f4126e155429-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-4799x\" (UID: \"a420a769-7492-4f06-ad4a-f4126e155429\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4799x" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.202429 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bb0fb709-5a66-42a8-aad4-c405502ce542-client-ca\") pod \"controller-manager-879f6c89f-jpw8l\" (UID: 
\"bb0fb709-5a66-42a8-aad4-c405502ce542\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jpw8l" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.202462 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92d3e212-f442-49de-af58-7a4efc70a68f-config\") pod \"console-operator-58897d9998-7tsr2\" (UID: \"92d3e212-f442-49de-af58-7a4efc70a68f\") " pod="openshift-console-operator/console-operator-58897d9998-7tsr2" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.202479 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/92d3e212-f442-49de-af58-7a4efc70a68f-serving-cert\") pod \"console-operator-58897d9998-7tsr2\" (UID: \"92d3e212-f442-49de-af58-7a4efc70a68f\") " pod="openshift-console-operator/console-operator-58897d9998-7tsr2" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.202527 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8snn\" (UniqueName: \"kubernetes.io/projected/34ef48cd-203c-41e4-99ce-64d24203d4c0-kube-api-access-s8snn\") pod \"dns-operator-744455d44c-ntght\" (UID: \"34ef48cd-203c-41e4-99ce-64d24203d4c0\") " pod="openshift-dns-operator/dns-operator-744455d44c-ntght" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.202588 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-l8cj4\" (UID: \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l8cj4" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.202618 4733 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4416c700-40b6-4e24-b003-6e503a8c8533-node-pullsecrets\") pod \"apiserver-76f77b778f-hxt8v\" (UID: \"4416c700-40b6-4e24-b003-6e503a8c8533\") " pod="openshift-apiserver/apiserver-76f77b778f-hxt8v" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.202645 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-l8cj4\" (UID: \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l8cj4" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.202680 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/68e5cd26-b9f4-48c0-a6e1-53d27816aa67-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-m8ljh\" (UID: \"68e5cd26-b9f4-48c0-a6e1-53d27816aa67\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-m8ljh" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.202704 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4xxf\" (UniqueName: \"kubernetes.io/projected/883d952f-b02d-4b4c-b686-f77b921c77ae-kube-api-access-m4xxf\") pod \"openshift-apiserver-operator-796bbdcf4f-8mlc5\" (UID: \"883d952f-b02d-4b4c-b686-f77b921c77ae\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8mlc5" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.202724 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4416c700-40b6-4e24-b003-6e503a8c8533-config\") pod 
\"apiserver-76f77b778f-hxt8v\" (UID: \"4416c700-40b6-4e24-b003-6e503a8c8533\") " pod="openshift-apiserver/apiserver-76f77b778f-hxt8v" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.202750 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d330f5cc-abab-4367-902f-97e41685007f-config\") pod \"machine-api-operator-5694c8668f-6l9dt\" (UID: \"d330f5cc-abab-4367-902f-97e41685007f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6l9dt" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.202769 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3641143a-a3c8-4ff3-8f5f-783d428411ae-serving-cert\") pod \"etcd-operator-b45778765-m2w78\" (UID: \"3641143a-a3c8-4ff3-8f5f-783d428411ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-m2w78" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.202789 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-l8cj4\" (UID: \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l8cj4" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.202807 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1ba66d7a-1fb5-4149-883b-b19428e7c2cb-etcd-client\") pod \"apiserver-7bbb656c7d-5pgn9\" (UID: \"1ba66d7a-1fb5-4149-883b-b19428e7c2cb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5pgn9" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.202826 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"default-certificate\" (UniqueName: \"kubernetes.io/secret/ca0da215-5c31-4c91-939c-77e95ab4a568-default-certificate\") pod \"router-default-5444994796-t668l\" (UID: \"ca0da215-5c31-4c91-939c-77e95ab4a568\") " pod="openshift-ingress/router-default-5444994796-t668l" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.202858 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7rlhm\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.202876 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/34ef48cd-203c-41e4-99ce-64d24203d4c0-metrics-tls\") pod \"dns-operator-744455d44c-ntght\" (UID: \"34ef48cd-203c-41e4-99ce-64d24203d4c0\") " pod="openshift-dns-operator/dns-operator-744455d44c-ntght" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.202899 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ca0da215-5c31-4c91-939c-77e95ab4a568-metrics-certs\") pod \"router-default-5444994796-t668l\" (UID: \"ca0da215-5c31-4c91-939c-77e95ab4a568\") " pod="openshift-ingress/router-default-5444994796-t668l" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.202920 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1fbc70fe-f193-4a2e-9a9f-2981b6c72a56-auth-proxy-config\") pod \"machine-config-operator-74547568cd-x4xrx\" (UID: \"1fbc70fe-f193-4a2e-9a9f-2981b6c72a56\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-x4xrx" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.202943 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/197ee618-405f-4f94-a618-da74488f0d23-client-ca\") pod \"route-controller-manager-6576b87f9c-fx652\" (UID: \"197ee618-405f-4f94-a618-da74488f0d23\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fx652" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.202963 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72rns\" (UniqueName: \"kubernetes.io/projected/4416c700-40b6-4e24-b003-6e503a8c8533-kube-api-access-72rns\") pod \"apiserver-76f77b778f-hxt8v\" (UID: \"4416c700-40b6-4e24-b003-6e503a8c8533\") " pod="openshift-apiserver/apiserver-76f77b778f-hxt8v" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.202997 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bb0fb709-5a66-42a8-aad4-c405502ce542-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-jpw8l\" (UID: \"bb0fb709-5a66-42a8-aad4-c405502ce542\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jpw8l" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.203022 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/197ee618-405f-4f94-a618-da74488f0d23-config\") pod \"route-controller-manager-6576b87f9c-fx652\" (UID: \"197ee618-405f-4f94-a618-da74488f0d23\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fx652" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.203042 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-sd2nf\" (UniqueName: \"kubernetes.io/projected/17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832-kube-api-access-sd2nf\") pod \"console-f9d7485db-tbqkj\" (UID: \"17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832\") " pod="openshift-console/console-f9d7485db-tbqkj" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.203070 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4416c700-40b6-4e24-b003-6e503a8c8533-trusted-ca-bundle\") pod \"apiserver-76f77b778f-hxt8v\" (UID: \"4416c700-40b6-4e24-b003-6e503a8c8533\") " pod="openshift-apiserver/apiserver-76f77b778f-hxt8v" Dec 06 05:45:52 crc kubenswrapper[4733]: E1206 05:45:52.203266 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 05:45:52.703249523 +0000 UTC m=+136.568460634 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7rlhm" (UID: "6e073151-939a-4209-8cd7-39116b0165f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.303866 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 05:45:52 crc kubenswrapper[4733]: E1206 05:45:52.304031 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 05:45:52.804013576 +0000 UTC m=+136.669224676 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.304099 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-l8cj4\" (UID: \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l8cj4" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.304129 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4416c700-40b6-4e24-b003-6e503a8c8533-node-pullsecrets\") pod \"apiserver-76f77b778f-hxt8v\" (UID: \"4416c700-40b6-4e24-b003-6e503a8c8533\") " pod="openshift-apiserver/apiserver-76f77b778f-hxt8v" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.304153 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-l8cj4\" (UID: \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l8cj4" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.304173 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/68e5cd26-b9f4-48c0-a6e1-53d27816aa67-mcc-auth-proxy-config\") 
pod \"machine-config-controller-84d6567774-m8ljh\" (UID: \"68e5cd26-b9f4-48c0-a6e1-53d27816aa67\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-m8ljh" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.304198 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4416c700-40b6-4e24-b003-6e503a8c8533-config\") pod \"apiserver-76f77b778f-hxt8v\" (UID: \"4416c700-40b6-4e24-b003-6e503a8c8533\") " pod="openshift-apiserver/apiserver-76f77b778f-hxt8v" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.304216 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4416c700-40b6-4e24-b003-6e503a8c8533-node-pullsecrets\") pod \"apiserver-76f77b778f-hxt8v\" (UID: \"4416c700-40b6-4e24-b003-6e503a8c8533\") " pod="openshift-apiserver/apiserver-76f77b778f-hxt8v" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.304234 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4e5e6ad5-8339-4e8c-b371-5d07e7aadb38-apiservice-cert\") pod \"packageserver-d55dfcdfc-6kr5j\" (UID: \"4e5e6ad5-8339-4e8c-b371-5d07e7aadb38\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kr5j" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.304284 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b914e7ba-8f78-466b-81c1-7e4bca4c4f56-metrics-tls\") pod \"dns-default-rlgkt\" (UID: \"b914e7ba-8f78-466b-81c1-7e4bca4c4f56\") " pod="openshift-dns/dns-default-rlgkt" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.304387 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d330f5cc-abab-4367-902f-97e41685007f-config\") pod \"machine-api-operator-5694c8668f-6l9dt\" (UID: \"d330f5cc-abab-4367-902f-97e41685007f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6l9dt" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.304425 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4xxf\" (UniqueName: \"kubernetes.io/projected/883d952f-b02d-4b4c-b686-f77b921c77ae-kube-api-access-m4xxf\") pod \"openshift-apiserver-operator-796bbdcf4f-8mlc5\" (UID: \"883d952f-b02d-4b4c-b686-f77b921c77ae\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8mlc5" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.304447 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/4e5e6ad5-8339-4e8c-b371-5d07e7aadb38-tmpfs\") pod \"packageserver-d55dfcdfc-6kr5j\" (UID: \"4e5e6ad5-8339-4e8c-b371-5d07e7aadb38\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kr5j" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.304473 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-l8cj4\" (UID: \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l8cj4" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.304493 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1ba66d7a-1fb5-4149-883b-b19428e7c2cb-etcd-client\") pod \"apiserver-7bbb656c7d-5pgn9\" (UID: \"1ba66d7a-1fb5-4149-883b-b19428e7c2cb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5pgn9" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.304510 
4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3641143a-a3c8-4ff3-8f5f-783d428411ae-serving-cert\") pod \"etcd-operator-b45778765-m2w78\" (UID: \"3641143a-a3c8-4ff3-8f5f-783d428411ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-m2w78" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.304540 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7rlhm\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.304559 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/34ef48cd-203c-41e4-99ce-64d24203d4c0-metrics-tls\") pod \"dns-operator-744455d44c-ntght\" (UID: \"34ef48cd-203c-41e4-99ce-64d24203d4c0\") " pod="openshift-dns-operator/dns-operator-744455d44c-ntght" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.304580 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/ca0da215-5c31-4c91-939c-77e95ab4a568-default-certificate\") pod \"router-default-5444994796-t668l\" (UID: \"ca0da215-5c31-4c91-939c-77e95ab4a568\") " pod="openshift-ingress/router-default-5444994796-t668l" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.304603 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1fbc70fe-f193-4a2e-9a9f-2981b6c72a56-auth-proxy-config\") pod \"machine-config-operator-74547568cd-x4xrx\" (UID: \"1fbc70fe-f193-4a2e-9a9f-2981b6c72a56\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-x4xrx" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.304624 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/197ee618-405f-4f94-a618-da74488f0d23-client-ca\") pod \"route-controller-manager-6576b87f9c-fx652\" (UID: \"197ee618-405f-4f94-a618-da74488f0d23\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fx652" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.304644 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ca0da215-5c31-4c91-939c-77e95ab4a568-metrics-certs\") pod \"router-default-5444994796-t668l\" (UID: \"ca0da215-5c31-4c91-939c-77e95ab4a568\") " pod="openshift-ingress/router-default-5444994796-t668l" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.304671 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkqlv\" (UniqueName: \"kubernetes.io/projected/36ff56af-0a08-47c1-acc9-699aad3cd439-kube-api-access-rkqlv\") pod \"multus-admission-controller-857f4d67dd-t8xrv\" (UID: \"36ff56af-0a08-47c1-acc9-699aad3cd439\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-t8xrv" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.304690 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qnbt\" (UniqueName: \"kubernetes.io/projected/4e5e6ad5-8339-4e8c-b371-5d07e7aadb38-kube-api-access-4qnbt\") pod \"packageserver-d55dfcdfc-6kr5j\" (UID: \"4e5e6ad5-8339-4e8c-b371-5d07e7aadb38\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kr5j" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.304708 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5ae9085f-b1b1-4550-9edc-80ea6bd0ef9d-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xqz8g\" (UID: \"5ae9085f-b1b1-4550-9edc-80ea6bd0ef9d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xqz8g" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.304740 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/386f85c9-e984-405c-af7a-225fb5bcfcaf-plugins-dir\") pod \"csi-hostpathplugin-bbq6p\" (UID: \"386f85c9-e984-405c-af7a-225fb5bcfcaf\") " pod="hostpath-provisioner/csi-hostpathplugin-bbq6p" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.304762 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bb0fb709-5a66-42a8-aad4-c405502ce542-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-jpw8l\" (UID: \"bb0fb709-5a66-42a8-aad4-c405502ce542\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jpw8l" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.304782 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72rns\" (UniqueName: \"kubernetes.io/projected/4416c700-40b6-4e24-b003-6e503a8c8533-kube-api-access-72rns\") pod \"apiserver-76f77b778f-hxt8v\" (UID: \"4416c700-40b6-4e24-b003-6e503a8c8533\") " pod="openshift-apiserver/apiserver-76f77b778f-hxt8v" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.304801 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgqwb\" (UniqueName: \"kubernetes.io/projected/aca17857-561a-4f8c-b778-dac5aec3f04f-kube-api-access-rgqwb\") pod \"olm-operator-6b444d44fb-mtlfg\" (UID: \"aca17857-561a-4f8c-b778-dac5aec3f04f\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mtlfg" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.304820 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sd2nf\" (UniqueName: \"kubernetes.io/projected/17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832-kube-api-access-sd2nf\") pod \"console-f9d7485db-tbqkj\" (UID: \"17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832\") " pod="openshift-console/console-f9d7485db-tbqkj" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.304843 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/197ee618-405f-4f94-a618-da74488f0d23-config\") pod \"route-controller-manager-6576b87f9c-fx652\" (UID: \"197ee618-405f-4f94-a618-da74488f0d23\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fx652" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.304870 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4416c700-40b6-4e24-b003-6e503a8c8533-trusted-ca-bundle\") pod \"apiserver-76f77b778f-hxt8v\" (UID: \"4416c700-40b6-4e24-b003-6e503a8c8533\") " pod="openshift-apiserver/apiserver-76f77b778f-hxt8v" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.304887 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/612ba8e8-eee6-4b30-967b-d838fb05147e-config\") pod \"kube-controller-manager-operator-78b949d7b-drkhf\" (UID: \"612ba8e8-eee6-4b30-967b-d838fb05147e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-drkhf" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.304903 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1646b426-e8f8-48bd-83b1-919eb5c8466f-config\") pod 
\"machine-approver-56656f9798-97hmm\" (UID: \"1646b426-e8f8-48bd-83b1-919eb5c8466f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-97hmm" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.304922 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a420a769-7492-4f06-ad4a-f4126e155429-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-4799x\" (UID: \"a420a769-7492-4f06-ad4a-f4126e155429\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4799x" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.304940 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/197ee618-405f-4f94-a618-da74488f0d23-serving-cert\") pod \"route-controller-manager-6576b87f9c-fx652\" (UID: \"197ee618-405f-4f94-a618-da74488f0d23\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fx652" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.304942 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-l8cj4\" (UID: \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l8cj4" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.304959 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcktb\" (UniqueName: \"kubernetes.io/projected/eb1a5702-c3b2-4f2f-997d-585725f89e4a-kube-api-access-jcktb\") pod \"authentication-operator-69f744f599-8fm22\" (UID: \"eb1a5702-c3b2-4f2f-997d-585725f89e4a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8fm22" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.304981 4733 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832-oauth-serving-cert\") pod \"console-f9d7485db-tbqkj\" (UID: \"17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832\") " pod="openshift-console/console-f9d7485db-tbqkj" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.305001 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1ba66d7a-1fb5-4149-883b-b19428e7c2cb-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-5pgn9\" (UID: \"1ba66d7a-1fb5-4149-883b-b19428e7c2cb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5pgn9" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.305022 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwzwd\" (UniqueName: \"kubernetes.io/projected/ba35ba23-fd5c-47ca-bf00-dd4dcede4997-kube-api-access-dwzwd\") pod \"ingress-operator-5b745b69d9-mgwdw\" (UID: \"ba35ba23-fd5c-47ca-bf00-dd4dcede4997\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mgwdw" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.305040 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/3641143a-a3c8-4ff3-8f5f-783d428411ae-etcd-service-ca\") pod \"etcd-operator-b45778765-m2w78\" (UID: \"3641143a-a3c8-4ff3-8f5f-783d428411ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-m2w78" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.305067 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqmvh\" (UniqueName: \"kubernetes.io/projected/a420a769-7492-4f06-ad4a-f4126e155429-kube-api-access-zqmvh\") pod \"cluster-image-registry-operator-dc59b4c8b-4799x\" (UID: \"a420a769-7492-4f06-ad4a-f4126e155429\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4799x" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.305084 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832-console-serving-cert\") pod \"console-f9d7485db-tbqkj\" (UID: \"17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832\") " pod="openshift-console/console-f9d7485db-tbqkj" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.305102 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e22cc297-d62d-4a07-8131-062668e5b69a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-52jkz\" (UID: \"e22cc297-d62d-4a07-8131-062668e5b69a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-52jkz" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.305117 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/68e5cd26-b9f4-48c0-a6e1-53d27816aa67-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-m8ljh\" (UID: \"68e5cd26-b9f4-48c0-a6e1-53d27816aa67\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-m8ljh" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.305123 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24c32dd0-469b-4cd4-9468-92604fbec4a1-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-7njc6\" (UID: \"24c32dd0-469b-4cd4-9468-92604fbec4a1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7njc6" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.305140 4733 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4416c700-40b6-4e24-b003-6e503a8c8533-config\") pod \"apiserver-76f77b778f-hxt8v\" (UID: \"4416c700-40b6-4e24-b003-6e503a8c8533\") " pod="openshift-apiserver/apiserver-76f77b778f-hxt8v" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.305151 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlhlb\" (UniqueName: \"kubernetes.io/projected/6b7ac5ac-4296-4eb6-8eeb-f5978c268f2d-kube-api-access-tlhlb\") pod \"downloads-7954f5f757-5rtwt\" (UID: \"6b7ac5ac-4296-4eb6-8eeb-f5978c268f2d\") " pod="openshift-console/downloads-7954f5f757-5rtwt" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.305173 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca0da215-5c31-4c91-939c-77e95ab4a568-service-ca-bundle\") pod \"router-default-5444994796-t668l\" (UID: \"ca0da215-5c31-4c91-939c-77e95ab4a568\") " pod="openshift-ingress/router-default-5444994796-t668l" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.305220 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-l8cj4\" (UID: \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l8cj4" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.305250 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/92d3e212-f442-49de-af58-7a4efc70a68f-trusted-ca\") pod \"console-operator-58897d9998-7tsr2\" (UID: \"92d3e212-f442-49de-af58-7a4efc70a68f\") " pod="openshift-console-operator/console-operator-58897d9998-7tsr2" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 
05:45:52.305281 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/aca17857-561a-4f8c-b778-dac5aec3f04f-srv-cert\") pod \"olm-operator-6b444d44fb-mtlfg\" (UID: \"aca17857-561a-4f8c-b778-dac5aec3f04f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mtlfg" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.305331 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfpq6\" (UniqueName: \"kubernetes.io/projected/76185470-be08-49f9-ab30-59314702bc08-kube-api-access-gfpq6\") pod \"marketplace-operator-79b997595-fbnvh\" (UID: \"76185470-be08-49f9-ab30-59314702bc08\") " pod="openshift-marketplace/marketplace-operator-79b997595-fbnvh" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.305360 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4895\" (UniqueName: \"kubernetes.io/projected/6e073151-939a-4209-8cd7-39116b0165f0-kube-api-access-l4895\") pod \"image-registry-697d97f7c8-7rlhm\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.306025 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1646b426-e8f8-48bd-83b1-919eb5c8466f-config\") pod \"machine-approver-56656f9798-97hmm\" (UID: \"1646b426-e8f8-48bd-83b1-919eb5c8466f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-97hmm" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.306565 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832-oauth-serving-cert\") pod \"console-f9d7485db-tbqkj\" (UID: \"17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832\") " 
pod="openshift-console/console-f9d7485db-tbqkj" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.306658 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb1a5702-c3b2-4f2f-997d-585725f89e4a-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-8fm22\" (UID: \"eb1a5702-c3b2-4f2f-997d-585725f89e4a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8fm22" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.306713 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ldgs\" (UniqueName: \"kubernetes.io/projected/1a84775b-e1f9-4699-af95-16a181527cf2-kube-api-access-8ldgs\") pod \"migrator-59844c95c7-jfdvl\" (UID: \"1a84775b-e1f9-4699-af95-16a181527cf2\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jfdvl" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.306788 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-l8cj4\" (UID: \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l8cj4" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.306815 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/612ba8e8-eee6-4b30-967b-d838fb05147e-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-drkhf\" (UID: \"612ba8e8-eee6-4b30-967b-d838fb05147e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-drkhf" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.306836 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" 
(UniqueName: \"kubernetes.io/configmap/1ba66d7a-1fb5-4149-883b-b19428e7c2cb-audit-policies\") pod \"apiserver-7bbb656c7d-5pgn9\" (UID: \"1ba66d7a-1fb5-4149-883b-b19428e7c2cb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5pgn9" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.306868 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/36ff56af-0a08-47c1-acc9-699aad3cd439-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-t8xrv\" (UID: \"36ff56af-0a08-47c1-acc9-699aad3cd439\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-t8xrv" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.306890 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4416c700-40b6-4e24-b003-6e503a8c8533-serving-cert\") pod \"apiserver-76f77b778f-hxt8v\" (UID: \"4416c700-40b6-4e24-b003-6e503a8c8533\") " pod="openshift-apiserver/apiserver-76f77b778f-hxt8v" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.306926 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-567vh\" (UniqueName: \"kubernetes.io/projected/1fbc70fe-f193-4a2e-9a9f-2981b6c72a56-kube-api-access-567vh\") pod \"machine-config-operator-74547568cd-x4xrx\" (UID: \"1fbc70fe-f193-4a2e-9a9f-2981b6c72a56\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-x4xrx" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.306944 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb0fb709-5a66-42a8-aad4-c405502ce542-serving-cert\") pod \"controller-manager-879f6c89f-jpw8l\" (UID: \"bb0fb709-5a66-42a8-aad4-c405502ce542\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jpw8l" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.306961 4733 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ba35ba23-fd5c-47ca-bf00-dd4dcede4997-metrics-tls\") pod \"ingress-operator-5b745b69d9-mgwdw\" (UID: \"ba35ba23-fd5c-47ca-bf00-dd4dcede4997\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mgwdw" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.306983 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/aca17857-561a-4f8c-b778-dac5aec3f04f-profile-collector-cert\") pod \"olm-operator-6b444d44fb-mtlfg\" (UID: \"aca17857-561a-4f8c-b778-dac5aec3f04f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mtlfg" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.306998 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca0da215-5c31-4c91-939c-77e95ab4a568-service-ca-bundle\") pod \"router-default-5444994796-t668l\" (UID: \"ca0da215-5c31-4c91-939c-77e95ab4a568\") " pod="openshift-ingress/router-default-5444994796-t668l" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.307013 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxcfc\" (UniqueName: \"kubernetes.io/projected/a35279b0-48f4-49e5-af04-c471474695f1-kube-api-access-lxcfc\") pod \"openshift-config-operator-7777fb866f-dk7bw\" (UID: \"a35279b0-48f4-49e5-af04-c471474695f1\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dk7bw" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.307247 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d330f5cc-abab-4367-902f-97e41685007f-config\") pod \"machine-api-operator-5694c8668f-6l9dt\" (UID: \"d330f5cc-abab-4367-902f-97e41685007f\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-6l9dt" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.307392 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/8af9e6ec-1eee-4d33-a4c3-efb55657ecf9-signing-cabundle\") pod \"service-ca-9c57cc56f-tdjqn\" (UID: \"8af9e6ec-1eee-4d33-a4c3-efb55657ecf9\") " pod="openshift-service-ca/service-ca-9c57cc56f-tdjqn" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.307448 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/386f85c9-e984-405c-af7a-225fb5bcfcaf-registration-dir\") pod \"csi-hostpathplugin-bbq6p\" (UID: \"386f85c9-e984-405c-af7a-225fb5bcfcaf\") " pod="hostpath-provisioner/csi-hostpathplugin-bbq6p" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.307452 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1ba66d7a-1fb5-4149-883b-b19428e7c2cb-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-5pgn9\" (UID: \"1ba66d7a-1fb5-4149-883b-b19428e7c2cb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5pgn9" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.307500 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/386f85c9-e984-405c-af7a-225fb5bcfcaf-csi-data-dir\") pod \"csi-hostpathplugin-bbq6p\" (UID: \"386f85c9-e984-405c-af7a-225fb5bcfcaf\") " pod="hostpath-provisioner/csi-hostpathplugin-bbq6p" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.307581 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/92d3e212-f442-49de-af58-7a4efc70a68f-trusted-ca\") pod \"console-operator-58897d9998-7tsr2\" (UID: 
\"92d3e212-f442-49de-af58-7a4efc70a68f\") " pod="openshift-console-operator/console-operator-58897d9998-7tsr2" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.308363 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/3641143a-a3c8-4ff3-8f5f-783d428411ae-etcd-service-ca\") pod \"etcd-operator-b45778765-m2w78\" (UID: \"3641143a-a3c8-4ff3-8f5f-783d428411ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-m2w78" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.306713 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/612ba8e8-eee6-4b30-967b-d838fb05147e-config\") pod \"kube-controller-manager-operator-78b949d7b-drkhf\" (UID: \"612ba8e8-eee6-4b30-967b-d838fb05147e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-drkhf" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.308598 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-l8cj4\" (UID: \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l8cj4" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.308722 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb1a5702-c3b2-4f2f-997d-585725f89e4a-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-8fm22\" (UID: \"eb1a5702-c3b2-4f2f-997d-585725f89e4a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8fm22" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.308726 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-l8cj4\" (UID: \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l8cj4" Dec 06 05:45:52 crc kubenswrapper[4733]: E1206 05:45:52.309245 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 05:45:52.809223909 +0000 UTC m=+136.674435020 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7rlhm" (UID: "6e073151-939a-4209-8cd7-39116b0165f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.309491 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/197ee618-405f-4f94-a618-da74488f0d23-config\") pod \"route-controller-manager-6576b87f9c-fx652\" (UID: \"197ee618-405f-4f94-a618-da74488f0d23\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fx652" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.309577 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/197ee618-405f-4f94-a618-da74488f0d23-client-ca\") pod \"route-controller-manager-6576b87f9c-fx652\" (UID: \"197ee618-405f-4f94-a618-da74488f0d23\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fx652" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.309724 4733 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bb0fb709-5a66-42a8-aad4-c405502ce542-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-jpw8l\" (UID: \"bb0fb709-5a66-42a8-aad4-c405502ce542\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jpw8l" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.309735 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1ba66d7a-1fb5-4149-883b-b19428e7c2cb-audit-policies\") pod \"apiserver-7bbb656c7d-5pgn9\" (UID: \"1ba66d7a-1fb5-4149-883b-b19428e7c2cb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5pgn9" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.309875 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-l8cj4\" (UID: \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l8cj4" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.309916 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1fbc70fe-f193-4a2e-9a9f-2981b6c72a56-auth-proxy-config\") pod \"machine-config-operator-74547568cd-x4xrx\" (UID: \"1fbc70fe-f193-4a2e-9a9f-2981b6c72a56\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-x4xrx" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.309954 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-l8cj4\" (UID: \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-l8cj4" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.309961 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6e073151-939a-4209-8cd7-39116b0165f0-trusted-ca\") pod \"image-registry-697d97f7c8-7rlhm\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.312961 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/ca0da215-5c31-4c91-939c-77e95ab4a568-default-certificate\") pod \"router-default-5444994796-t668l\" (UID: \"ca0da215-5c31-4c91-939c-77e95ab4a568\") " pod="openshift-ingress/router-default-5444994796-t668l" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.313173 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4416c700-40b6-4e24-b003-6e503a8c8533-trusted-ca-bundle\") pod \"apiserver-76f77b778f-hxt8v\" (UID: \"4416c700-40b6-4e24-b003-6e503a8c8533\") " pod="openshift-apiserver/apiserver-76f77b778f-hxt8v" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.313519 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24c32dd0-469b-4cd4-9468-92604fbec4a1-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-7njc6\" (UID: \"24c32dd0-469b-4cd4-9468-92604fbec4a1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7njc6" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.310001 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1690f8e-c151-4c06-b52e-b51e769af54f-serving-cert\") pod 
\"kube-apiserver-operator-766d6c64bb-4cpb6\" (UID: \"d1690f8e-c151-4c06-b52e-b51e769af54f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4cpb6" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.313731 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb0fb709-5a66-42a8-aad4-c405502ce542-serving-cert\") pod \"controller-manager-879f6c89f-jpw8l\" (UID: \"bb0fb709-5a66-42a8-aad4-c405502ce542\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jpw8l" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.313848 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832-console-serving-cert\") pod \"console-f9d7485db-tbqkj\" (UID: \"17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832\") " pod="openshift-console/console-f9d7485db-tbqkj" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.314080 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-l8cj4\" (UID: \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l8cj4" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.314418 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/34ef48cd-203c-41e4-99ce-64d24203d4c0-metrics-tls\") pod \"dns-operator-744455d44c-ntght\" (UID: \"34ef48cd-203c-41e4-99ce-64d24203d4c0\") " pod="openshift-dns-operator/dns-operator-744455d44c-ntght" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.314557 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fg9bg\" (UniqueName: 
\"kubernetes.io/projected/497bc6aa-1f54-4fd3-b2af-eb564609b96e-kube-api-access-fg9bg\") pod \"service-ca-operator-777779d784-kb6cg\" (UID: \"497bc6aa-1f54-4fd3-b2af-eb564609b96e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-kb6cg" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.314542 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3641143a-a3c8-4ff3-8f5f-783d428411ae-serving-cert\") pod \"etcd-operator-b45778765-m2w78\" (UID: \"3641143a-a3c8-4ff3-8f5f-783d428411ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-m2w78" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.314614 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/71229745-aa94-4aa5-90c8-95d65fcca563-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-v42cq\" (UID: \"71229745-aa94-4aa5-90c8-95d65fcca563\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v42cq" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.314748 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/883d952f-b02d-4b4c-b686-f77b921c77ae-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-8mlc5\" (UID: \"883d952f-b02d-4b4c-b686-f77b921c77ae\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8mlc5" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.314763 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ca0da215-5c31-4c91-939c-77e95ab4a568-metrics-certs\") pod \"router-default-5444994796-t668l\" (UID: \"ca0da215-5c31-4c91-939c-77e95ab4a568\") " pod="openshift-ingress/router-default-5444994796-t668l" Dec 06 05:45:52 crc 
kubenswrapper[4733]: I1206 05:45:52.314866 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ba66d7a-1fb5-4149-883b-b19428e7c2cb-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-5pgn9\" (UID: \"1ba66d7a-1fb5-4149-883b-b19428e7c2cb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5pgn9" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.314898 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6e073151-939a-4209-8cd7-39116b0165f0-trusted-ca\") pod \"image-registry-697d97f7c8-7rlhm\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.314905 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/197ee618-405f-4f94-a618-da74488f0d23-serving-cert\") pod \"route-controller-manager-6576b87f9c-fx652\" (UID: \"197ee618-405f-4f94-a618-da74488f0d23\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fx652" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.315184 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24c32dd0-469b-4cd4-9468-92604fbec4a1-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-7njc6\" (UID: \"24c32dd0-469b-4cd4-9468-92604fbec4a1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7njc6" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.315220 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/d330f5cc-abab-4367-902f-97e41685007f-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-6l9dt\" (UID: 
\"d330f5cc-abab-4367-902f-97e41685007f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6l9dt" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.315277 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ba35ba23-fd5c-47ca-bf00-dd4dcede4997-metrics-tls\") pod \"ingress-operator-5b745b69d9-mgwdw\" (UID: \"ba35ba23-fd5c-47ca-bf00-dd4dcede4997\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mgwdw" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.315329 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1ed0db75-d198-42d8-ac27-91145205f42c-secret-volume\") pod \"collect-profiles-29416665-87xkl\" (UID: \"1ed0db75-d198-42d8-ac27-91145205f42c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416665-87xkl" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.315483 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njdmd\" (UniqueName: \"kubernetes.io/projected/8b7a185f-09db-4aa3-9ece-15e0b7a21098-kube-api-access-njdmd\") pod \"catalog-operator-68c6474976-dlzz7\" (UID: \"8b7a185f-09db-4aa3-9ece-15e0b7a21098\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dlzz7" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.315514 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ae9085f-b1b1-4550-9edc-80ea6bd0ef9d-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xqz8g\" (UID: \"5ae9085f-b1b1-4550-9edc-80ea6bd0ef9d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xqz8g" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.315515 4733 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-l8cj4\" (UID: \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l8cj4" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.315559 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1ba66d7a-1fb5-4149-883b-b19428e7c2cb-encryption-config\") pod \"apiserver-7bbb656c7d-5pgn9\" (UID: \"1ba66d7a-1fb5-4149-883b-b19428e7c2cb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5pgn9" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.315695 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1ba66d7a-1fb5-4149-883b-b19428e7c2cb-etcd-client\") pod \"apiserver-7bbb656c7d-5pgn9\" (UID: \"1ba66d7a-1fb5-4149-883b-b19428e7c2cb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5pgn9" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.315714 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/4416c700-40b6-4e24-b003-6e503a8c8533-audit\") pod \"apiserver-76f77b778f-hxt8v\" (UID: \"4416c700-40b6-4e24-b003-6e503a8c8533\") " pod="openshift-apiserver/apiserver-76f77b778f-hxt8v" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.315813 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24c32dd0-469b-4cd4-9468-92604fbec4a1-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-7njc6\" (UID: \"24c32dd0-469b-4cd4-9468-92604fbec4a1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7njc6" Dec 06 05:45:52 crc kubenswrapper[4733]: 
I1206 05:45:52.315941 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ba66d7a-1fb5-4149-883b-b19428e7c2cb-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-5pgn9\" (UID: \"1ba66d7a-1fb5-4149-883b-b19428e7c2cb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5pgn9" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.315948 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1fbc70fe-f193-4a2e-9a9f-2981b6c72a56-images\") pod \"machine-config-operator-74547568cd-x4xrx\" (UID: \"1fbc70fe-f193-4a2e-9a9f-2981b6c72a56\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-x4xrx" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.315989 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/497bc6aa-1f54-4fd3-b2af-eb564609b96e-serving-cert\") pod \"service-ca-operator-777779d784-kb6cg\" (UID: \"497bc6aa-1f54-4fd3-b2af-eb564609b96e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-kb6cg" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.316232 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832-console-oauth-config\") pod \"console-f9d7485db-tbqkj\" (UID: \"17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832\") " pod="openshift-console/console-f9d7485db-tbqkj" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.316297 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-l8cj4\" (UID: \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-l8cj4" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.316329 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/4416c700-40b6-4e24-b003-6e503a8c8533-audit\") pod \"apiserver-76f77b778f-hxt8v\" (UID: \"4416c700-40b6-4e24-b003-6e503a8c8533\") " pod="openshift-apiserver/apiserver-76f77b778f-hxt8v" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.316353 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b396c172-2cca-48e0-85bd-192fe03d0f93-node-bootstrap-token\") pod \"machine-config-server-cdfjx\" (UID: \"b396c172-2cca-48e0-85bd-192fe03d0f93\") " pod="openshift-machine-config-operator/machine-config-server-cdfjx" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.316384 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb1a5702-c3b2-4f2f-997d-585725f89e4a-serving-cert\") pod \"authentication-operator-69f744f599-8fm22\" (UID: \"eb1a5702-c3b2-4f2f-997d-585725f89e4a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8fm22" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.316408 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb1a5702-c3b2-4f2f-997d-585725f89e4a-service-ca-bundle\") pod \"authentication-operator-69f744f599-8fm22\" (UID: \"eb1a5702-c3b2-4f2f-997d-585725f89e4a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8fm22" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.316430 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/612ba8e8-eee6-4b30-967b-d838fb05147e-serving-cert\") pod 
\"kube-controller-manager-operator-78b949d7b-drkhf\" (UID: \"612ba8e8-eee6-4b30-967b-d838fb05147e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-drkhf" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.316502 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b914e7ba-8f78-466b-81c1-7e4bca4c4f56-config-volume\") pod \"dns-default-rlgkt\" (UID: \"b914e7ba-8f78-466b-81c1-7e4bca4c4f56\") " pod="openshift-dns/dns-default-rlgkt" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.316549 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a420a769-7492-4f06-ad4a-f4126e155429-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-4799x\" (UID: \"a420a769-7492-4f06-ad4a-f4126e155429\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4799x" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.316569 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3641143a-a3c8-4ff3-8f5f-783d428411ae-etcd-client\") pod \"etcd-operator-b45778765-m2w78\" (UID: \"3641143a-a3c8-4ff3-8f5f-783d428411ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-m2w78" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.316596 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8h884\" (UniqueName: \"kubernetes.io/projected/d330f5cc-abab-4367-902f-97e41685007f-kube-api-access-8h884\") pod \"machine-api-operator-5694c8668f-6l9dt\" (UID: \"d330f5cc-abab-4367-902f-97e41685007f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6l9dt" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.318236 4733 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1fbc70fe-f193-4a2e-9a9f-2981b6c72a56-images\") pod \"machine-config-operator-74547568cd-x4xrx\" (UID: \"1fbc70fe-f193-4a2e-9a9f-2981b6c72a56\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-x4xrx" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.318589 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1690f8e-c151-4c06-b52e-b51e769af54f-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-4cpb6\" (UID: \"d1690f8e-c151-4c06-b52e-b51e769af54f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4cpb6" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.318919 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4416c700-40b6-4e24-b003-6e503a8c8533-serving-cert\") pod \"apiserver-76f77b778f-hxt8v\" (UID: \"4416c700-40b6-4e24-b003-6e503a8c8533\") " pod="openshift-apiserver/apiserver-76f77b778f-hxt8v" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.318950 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/d330f5cc-abab-4367-902f-97e41685007f-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-6l9dt\" (UID: \"d330f5cc-abab-4367-902f-97e41685007f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6l9dt" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.319000 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/8af9e6ec-1eee-4d33-a4c3-efb55657ecf9-signing-key\") pod \"service-ca-9c57cc56f-tdjqn\" (UID: \"8af9e6ec-1eee-4d33-a4c3-efb55657ecf9\") " pod="openshift-service-ca/service-ca-9c57cc56f-tdjqn" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.319047 
4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb1a5702-c3b2-4f2f-997d-585725f89e4a-service-ca-bundle\") pod \"authentication-operator-69f744f599-8fm22\" (UID: \"eb1a5702-c3b2-4f2f-997d-585725f89e4a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8fm22" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.319069 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n78cv\" (UniqueName: \"kubernetes.io/projected/386f85c9-e984-405c-af7a-225fb5bcfcaf-kube-api-access-n78cv\") pod \"csi-hostpathplugin-bbq6p\" (UID: \"386f85c9-e984-405c-af7a-225fb5bcfcaf\") " pod="hostpath-provisioner/csi-hostpathplugin-bbq6p" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.319100 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkvp8\" (UniqueName: \"kubernetes.io/projected/bb0fb709-5a66-42a8-aad4-c405502ce542-kube-api-access-qkvp8\") pod \"controller-manager-879f6c89f-jpw8l\" (UID: \"bb0fb709-5a66-42a8-aad4-c405502ce542\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jpw8l" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.319129 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d1690f8e-c151-4c06-b52e-b51e769af54f-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-4cpb6\" (UID: \"d1690f8e-c151-4c06-b52e-b51e769af54f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4cpb6" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.319187 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/883d952f-b02d-4b4c-b686-f77b921c77ae-config\") pod \"openshift-apiserver-operator-796bbdcf4f-8mlc5\" (UID: 
\"883d952f-b02d-4b4c-b686-f77b921c77ae\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8mlc5" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.319216 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ba35ba23-fd5c-47ca-bf00-dd4dcede4997-bound-sa-token\") pod \"ingress-operator-5b745b69d9-mgwdw\" (UID: \"ba35ba23-fd5c-47ca-bf00-dd4dcede4997\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mgwdw" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.319250 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb1a5702-c3b2-4f2f-997d-585725f89e4a-config\") pod \"authentication-operator-69f744f599-8fm22\" (UID: \"eb1a5702-c3b2-4f2f-997d-585725f89e4a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8fm22" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.319272 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4416c700-40b6-4e24-b003-6e503a8c8533-audit-dir\") pod \"apiserver-76f77b778f-hxt8v\" (UID: \"4416c700-40b6-4e24-b003-6e503a8c8533\") " pod="openshift-apiserver/apiserver-76f77b778f-hxt8v" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.319385 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/497bc6aa-1f54-4fd3-b2af-eb564609b96e-serving-cert\") pod \"service-ca-operator-777779d784-kb6cg\" (UID: \"497bc6aa-1f54-4fd3-b2af-eb564609b96e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-kb6cg" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.319462 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wfxc\" (UniqueName: 
\"kubernetes.io/projected/68e5cd26-b9f4-48c0-a6e1-53d27816aa67-kube-api-access-9wfxc\") pod \"machine-config-controller-84d6567774-m8ljh\" (UID: \"68e5cd26-b9f4-48c0-a6e1-53d27816aa67\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-m8ljh" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.319502 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4e5e6ad5-8339-4e8c-b371-5d07e7aadb38-webhook-cert\") pod \"packageserver-d55dfcdfc-6kr5j\" (UID: \"4e5e6ad5-8339-4e8c-b371-5d07e7aadb38\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kr5j" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.319620 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6e073151-939a-4209-8cd7-39116b0165f0-installation-pull-secrets\") pod \"image-registry-697d97f7c8-7rlhm\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.319470 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4416c700-40b6-4e24-b003-6e503a8c8533-audit-dir\") pod \"apiserver-76f77b778f-hxt8v\" (UID: \"4416c700-40b6-4e24-b003-6e503a8c8533\") " pod="openshift-apiserver/apiserver-76f77b778f-hxt8v" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.319736 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/883d952f-b02d-4b4c-b686-f77b921c77ae-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-8mlc5\" (UID: \"883d952f-b02d-4b4c-b686-f77b921c77ae\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8mlc5" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.319756 
4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb1a5702-c3b2-4f2f-997d-585725f89e4a-config\") pod \"authentication-operator-69f744f599-8fm22\" (UID: \"eb1a5702-c3b2-4f2f-997d-585725f89e4a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8fm22" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.320047 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1ba66d7a-1fb5-4149-883b-b19428e7c2cb-audit-dir\") pod \"apiserver-7bbb656c7d-5pgn9\" (UID: \"1ba66d7a-1fb5-4149-883b-b19428e7c2cb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5pgn9" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.320105 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/a35279b0-48f4-49e5-af04-c471474695f1-available-featuregates\") pod \"openshift-config-operator-7777fb866f-dk7bw\" (UID: \"a35279b0-48f4-49e5-af04-c471474695f1\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dk7bw" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.320139 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/76185470-be08-49f9-ab30-59314702bc08-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-fbnvh\" (UID: \"76185470-be08-49f9-ab30-59314702bc08\") " pod="openshift-marketplace/marketplace-operator-79b997595-fbnvh" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.320170 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb0fb709-5a66-42a8-aad4-c405502ce542-config\") pod \"controller-manager-879f6c89f-jpw8l\" (UID: \"bb0fb709-5a66-42a8-aad4-c405502ce542\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-jpw8l" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.320196 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ba66d7a-1fb5-4149-883b-b19428e7c2cb-serving-cert\") pod \"apiserver-7bbb656c7d-5pgn9\" (UID: \"1ba66d7a-1fb5-4149-883b-b19428e7c2cb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5pgn9" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.320218 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1646b426-e8f8-48bd-83b1-919eb5c8466f-machine-approver-tls\") pod \"machine-approver-56656f9798-97hmm\" (UID: \"1646b426-e8f8-48bd-83b1-919eb5c8466f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-97hmm" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.320239 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdmrc\" (UniqueName: \"kubernetes.io/projected/ca0da215-5c31-4c91-939c-77e95ab4a568-kube-api-access-cdmrc\") pod \"router-default-5444994796-t668l\" (UID: \"ca0da215-5c31-4c91-939c-77e95ab4a568\") " pod="openshift-ingress/router-default-5444994796-t668l" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.320268 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nc6wv\" (UniqueName: \"kubernetes.io/projected/b914e7ba-8f78-466b-81c1-7e4bca4c4f56-kube-api-access-nc6wv\") pod \"dns-default-rlgkt\" (UID: \"b914e7ba-8f78-466b-81c1-7e4bca4c4f56\") " pod="openshift-dns/dns-default-rlgkt" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.320296 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d330f5cc-abab-4367-902f-97e41685007f-images\") pod 
\"machine-api-operator-5694c8668f-6l9dt\" (UID: \"d330f5cc-abab-4367-902f-97e41685007f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6l9dt" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.320338 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9tdz\" (UniqueName: \"kubernetes.io/projected/3d6b28d7-efae-441a-a3bd-2fb8a1f1561c-kube-api-access-k9tdz\") pod \"ingress-canary-mjw9v\" (UID: \"3d6b28d7-efae-441a-a3bd-2fb8a1f1561c\") " pod="openshift-ingress-canary/ingress-canary-mjw9v" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.320362 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1fbc70fe-f193-4a2e-9a9f-2981b6c72a56-proxy-tls\") pod \"machine-config-operator-74547568cd-x4xrx\" (UID: \"1fbc70fe-f193-4a2e-9a9f-2981b6c72a56\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-x4xrx" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.320380 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832-service-ca\") pod \"console-f9d7485db-tbqkj\" (UID: \"17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832\") " pod="openshift-console/console-f9d7485db-tbqkj" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.320402 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832-trusted-ca-bundle\") pod \"console-f9d7485db-tbqkj\" (UID: \"17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832\") " pod="openshift-console/console-f9d7485db-tbqkj" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.320429 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-l8cj4\" (UID: \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l8cj4" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.320470 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbvfs\" (UniqueName: \"kubernetes.io/projected/8af9e6ec-1eee-4d33-a4c3-efb55657ecf9-kube-api-access-wbvfs\") pod \"service-ca-9c57cc56f-tdjqn\" (UID: \"8af9e6ec-1eee-4d33-a4c3-efb55657ecf9\") " pod="openshift-service-ca/service-ca-9c57cc56f-tdjqn" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.320492 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/4416c700-40b6-4e24-b003-6e503a8c8533-encryption-config\") pod \"apiserver-76f77b778f-hxt8v\" (UID: \"4416c700-40b6-4e24-b003-6e503a8c8533\") " pod="openshift-apiserver/apiserver-76f77b778f-hxt8v" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.320519 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-l8cj4\" (UID: \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l8cj4" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.320520 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/883d952f-b02d-4b4c-b686-f77b921c77ae-config\") pod \"openshift-apiserver-operator-796bbdcf4f-8mlc5\" (UID: \"883d952f-b02d-4b4c-b686-f77b921c77ae\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8mlc5" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.320540 
4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d6b28d7-efae-441a-a3bd-2fb8a1f1561c-cert\") pod \"ingress-canary-mjw9v\" (UID: \"3d6b28d7-efae-441a-a3bd-2fb8a1f1561c\") " pod="openshift-ingress-canary/ingress-canary-mjw9v" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.320563 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-l8cj4\" (UID: \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l8cj4" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.320592 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1ed0db75-d198-42d8-ac27-91145205f42c-config-volume\") pod \"collect-profiles-29416665-87xkl\" (UID: \"1ed0db75-d198-42d8-ac27-91145205f42c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416665-87xkl" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.320610 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ba35ba23-fd5c-47ca-bf00-dd4dcede4997-trusted-ca\") pod \"ingress-operator-5b745b69d9-mgwdw\" (UID: \"ba35ba23-fd5c-47ca-bf00-dd4dcede4997\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mgwdw" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.320634 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/3641143a-a3c8-4ff3-8f5f-783d428411ae-etcd-ca\") pod \"etcd-operator-b45778765-m2w78\" (UID: \"3641143a-a3c8-4ff3-8f5f-783d428411ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-m2w78" 
Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.320656 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8b7a185f-09db-4aa3-9ece-15e0b7a21098-profile-collector-cert\") pod \"catalog-operator-68c6474976-dlzz7\" (UID: \"8b7a185f-09db-4aa3-9ece-15e0b7a21098\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dlzz7" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.320678 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ae9085f-b1b1-4550-9edc-80ea6bd0ef9d-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xqz8g\" (UID: \"5ae9085f-b1b1-4550-9edc-80ea6bd0ef9d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xqz8g" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.320717 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chm9m\" (UniqueName: \"kubernetes.io/projected/71229745-aa94-4aa5-90c8-95d65fcca563-kube-api-access-chm9m\") pod \"control-plane-machine-set-operator-78cbb6b69f-v42cq\" (UID: \"71229745-aa94-4aa5-90c8-95d65fcca563\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v42cq" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.320740 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4km5q\" (UniqueName: \"kubernetes.io/projected/197ee618-405f-4f94-a618-da74488f0d23-kube-api-access-4km5q\") pod \"route-controller-manager-6576b87f9c-fx652\" (UID: \"197ee618-405f-4f94-a618-da74488f0d23\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fx652" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.320763 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b75a142e-dce5-4bf9-83da-25f46752b08f-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-b8mkn\" (UID: \"b75a142e-dce5-4bf9-83da-25f46752b08f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-b8mkn" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.320779 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1646b426-e8f8-48bd-83b1-919eb5c8466f-auth-proxy-config\") pod \"machine-approver-56656f9798-97hmm\" (UID: \"1646b426-e8f8-48bd-83b1-919eb5c8466f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-97hmm" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.320801 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmzvh\" (UniqueName: \"kubernetes.io/projected/b75a142e-dce5-4bf9-83da-25f46752b08f-kube-api-access-pmzvh\") pod \"cluster-samples-operator-665b6dd947-b8mkn\" (UID: \"b75a142e-dce5-4bf9-83da-25f46752b08f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-b8mkn" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.320821 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6e073151-939a-4209-8cd7-39116b0165f0-ca-trust-extracted\") pod \"image-registry-697d97f7c8-7rlhm\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.320858 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6e073151-939a-4209-8cd7-39116b0165f0-registry-certificates\") pod \"image-registry-697d97f7c8-7rlhm\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.320881 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-audit-policies\") pod \"oauth-openshift-558db77b4-l8cj4\" (UID: \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l8cj4" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.320902 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a35279b0-48f4-49e5-af04-c471474695f1-serving-cert\") pod \"openshift-config-operator-7777fb866f-dk7bw\" (UID: \"a35279b0-48f4-49e5-af04-c471474695f1\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dk7bw" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.320929 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-l8cj4\" (UID: \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l8cj4" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.320951 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/4416c700-40b6-4e24-b003-6e503a8c8533-image-import-ca\") pod \"apiserver-76f77b778f-hxt8v\" (UID: \"4416c700-40b6-4e24-b003-6e503a8c8533\") " pod="openshift-apiserver/apiserver-76f77b778f-hxt8v" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.320977 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/497bc6aa-1f54-4fd3-b2af-eb564609b96e-config\") pod 
\"service-ca-operator-777779d784-kb6cg\" (UID: \"497bc6aa-1f54-4fd3-b2af-eb564609b96e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-kb6cg" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.320995 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/386f85c9-e984-405c-af7a-225fb5bcfcaf-mountpoint-dir\") pod \"csi-hostpathplugin-bbq6p\" (UID: \"386f85c9-e984-405c-af7a-225fb5bcfcaf\") " pod="hostpath-provisioner/csi-hostpathplugin-bbq6p" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.321022 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3641143a-a3c8-4ff3-8f5f-783d428411ae-config\") pod \"etcd-operator-b45778765-m2w78\" (UID: \"3641143a-a3c8-4ff3-8f5f-783d428411ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-m2w78" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.321046 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctz52\" (UniqueName: \"kubernetes.io/projected/e22cc297-d62d-4a07-8131-062668e5b69a-kube-api-access-ctz52\") pod \"package-server-manager-789f6589d5-52jkz\" (UID: \"e22cc297-d62d-4a07-8131-062668e5b69a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-52jkz" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.321077 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832-console-config\") pod \"console-f9d7485db-tbqkj\" (UID: \"17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832\") " pod="openshift-console/console-f9d7485db-tbqkj" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.321098 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8v8l2\" 
(UniqueName: \"kubernetes.io/projected/1646b426-e8f8-48bd-83b1-919eb5c8466f-kube-api-access-8v8l2\") pod \"machine-approver-56656f9798-97hmm\" (UID: \"1646b426-e8f8-48bd-83b1-919eb5c8466f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-97hmm" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.321110 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d330f5cc-abab-4367-902f-97e41685007f-images\") pod \"machine-api-operator-5694c8668f-6l9dt\" (UID: \"d330f5cc-abab-4367-902f-97e41685007f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6l9dt" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.321121 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lw2wq\" (UniqueName: \"kubernetes.io/projected/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-kube-api-access-lw2wq\") pod \"oauth-openshift-558db77b4-l8cj4\" (UID: \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l8cj4" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.321149 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jk6h7\" (UniqueName: \"kubernetes.io/projected/92d3e212-f442-49de-af58-7a4efc70a68f-kube-api-access-jk6h7\") pod \"console-operator-58897d9998-7tsr2\" (UID: \"92d3e212-f442-49de-af58-7a4efc70a68f\") " pod="openshift-console-operator/console-operator-58897d9998-7tsr2" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.321166 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1ba66d7a-1fb5-4149-883b-b19428e7c2cb-audit-dir\") pod \"apiserver-7bbb656c7d-5pgn9\" (UID: \"1ba66d7a-1fb5-4149-883b-b19428e7c2cb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5pgn9" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.321173 4733 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1690f8e-c151-4c06-b52e-b51e769af54f-config\") pod \"kube-apiserver-operator-766d6c64bb-4cpb6\" (UID: \"d1690f8e-c151-4c06-b52e-b51e769af54f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4cpb6" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.321206 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4416c700-40b6-4e24-b003-6e503a8c8533-etcd-client\") pod \"apiserver-76f77b778f-hxt8v\" (UID: \"4416c700-40b6-4e24-b003-6e503a8c8533\") " pod="openshift-apiserver/apiserver-76f77b778f-hxt8v" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.321229 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9bf9\" (UniqueName: \"kubernetes.io/projected/1ba66d7a-1fb5-4149-883b-b19428e7c2cb-kube-api-access-k9bf9\") pod \"apiserver-7bbb656c7d-5pgn9\" (UID: \"1ba66d7a-1fb5-4149-883b-b19428e7c2cb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5pgn9" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.321251 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgv8w\" (UniqueName: \"kubernetes.io/projected/3641143a-a3c8-4ff3-8f5f-783d428411ae-kube-api-access-jgv8w\") pod \"etcd-operator-b45778765-m2w78\" (UID: \"3641143a-a3c8-4ff3-8f5f-783d428411ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-m2w78" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.321276 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6e073151-939a-4209-8cd7-39116b0165f0-registry-tls\") pod \"image-registry-697d97f7c8-7rlhm\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 
05:45:52.321329 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a420a769-7492-4f06-ad4a-f4126e155429-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-4799x\" (UID: \"a420a769-7492-4f06-ad4a-f4126e155429\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4799x" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.321338 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ld49h\" (UniqueName: \"kubernetes.io/projected/1ed0db75-d198-42d8-ac27-91145205f42c-kube-api-access-ld49h\") pod \"collect-profiles-29416665-87xkl\" (UID: \"1ed0db75-d198-42d8-ac27-91145205f42c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416665-87xkl" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.321364 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ca0da215-5c31-4c91-939c-77e95ab4a568-stats-auth\") pod \"router-default-5444994796-t668l\" (UID: \"ca0da215-5c31-4c91-939c-77e95ab4a568\") " pod="openshift-ingress/router-default-5444994796-t668l" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.321385 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8b7a185f-09db-4aa3-9ece-15e0b7a21098-srv-cert\") pod \"catalog-operator-68c6474976-dlzz7\" (UID: \"8b7a185f-09db-4aa3-9ece-15e0b7a21098\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dlzz7" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.321411 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfcgk\" (UniqueName: \"kubernetes.io/projected/24c32dd0-469b-4cd4-9468-92604fbec4a1-kube-api-access-vfcgk\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-7njc6\" (UID: \"24c32dd0-469b-4cd4-9468-92604fbec4a1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7njc6" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.321435 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/76185470-be08-49f9-ab30-59314702bc08-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-fbnvh\" (UID: \"76185470-be08-49f9-ab30-59314702bc08\") " pod="openshift-marketplace/marketplace-operator-79b997595-fbnvh" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.321462 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-audit-dir\") pod \"oauth-openshift-558db77b4-l8cj4\" (UID: \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l8cj4" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.321482 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/68e5cd26-b9f4-48c0-a6e1-53d27816aa67-proxy-tls\") pod \"machine-config-controller-84d6567774-m8ljh\" (UID: \"68e5cd26-b9f4-48c0-a6e1-53d27816aa67\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-m8ljh" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.321512 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/386f85c9-e984-405c-af7a-225fb5bcfcaf-socket-dir\") pod \"csi-hostpathplugin-bbq6p\" (UID: \"386f85c9-e984-405c-af7a-225fb5bcfcaf\") " pod="hostpath-provisioner/csi-hostpathplugin-bbq6p" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.321539 4733 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6e073151-939a-4209-8cd7-39116b0165f0-bound-sa-token\") pod \"image-registry-697d97f7c8-7rlhm\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.321561 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4416c700-40b6-4e24-b003-6e503a8c8533-etcd-serving-ca\") pod \"apiserver-76f77b778f-hxt8v\" (UID: \"4416c700-40b6-4e24-b003-6e503a8c8533\") " pod="openshift-apiserver/apiserver-76f77b778f-hxt8v" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.321578 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b396c172-2cca-48e0-85bd-192fe03d0f93-certs\") pod \"machine-config-server-cdfjx\" (UID: \"b396c172-2cca-48e0-85bd-192fe03d0f93\") " pod="openshift-machine-config-operator/machine-config-server-cdfjx" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.321599 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bb0fb709-5a66-42a8-aad4-c405502ce542-client-ca\") pod \"controller-manager-879f6c89f-jpw8l\" (UID: \"bb0fb709-5a66-42a8-aad4-c405502ce542\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jpw8l" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.321621 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92d3e212-f442-49de-af58-7a4efc70a68f-config\") pod \"console-operator-58897d9998-7tsr2\" (UID: \"92d3e212-f442-49de-af58-7a4efc70a68f\") " pod="openshift-console-operator/console-operator-58897d9998-7tsr2" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.321669 4733 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/92d3e212-f442-49de-af58-7a4efc70a68f-serving-cert\") pod \"console-operator-58897d9998-7tsr2\" (UID: \"92d3e212-f442-49de-af58-7a4efc70a68f\") " pod="openshift-console-operator/console-operator-58897d9998-7tsr2" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.321694 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8snn\" (UniqueName: \"kubernetes.io/projected/34ef48cd-203c-41e4-99ce-64d24203d4c0-kube-api-access-s8snn\") pod \"dns-operator-744455d44c-ntght\" (UID: \"34ef48cd-203c-41e4-99ce-64d24203d4c0\") " pod="openshift-dns-operator/dns-operator-744455d44c-ntght" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.321705 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1690f8e-c151-4c06-b52e-b51e769af54f-config\") pod \"kube-apiserver-operator-766d6c64bb-4cpb6\" (UID: \"d1690f8e-c151-4c06-b52e-b51e769af54f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4cpb6" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.321733 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a420a769-7492-4f06-ad4a-f4126e155429-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-4799x\" (UID: \"a420a769-7492-4f06-ad4a-f4126e155429\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4799x" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.321762 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ht6sx\" (UniqueName: \"kubernetes.io/projected/b396c172-2cca-48e0-85bd-192fe03d0f93-kube-api-access-ht6sx\") pod \"machine-config-server-cdfjx\" (UID: \"b396c172-2cca-48e0-85bd-192fe03d0f93\") " 
pod="openshift-machine-config-operator/machine-config-server-cdfjx" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.321990 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/a35279b0-48f4-49e5-af04-c471474695f1-available-featuregates\") pod \"openshift-config-operator-7777fb866f-dk7bw\" (UID: \"a35279b0-48f4-49e5-af04-c471474695f1\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dk7bw" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.322372 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1646b426-e8f8-48bd-83b1-919eb5c8466f-auth-proxy-config\") pod \"machine-approver-56656f9798-97hmm\" (UID: \"1646b426-e8f8-48bd-83b1-919eb5c8466f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-97hmm" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.322457 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/612ba8e8-eee6-4b30-967b-d838fb05147e-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-drkhf\" (UID: \"612ba8e8-eee6-4b30-967b-d838fb05147e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-drkhf" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.322525 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/71229745-aa94-4aa5-90c8-95d65fcca563-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-v42cq\" (UID: \"71229745-aa94-4aa5-90c8-95d65fcca563\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v42cq" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.322732 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6e073151-939a-4209-8cd7-39116b0165f0-installation-pull-secrets\") pod \"image-registry-697d97f7c8-7rlhm\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.322793 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb1a5702-c3b2-4f2f-997d-585725f89e4a-serving-cert\") pod \"authentication-operator-69f744f599-8fm22\" (UID: \"eb1a5702-c3b2-4f2f-997d-585725f89e4a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8fm22" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.322742 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6e073151-939a-4209-8cd7-39116b0165f0-ca-trust-extracted\") pod \"image-registry-697d97f7c8-7rlhm\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.322927 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-l8cj4\" (UID: \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l8cj4" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.323195 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-audit-dir\") pod \"oauth-openshift-558db77b4-l8cj4\" (UID: \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l8cj4" Dec 06 05:45:52 crc 
kubenswrapper[4733]: I1206 05:45:52.323490 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1ba66d7a-1fb5-4149-883b-b19428e7c2cb-encryption-config\") pod \"apiserver-7bbb656c7d-5pgn9\" (UID: \"1ba66d7a-1fb5-4149-883b-b19428e7c2cb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5pgn9" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.323517 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-audit-policies\") pod \"oauth-openshift-558db77b4-l8cj4\" (UID: \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l8cj4" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.323811 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832-console-config\") pod \"console-f9d7485db-tbqkj\" (UID: \"17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832\") " pod="openshift-console/console-f9d7485db-tbqkj" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.324278 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3641143a-a3c8-4ff3-8f5f-783d428411ae-config\") pod \"etcd-operator-b45778765-m2w78\" (UID: \"3641143a-a3c8-4ff3-8f5f-783d428411ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-m2w78" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.324366 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb0fb709-5a66-42a8-aad4-c405502ce542-config\") pod \"controller-manager-879f6c89f-jpw8l\" (UID: \"bb0fb709-5a66-42a8-aad4-c405502ce542\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jpw8l" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.324791 
4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ba35ba23-fd5c-47ca-bf00-dd4dcede4997-trusted-ca\") pod \"ingress-operator-5b745b69d9-mgwdw\" (UID: \"ba35ba23-fd5c-47ca-bf00-dd4dcede4997\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mgwdw" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.325082 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832-service-ca\") pod \"console-f9d7485db-tbqkj\" (UID: \"17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832\") " pod="openshift-console/console-f9d7485db-tbqkj" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.325086 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/497bc6aa-1f54-4fd3-b2af-eb564609b96e-config\") pod \"service-ca-operator-777779d784-kb6cg\" (UID: \"497bc6aa-1f54-4fd3-b2af-eb564609b96e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-kb6cg" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.325337 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832-trusted-ca-bundle\") pod \"console-f9d7485db-tbqkj\" (UID: \"17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832\") " pod="openshift-console/console-f9d7485db-tbqkj" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.325356 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/3641143a-a3c8-4ff3-8f5f-783d428411ae-etcd-ca\") pod \"etcd-operator-b45778765-m2w78\" (UID: \"3641143a-a3c8-4ff3-8f5f-783d428411ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-m2w78" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.325702 4733 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a35279b0-48f4-49e5-af04-c471474695f1-serving-cert\") pod \"openshift-config-operator-7777fb866f-dk7bw\" (UID: \"a35279b0-48f4-49e5-af04-c471474695f1\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dk7bw" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.325953 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/4416c700-40b6-4e24-b003-6e503a8c8533-image-import-ca\") pod \"apiserver-76f77b778f-hxt8v\" (UID: \"4416c700-40b6-4e24-b003-6e503a8c8533\") " pod="openshift-apiserver/apiserver-76f77b778f-hxt8v" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.326205 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b75a142e-dce5-4bf9-83da-25f46752b08f-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-b8mkn\" (UID: \"b75a142e-dce5-4bf9-83da-25f46752b08f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-b8mkn" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.326208 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-l8cj4\" (UID: \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l8cj4" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.326352 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832-console-oauth-config\") pod \"console-f9d7485db-tbqkj\" (UID: \"17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832\") " pod="openshift-console/console-f9d7485db-tbqkj" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 
05:45:52.326625 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bb0fb709-5a66-42a8-aad4-c405502ce542-client-ca\") pod \"controller-manager-879f6c89f-jpw8l\" (UID: \"bb0fb709-5a66-42a8-aad4-c405502ce542\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jpw8l" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.326986 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/68e5cd26-b9f4-48c0-a6e1-53d27816aa67-proxy-tls\") pod \"machine-config-controller-84d6567774-m8ljh\" (UID: \"68e5cd26-b9f4-48c0-a6e1-53d27816aa67\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-m8ljh" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.327041 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6e073151-939a-4209-8cd7-39116b0165f0-registry-certificates\") pod \"image-registry-697d97f7c8-7rlhm\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.327470 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1fbc70fe-f193-4a2e-9a9f-2981b6c72a56-proxy-tls\") pod \"machine-config-operator-74547568cd-x4xrx\" (UID: \"1fbc70fe-f193-4a2e-9a9f-2981b6c72a56\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-x4xrx" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.327531 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92d3e212-f442-49de-af58-7a4efc70a68f-config\") pod \"console-operator-58897d9998-7tsr2\" (UID: \"92d3e212-f442-49de-af58-7a4efc70a68f\") " pod="openshift-console-operator/console-operator-58897d9998-7tsr2" 
Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.327771 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4416c700-40b6-4e24-b003-6e503a8c8533-etcd-serving-ca\") pod \"apiserver-76f77b778f-hxt8v\" (UID: \"4416c700-40b6-4e24-b003-6e503a8c8533\") " pod="openshift-apiserver/apiserver-76f77b778f-hxt8v" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.328141 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a420a769-7492-4f06-ad4a-f4126e155429-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-4799x\" (UID: \"a420a769-7492-4f06-ad4a-f4126e155429\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4799x" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.328768 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1646b426-e8f8-48bd-83b1-919eb5c8466f-machine-approver-tls\") pod \"machine-approver-56656f9798-97hmm\" (UID: \"1646b426-e8f8-48bd-83b1-919eb5c8466f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-97hmm" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.328901 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-l8cj4\" (UID: \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l8cj4" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.329438 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/92d3e212-f442-49de-af58-7a4efc70a68f-serving-cert\") pod \"console-operator-58897d9998-7tsr2\" (UID: 
\"92d3e212-f442-49de-af58-7a4efc70a68f\") " pod="openshift-console-operator/console-operator-58897d9998-7tsr2" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.329742 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ba66d7a-1fb5-4149-883b-b19428e7c2cb-serving-cert\") pod \"apiserver-7bbb656c7d-5pgn9\" (UID: \"1ba66d7a-1fb5-4149-883b-b19428e7c2cb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5pgn9" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.330406 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-l8cj4\" (UID: \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l8cj4" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.330449 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/4416c700-40b6-4e24-b003-6e503a8c8533-encryption-config\") pod \"apiserver-76f77b778f-hxt8v\" (UID: \"4416c700-40b6-4e24-b003-6e503a8c8533\") " pod="openshift-apiserver/apiserver-76f77b778f-hxt8v" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.330509 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4416c700-40b6-4e24-b003-6e503a8c8533-etcd-client\") pod \"apiserver-76f77b778f-hxt8v\" (UID: \"4416c700-40b6-4e24-b003-6e503a8c8533\") " pod="openshift-apiserver/apiserver-76f77b778f-hxt8v" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.330598 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-v4-0-config-system-router-certs\") pod 
\"oauth-openshift-558db77b4-l8cj4\" (UID: \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l8cj4" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.330789 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3641143a-a3c8-4ff3-8f5f-783d428411ae-etcd-client\") pod \"etcd-operator-b45778765-m2w78\" (UID: \"3641143a-a3c8-4ff3-8f5f-783d428411ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-m2w78" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.331067 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ca0da215-5c31-4c91-939c-77e95ab4a568-stats-auth\") pod \"router-default-5444994796-t668l\" (UID: \"ca0da215-5c31-4c91-939c-77e95ab4a568\") " pod="openshift-ingress/router-default-5444994796-t668l" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.331450 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6e073151-939a-4209-8cd7-39116b0165f0-registry-tls\") pod \"image-registry-697d97f7c8-7rlhm\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.345260 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72rns\" (UniqueName: \"kubernetes.io/projected/4416c700-40b6-4e24-b003-6e503a8c8533-kube-api-access-72rns\") pod \"apiserver-76f77b778f-hxt8v\" (UID: \"4416c700-40b6-4e24-b003-6e503a8c8533\") " pod="openshift-apiserver/apiserver-76f77b778f-hxt8v" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.365519 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcktb\" (UniqueName: \"kubernetes.io/projected/eb1a5702-c3b2-4f2f-997d-585725f89e4a-kube-api-access-jcktb\") pod 
\"authentication-operator-69f744f599-8fm22\" (UID: \"eb1a5702-c3b2-4f2f-997d-585725f89e4a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8fm22" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.384351 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-8fm22" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.385241 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sd2nf\" (UniqueName: \"kubernetes.io/projected/17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832-kube-api-access-sd2nf\") pod \"console-f9d7485db-tbqkj\" (UID: \"17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832\") " pod="openshift-console/console-f9d7485db-tbqkj" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.405376 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4xxf\" (UniqueName: \"kubernetes.io/projected/883d952f-b02d-4b4c-b686-f77b921c77ae-kube-api-access-m4xxf\") pod \"openshift-apiserver-operator-796bbdcf4f-8mlc5\" (UID: \"883d952f-b02d-4b4c-b686-f77b921c77ae\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8mlc5" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.414544 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kj8gq" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.422874 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.423252 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b396c172-2cca-48e0-85bd-192fe03d0f93-certs\") pod \"machine-config-server-cdfjx\" (UID: \"b396c172-2cca-48e0-85bd-192fe03d0f93\") " pod="openshift-machine-config-operator/machine-config-server-cdfjx" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.423320 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ht6sx\" (UniqueName: \"kubernetes.io/projected/b396c172-2cca-48e0-85bd-192fe03d0f93-kube-api-access-ht6sx\") pod \"machine-config-server-cdfjx\" (UID: \"b396c172-2cca-48e0-85bd-192fe03d0f93\") " pod="openshift-machine-config-operator/machine-config-server-cdfjx" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.423345 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b914e7ba-8f78-466b-81c1-7e4bca4c4f56-metrics-tls\") pod \"dns-default-rlgkt\" (UID: \"b914e7ba-8f78-466b-81c1-7e4bca4c4f56\") " pod="openshift-dns/dns-default-rlgkt" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.423368 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4e5e6ad5-8339-4e8c-b371-5d07e7aadb38-apiservice-cert\") pod \"packageserver-d55dfcdfc-6kr5j\" (UID: 
\"4e5e6ad5-8339-4e8c-b371-5d07e7aadb38\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kr5j" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.423390 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/4e5e6ad5-8339-4e8c-b371-5d07e7aadb38-tmpfs\") pod \"packageserver-d55dfcdfc-6kr5j\" (UID: \"4e5e6ad5-8339-4e8c-b371-5d07e7aadb38\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kr5j" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.423418 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkqlv\" (UniqueName: \"kubernetes.io/projected/36ff56af-0a08-47c1-acc9-699aad3cd439-kube-api-access-rkqlv\") pod \"multus-admission-controller-857f4d67dd-t8xrv\" (UID: \"36ff56af-0a08-47c1-acc9-699aad3cd439\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-t8xrv" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.423438 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/386f85c9-e984-405c-af7a-225fb5bcfcaf-plugins-dir\") pod \"csi-hostpathplugin-bbq6p\" (UID: \"386f85c9-e984-405c-af7a-225fb5bcfcaf\") " pod="hostpath-provisioner/csi-hostpathplugin-bbq6p" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.423455 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qnbt\" (UniqueName: \"kubernetes.io/projected/4e5e6ad5-8339-4e8c-b371-5d07e7aadb38-kube-api-access-4qnbt\") pod \"packageserver-d55dfcdfc-6kr5j\" (UID: \"4e5e6ad5-8339-4e8c-b371-5d07e7aadb38\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kr5j" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.423475 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/5ae9085f-b1b1-4550-9edc-80ea6bd0ef9d-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xqz8g\" (UID: \"5ae9085f-b1b1-4550-9edc-80ea6bd0ef9d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xqz8g" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.423496 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgqwb\" (UniqueName: \"kubernetes.io/projected/aca17857-561a-4f8c-b778-dac5aec3f04f-kube-api-access-rgqwb\") pod \"olm-operator-6b444d44fb-mtlfg\" (UID: \"aca17857-561a-4f8c-b778-dac5aec3f04f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mtlfg" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.423541 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e22cc297-d62d-4a07-8131-062668e5b69a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-52jkz\" (UID: \"e22cc297-d62d-4a07-8131-062668e5b69a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-52jkz" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.423577 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/aca17857-561a-4f8c-b778-dac5aec3f04f-srv-cert\") pod \"olm-operator-6b444d44fb-mtlfg\" (UID: \"aca17857-561a-4f8c-b778-dac5aec3f04f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mtlfg" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.423611 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfpq6\" (UniqueName: \"kubernetes.io/projected/76185470-be08-49f9-ab30-59314702bc08-kube-api-access-gfpq6\") pod \"marketplace-operator-79b997595-fbnvh\" (UID: \"76185470-be08-49f9-ab30-59314702bc08\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-fbnvh" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.423650 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/36ff56af-0a08-47c1-acc9-699aad3cd439-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-t8xrv\" (UID: \"36ff56af-0a08-47c1-acc9-699aad3cd439\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-t8xrv" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.423688 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/aca17857-561a-4f8c-b778-dac5aec3f04f-profile-collector-cert\") pod \"olm-operator-6b444d44fb-mtlfg\" (UID: \"aca17857-561a-4f8c-b778-dac5aec3f04f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mtlfg" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.423707 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/386f85c9-e984-405c-af7a-225fb5bcfcaf-csi-data-dir\") pod \"csi-hostpathplugin-bbq6p\" (UID: \"386f85c9-e984-405c-af7a-225fb5bcfcaf\") " pod="hostpath-provisioner/csi-hostpathplugin-bbq6p" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.423730 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/8af9e6ec-1eee-4d33-a4c3-efb55657ecf9-signing-cabundle\") pod \"service-ca-9c57cc56f-tdjqn\" (UID: \"8af9e6ec-1eee-4d33-a4c3-efb55657ecf9\") " pod="openshift-service-ca/service-ca-9c57cc56f-tdjqn" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.423750 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/386f85c9-e984-405c-af7a-225fb5bcfcaf-registration-dir\") pod 
\"csi-hostpathplugin-bbq6p\" (UID: \"386f85c9-e984-405c-af7a-225fb5bcfcaf\") " pod="hostpath-provisioner/csi-hostpathplugin-bbq6p" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.423793 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1ed0db75-d198-42d8-ac27-91145205f42c-secret-volume\") pod \"collect-profiles-29416665-87xkl\" (UID: \"1ed0db75-d198-42d8-ac27-91145205f42c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416665-87xkl" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.423816 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njdmd\" (UniqueName: \"kubernetes.io/projected/8b7a185f-09db-4aa3-9ece-15e0b7a21098-kube-api-access-njdmd\") pod \"catalog-operator-68c6474976-dlzz7\" (UID: \"8b7a185f-09db-4aa3-9ece-15e0b7a21098\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dlzz7" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.423834 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ae9085f-b1b1-4550-9edc-80ea6bd0ef9d-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xqz8g\" (UID: \"5ae9085f-b1b1-4550-9edc-80ea6bd0ef9d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xqz8g" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.423856 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b914e7ba-8f78-466b-81c1-7e4bca4c4f56-config-volume\") pod \"dns-default-rlgkt\" (UID: \"b914e7ba-8f78-466b-81c1-7e4bca4c4f56\") " pod="openshift-dns/dns-default-rlgkt" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.423877 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: 
\"kubernetes.io/secret/b396c172-2cca-48e0-85bd-192fe03d0f93-node-bootstrap-token\") pod \"machine-config-server-cdfjx\" (UID: \"b396c172-2cca-48e0-85bd-192fe03d0f93\") " pod="openshift-machine-config-operator/machine-config-server-cdfjx" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.423882 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/4e5e6ad5-8339-4e8c-b371-5d07e7aadb38-tmpfs\") pod \"packageserver-d55dfcdfc-6kr5j\" (UID: \"4e5e6ad5-8339-4e8c-b371-5d07e7aadb38\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kr5j" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.423909 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/8af9e6ec-1eee-4d33-a4c3-efb55657ecf9-signing-key\") pod \"service-ca-9c57cc56f-tdjqn\" (UID: \"8af9e6ec-1eee-4d33-a4c3-efb55657ecf9\") " pod="openshift-service-ca/service-ca-9c57cc56f-tdjqn" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.423958 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n78cv\" (UniqueName: \"kubernetes.io/projected/386f85c9-e984-405c-af7a-225fb5bcfcaf-kube-api-access-n78cv\") pod \"csi-hostpathplugin-bbq6p\" (UID: \"386f85c9-e984-405c-af7a-225fb5bcfcaf\") " pod="hostpath-provisioner/csi-hostpathplugin-bbq6p" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.423995 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4e5e6ad5-8339-4e8c-b371-5d07e7aadb38-webhook-cert\") pod \"packageserver-d55dfcdfc-6kr5j\" (UID: \"4e5e6ad5-8339-4e8c-b371-5d07e7aadb38\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kr5j" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.424025 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/76185470-be08-49f9-ab30-59314702bc08-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-fbnvh\" (UID: \"76185470-be08-49f9-ab30-59314702bc08\") " pod="openshift-marketplace/marketplace-operator-79b997595-fbnvh" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.424034 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/386f85c9-e984-405c-af7a-225fb5bcfcaf-plugins-dir\") pod \"csi-hostpathplugin-bbq6p\" (UID: \"386f85c9-e984-405c-af7a-225fb5bcfcaf\") " pod="hostpath-provisioner/csi-hostpathplugin-bbq6p" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.424046 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nc6wv\" (UniqueName: \"kubernetes.io/projected/b914e7ba-8f78-466b-81c1-7e4bca4c4f56-kube-api-access-nc6wv\") pod \"dns-default-rlgkt\" (UID: \"b914e7ba-8f78-466b-81c1-7e4bca4c4f56\") " pod="openshift-dns/dns-default-rlgkt" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.424151 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9tdz\" (UniqueName: \"kubernetes.io/projected/3d6b28d7-efae-441a-a3bd-2fb8a1f1561c-kube-api-access-k9tdz\") pod \"ingress-canary-mjw9v\" (UID: \"3d6b28d7-efae-441a-a3bd-2fb8a1f1561c\") " pod="openshift-ingress-canary/ingress-canary-mjw9v" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.424203 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbvfs\" (UniqueName: \"kubernetes.io/projected/8af9e6ec-1eee-4d33-a4c3-efb55657ecf9-kube-api-access-wbvfs\") pod \"service-ca-9c57cc56f-tdjqn\" (UID: \"8af9e6ec-1eee-4d33-a4c3-efb55657ecf9\") " pod="openshift-service-ca/service-ca-9c57cc56f-tdjqn" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.424231 4733 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1ed0db75-d198-42d8-ac27-91145205f42c-config-volume\") pod \"collect-profiles-29416665-87xkl\" (UID: \"1ed0db75-d198-42d8-ac27-91145205f42c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416665-87xkl" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.424259 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d6b28d7-efae-441a-a3bd-2fb8a1f1561c-cert\") pod \"ingress-canary-mjw9v\" (UID: \"3d6b28d7-efae-441a-a3bd-2fb8a1f1561c\") " pod="openshift-ingress-canary/ingress-canary-mjw9v" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.424282 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8b7a185f-09db-4aa3-9ece-15e0b7a21098-profile-collector-cert\") pod \"catalog-operator-68c6474976-dlzz7\" (UID: \"8b7a185f-09db-4aa3-9ece-15e0b7a21098\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dlzz7" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.424331 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ae9085f-b1b1-4550-9edc-80ea6bd0ef9d-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xqz8g\" (UID: \"5ae9085f-b1b1-4550-9edc-80ea6bd0ef9d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xqz8g" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.424402 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/386f85c9-e984-405c-af7a-225fb5bcfcaf-mountpoint-dir\") pod \"csi-hostpathplugin-bbq6p\" (UID: \"386f85c9-e984-405c-af7a-225fb5bcfcaf\") " pod="hostpath-provisioner/csi-hostpathplugin-bbq6p" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.424437 4733 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctz52\" (UniqueName: \"kubernetes.io/projected/e22cc297-d62d-4a07-8131-062668e5b69a-kube-api-access-ctz52\") pod \"package-server-manager-789f6589d5-52jkz\" (UID: \"e22cc297-d62d-4a07-8131-062668e5b69a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-52jkz" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.424489 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ld49h\" (UniqueName: \"kubernetes.io/projected/1ed0db75-d198-42d8-ac27-91145205f42c-kube-api-access-ld49h\") pod \"collect-profiles-29416665-87xkl\" (UID: \"1ed0db75-d198-42d8-ac27-91145205f42c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416665-87xkl" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.424522 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8b7a185f-09db-4aa3-9ece-15e0b7a21098-srv-cert\") pod \"catalog-operator-68c6474976-dlzz7\" (UID: \"8b7a185f-09db-4aa3-9ece-15e0b7a21098\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dlzz7" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.424553 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/76185470-be08-49f9-ab30-59314702bc08-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-fbnvh\" (UID: \"76185470-be08-49f9-ab30-59314702bc08\") " pod="openshift-marketplace/marketplace-operator-79b997595-fbnvh" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.424580 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/386f85c9-e984-405c-af7a-225fb5bcfcaf-socket-dir\") pod \"csi-hostpathplugin-bbq6p\" (UID: \"386f85c9-e984-405c-af7a-225fb5bcfcaf\") " 
pod="hostpath-provisioner/csi-hostpathplugin-bbq6p" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.424709 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/386f85c9-e984-405c-af7a-225fb5bcfcaf-socket-dir\") pod \"csi-hostpathplugin-bbq6p\" (UID: \"386f85c9-e984-405c-af7a-225fb5bcfcaf\") " pod="hostpath-provisioner/csi-hostpathplugin-bbq6p" Dec 06 05:45:52 crc kubenswrapper[4733]: E1206 05:45:52.425272 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 05:45:52.925247092 +0000 UTC m=+136.790458194 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.425416 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/386f85c9-e984-405c-af7a-225fb5bcfcaf-mountpoint-dir\") pod \"csi-hostpathplugin-bbq6p\" (UID: \"386f85c9-e984-405c-af7a-225fb5bcfcaf\") " pod="hostpath-provisioner/csi-hostpathplugin-bbq6p" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.425492 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/386f85c9-e984-405c-af7a-225fb5bcfcaf-registration-dir\") pod \"csi-hostpathplugin-bbq6p\" (UID: \"386f85c9-e984-405c-af7a-225fb5bcfcaf\") " 
pod="hostpath-provisioner/csi-hostpathplugin-bbq6p" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.425560 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ae9085f-b1b1-4550-9edc-80ea6bd0ef9d-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xqz8g\" (UID: \"5ae9085f-b1b1-4550-9edc-80ea6bd0ef9d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xqz8g" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.425620 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/386f85c9-e984-405c-af7a-225fb5bcfcaf-csi-data-dir\") pod \"csi-hostpathplugin-bbq6p\" (UID: \"386f85c9-e984-405c-af7a-225fb5bcfcaf\") " pod="hostpath-provisioner/csi-hostpathplugin-bbq6p" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.425782 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/8af9e6ec-1eee-4d33-a4c3-efb55657ecf9-signing-cabundle\") pod \"service-ca-9c57cc56f-tdjqn\" (UID: \"8af9e6ec-1eee-4d33-a4c3-efb55657ecf9\") " pod="openshift-service-ca/service-ca-9c57cc56f-tdjqn" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.426424 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b914e7ba-8f78-466b-81c1-7e4bca4c4f56-config-volume\") pod \"dns-default-rlgkt\" (UID: \"b914e7ba-8f78-466b-81c1-7e4bca4c4f56\") " pod="openshift-dns/dns-default-rlgkt" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.426957 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1ed0db75-d198-42d8-ac27-91145205f42c-config-volume\") pod \"collect-profiles-29416665-87xkl\" (UID: \"1ed0db75-d198-42d8-ac27-91145205f42c\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29416665-87xkl" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.427418 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a420a769-7492-4f06-ad4a-f4126e155429-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-4799x\" (UID: \"a420a769-7492-4f06-ad4a-f4126e155429\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4799x" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.427978 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/76185470-be08-49f9-ab30-59314702bc08-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-fbnvh\" (UID: \"76185470-be08-49f9-ab30-59314702bc08\") " pod="openshift-marketplace/marketplace-operator-79b997595-fbnvh" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.428614 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8b7a185f-09db-4aa3-9ece-15e0b7a21098-srv-cert\") pod \"catalog-operator-68c6474976-dlzz7\" (UID: \"8b7a185f-09db-4aa3-9ece-15e0b7a21098\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dlzz7" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.428719 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b396c172-2cca-48e0-85bd-192fe03d0f93-certs\") pod \"machine-config-server-cdfjx\" (UID: \"b396c172-2cca-48e0-85bd-192fe03d0f93\") " pod="openshift-machine-config-operator/machine-config-server-cdfjx" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.429233 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4e5e6ad5-8339-4e8c-b371-5d07e7aadb38-apiservice-cert\") pod 
\"packageserver-d55dfcdfc-6kr5j\" (UID: \"4e5e6ad5-8339-4e8c-b371-5d07e7aadb38\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kr5j" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.429331 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4e5e6ad5-8339-4e8c-b371-5d07e7aadb38-webhook-cert\") pod \"packageserver-d55dfcdfc-6kr5j\" (UID: \"4e5e6ad5-8339-4e8c-b371-5d07e7aadb38\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kr5j" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.429466 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b914e7ba-8f78-466b-81c1-7e4bca4c4f56-metrics-tls\") pod \"dns-default-rlgkt\" (UID: \"b914e7ba-8f78-466b-81c1-7e4bca4c4f56\") " pod="openshift-dns/dns-default-rlgkt" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.429835 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e22cc297-d62d-4a07-8131-062668e5b69a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-52jkz\" (UID: \"e22cc297-d62d-4a07-8131-062668e5b69a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-52jkz" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.430002 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/76185470-be08-49f9-ab30-59314702bc08-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-fbnvh\" (UID: \"76185470-be08-49f9-ab30-59314702bc08\") " pod="openshift-marketplace/marketplace-operator-79b997595-fbnvh" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.430029 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: 
\"kubernetes.io/secret/b396c172-2cca-48e0-85bd-192fe03d0f93-node-bootstrap-token\") pod \"machine-config-server-cdfjx\" (UID: \"b396c172-2cca-48e0-85bd-192fe03d0f93\") " pod="openshift-machine-config-operator/machine-config-server-cdfjx" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.430048 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/aca17857-561a-4f8c-b778-dac5aec3f04f-profile-collector-cert\") pod \"olm-operator-6b444d44fb-mtlfg\" (UID: \"aca17857-561a-4f8c-b778-dac5aec3f04f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mtlfg" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.430475 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/8af9e6ec-1eee-4d33-a4c3-efb55657ecf9-signing-key\") pod \"service-ca-9c57cc56f-tdjqn\" (UID: \"8af9e6ec-1eee-4d33-a4c3-efb55657ecf9\") " pod="openshift-service-ca/service-ca-9c57cc56f-tdjqn" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.430653 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/aca17857-561a-4f8c-b778-dac5aec3f04f-srv-cert\") pod \"olm-operator-6b444d44fb-mtlfg\" (UID: \"aca17857-561a-4f8c-b778-dac5aec3f04f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mtlfg" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.431604 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/36ff56af-0a08-47c1-acc9-699aad3cd439-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-t8xrv\" (UID: \"36ff56af-0a08-47c1-acc9-699aad3cd439\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-t8xrv" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.431734 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8b7a185f-09db-4aa3-9ece-15e0b7a21098-profile-collector-cert\") pod \"catalog-operator-68c6474976-dlzz7\" (UID: \"8b7a185f-09db-4aa3-9ece-15e0b7a21098\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dlzz7" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.431751 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1ed0db75-d198-42d8-ac27-91145205f42c-secret-volume\") pod \"collect-profiles-29416665-87xkl\" (UID: \"1ed0db75-d198-42d8-ac27-91145205f42c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416665-87xkl" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.431762 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d6b28d7-efae-441a-a3bd-2fb8a1f1561c-cert\") pod \"ingress-canary-mjw9v\" (UID: \"3d6b28d7-efae-441a-a3bd-2fb8a1f1561c\") " pod="openshift-ingress-canary/ingress-canary-mjw9v" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.433194 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ae9085f-b1b1-4550-9edc-80ea6bd0ef9d-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xqz8g\" (UID: \"5ae9085f-b1b1-4550-9edc-80ea6bd0ef9d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xqz8g" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.447526 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4895\" (UniqueName: \"kubernetes.io/projected/6e073151-939a-4209-8cd7-39116b0165f0-kube-api-access-l4895\") pod \"image-registry-697d97f7c8-7rlhm\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.466969 4733 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxcfc\" (UniqueName: \"kubernetes.io/projected/a35279b0-48f4-49e5-af04-c471474695f1-kube-api-access-lxcfc\") pod \"openshift-config-operator-7777fb866f-dk7bw\" (UID: \"a35279b0-48f4-49e5-af04-c471474695f1\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dk7bw" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.487689 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqmvh\" (UniqueName: \"kubernetes.io/projected/a420a769-7492-4f06-ad4a-f4126e155429-kube-api-access-zqmvh\") pod \"cluster-image-registry-operator-dc59b4c8b-4799x\" (UID: \"a420a769-7492-4f06-ad4a-f4126e155429\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4799x" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.506851 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ldgs\" (UniqueName: \"kubernetes.io/projected/1a84775b-e1f9-4699-af95-16a181527cf2-kube-api-access-8ldgs\") pod \"migrator-59844c95c7-jfdvl\" (UID: \"1a84775b-e1f9-4699-af95-16a181527cf2\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jfdvl" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.524390 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-8fm22"] Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.525851 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7rlhm\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:45:52 crc kubenswrapper[4733]: E1206 05:45:52.526160 4733 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 05:45:53.026144896 +0000 UTC m=+136.891356007 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7rlhm" (UID: "6e073151-939a-4209-8cd7-39116b0165f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.527765 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/612ba8e8-eee6-4b30-967b-d838fb05147e-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-drkhf\" (UID: \"612ba8e8-eee6-4b30-967b-d838fb05147e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-drkhf" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.536337 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-hxt8v" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.546750 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwzwd\" (UniqueName: \"kubernetes.io/projected/ba35ba23-fd5c-47ca-bf00-dd4dcede4997-kube-api-access-dwzwd\") pod \"ingress-operator-5b745b69d9-mgwdw\" (UID: \"ba35ba23-fd5c-47ca-bf00-dd4dcede4997\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mgwdw" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.553045 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kj8gq"] Dec 06 05:45:52 crc kubenswrapper[4733]: W1206 05:45:52.562854 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod549f5614_6b98_454e_970a_e623fd4ec9a8.slice/crio-dfe032a0f0c187841b1db69668451717ca311700c105c6eee8c3d20ce6317bda WatchSource:0}: Error finding container dfe032a0f0c187841b1db69668451717ca311700c105c6eee8c3d20ce6317bda: Status 404 returned error can't find the container with id dfe032a0f0c187841b1db69668451717ca311700c105c6eee8c3d20ce6317bda Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.566742 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-567vh\" (UniqueName: \"kubernetes.io/projected/1fbc70fe-f193-4a2e-9a9f-2981b6c72a56-kube-api-access-567vh\") pod \"machine-config-operator-74547568cd-x4xrx\" (UID: \"1fbc70fe-f193-4a2e-9a9f-2981b6c72a56\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-x4xrx" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.586760 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlhlb\" (UniqueName: \"kubernetes.io/projected/6b7ac5ac-4296-4eb6-8eeb-f5978c268f2d-kube-api-access-tlhlb\") pod \"downloads-7954f5f757-5rtwt\" 
(UID: \"6b7ac5ac-4296-4eb6-8eeb-f5978c268f2d\") " pod="openshift-console/downloads-7954f5f757-5rtwt" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.607011 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fg9bg\" (UniqueName: \"kubernetes.io/projected/497bc6aa-1f54-4fd3-b2af-eb564609b96e-kube-api-access-fg9bg\") pod \"service-ca-operator-777779d784-kb6cg\" (UID: \"497bc6aa-1f54-4fd3-b2af-eb564609b96e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-kb6cg" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.608023 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4799x" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.616141 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dk7bw" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.626617 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 05:45:52 crc kubenswrapper[4733]: E1206 05:45:52.626796 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 05:45:53.126767834 +0000 UTC m=+136.991978945 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.627067 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7rlhm\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:45:52 crc kubenswrapper[4733]: E1206 05:45:52.627471 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 05:45:53.127456478 +0000 UTC m=+136.992667590 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7rlhm" (UID: "6e073151-939a-4209-8cd7-39116b0165f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.635731 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8mlc5" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.650696 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8h884\" (UniqueName: \"kubernetes.io/projected/d330f5cc-abab-4367-902f-97e41685007f-kube-api-access-8h884\") pod \"machine-api-operator-5694c8668f-6l9dt\" (UID: \"d330f5cc-abab-4367-902f-97e41685007f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6l9dt" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.665398 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-tbqkj" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.665443 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d1690f8e-c151-4c06-b52e-b51e769af54f-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-4cpb6\" (UID: \"d1690f8e-c151-4c06-b52e-b51e769af54f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4cpb6" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.696508 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-hxt8v"] Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.699127 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ba35ba23-fd5c-47ca-bf00-dd4dcede4997-bound-sa-token\") pod \"ingress-operator-5b745b69d9-mgwdw\" (UID: \"ba35ba23-fd5c-47ca-bf00-dd4dcede4997\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mgwdw" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.701012 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-5rtwt" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.707753 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-drkhf" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.708287 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wfxc\" (UniqueName: \"kubernetes.io/projected/68e5cd26-b9f4-48c0-a6e1-53d27816aa67-kube-api-access-9wfxc\") pod \"machine-config-controller-84d6567774-m8ljh\" (UID: \"68e5cd26-b9f4-48c0-a6e1-53d27816aa67\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-m8ljh" Dec 06 05:45:52 crc kubenswrapper[4733]: W1206 05:45:52.710011 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4416c700_40b6_4e24_b003_6e503a8c8533.slice/crio-a5f17c507c99ffc97bfd6190737a49e5f95ddef52e213516942c0320c2379f20 WatchSource:0}: Error finding container a5f17c507c99ffc97bfd6190737a49e5f95ddef52e213516942c0320c2379f20: Status 404 returned error can't find the container with id a5f17c507c99ffc97bfd6190737a49e5f95ddef52e213516942c0320c2379f20 Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.726721 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4cpb6" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.727694 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkvp8\" (UniqueName: \"kubernetes.io/projected/bb0fb709-5a66-42a8-aad4-c405502ce542-kube-api-access-qkvp8\") pod \"controller-manager-879f6c89f-jpw8l\" (UID: \"bb0fb709-5a66-42a8-aad4-c405502ce542\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jpw8l" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.727719 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 05:45:52 crc kubenswrapper[4733]: E1206 05:45:52.727790 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 05:45:53.227772809 +0000 UTC m=+137.092983920 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.728224 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7rlhm\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:45:52 crc kubenswrapper[4733]: E1206 05:45:52.728609 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 05:45:53.228593723 +0000 UTC m=+137.093804834 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7rlhm" (UID: "6e073151-939a-4209-8cd7-39116b0165f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.750464 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jfdvl" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.773920 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmzvh\" (UniqueName: \"kubernetes.io/projected/b75a142e-dce5-4bf9-83da-25f46752b08f-kube-api-access-pmzvh\") pod \"cluster-samples-operator-665b6dd947-b8mkn\" (UID: \"b75a142e-dce5-4bf9-83da-25f46752b08f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-b8mkn" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.774441 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfcgk\" (UniqueName: \"kubernetes.io/projected/24c32dd0-469b-4cd4-9468-92604fbec4a1-kube-api-access-vfcgk\") pod \"kube-storage-version-migrator-operator-b67b599dd-7njc6\" (UID: \"24c32dd0-469b-4cd4-9468-92604fbec4a1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7njc6" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.777389 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-m8ljh" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.787019 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8v8l2\" (UniqueName: \"kubernetes.io/projected/1646b426-e8f8-48bd-83b1-919eb5c8466f-kube-api-access-8v8l2\") pod \"machine-approver-56656f9798-97hmm\" (UID: \"1646b426-e8f8-48bd-83b1-919eb5c8466f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-97hmm" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.789583 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-kb6cg" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.793163 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4799x"] Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.807036 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-x4xrx" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.808568 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lw2wq\" (UniqueName: \"kubernetes.io/projected/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-kube-api-access-lw2wq\") pod \"oauth-openshift-558db77b4-l8cj4\" (UID: \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l8cj4" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.829214 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jk6h7\" (UniqueName: \"kubernetes.io/projected/92d3e212-f442-49de-af58-7a4efc70a68f-kube-api-access-jk6h7\") pod \"console-operator-58897d9998-7tsr2\" (UID: \"92d3e212-f442-49de-af58-7a4efc70a68f\") " pod="openshift-console-operator/console-operator-58897d9998-7tsr2" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.829512 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 05:45:52 crc kubenswrapper[4733]: E1206 05:45:52.829810 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 05:45:53.329793364 +0000 UTC m=+137.195004475 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.829999 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7rlhm\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:45:52 crc kubenswrapper[4733]: E1206 05:45:52.830335 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 05:45:53.330295247 +0000 UTC m=+137.195506358 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7rlhm" (UID: "6e073151-939a-4209-8cd7-39116b0165f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:52 crc kubenswrapper[4733]: W1206 05:45:52.831642 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda420a769_7492_4f06_ad4a_f4126e155429.slice/crio-ace3b210273da1975d5b8d4a05d4b5da930e68000f92f9dfe4d5de6fcef73b6a WatchSource:0}: Error finding container ace3b210273da1975d5b8d4a05d4b5da930e68000f92f9dfe4d5de6fcef73b6a: Status 404 returned error can't find the container with id ace3b210273da1975d5b8d4a05d4b5da930e68000f92f9dfe4d5de6fcef73b6a Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.850647 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9bf9\" (UniqueName: \"kubernetes.io/projected/1ba66d7a-1fb5-4149-883b-b19428e7c2cb-kube-api-access-k9bf9\") pod \"apiserver-7bbb656c7d-5pgn9\" (UID: \"1ba66d7a-1fb5-4149-883b-b19428e7c2cb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5pgn9" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.853393 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-7tsr2" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.860253 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-jpw8l" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.865430 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5pgn9" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.867993 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chm9m\" (UniqueName: \"kubernetes.io/projected/71229745-aa94-4aa5-90c8-95d65fcca563-kube-api-access-chm9m\") pod \"control-plane-machine-set-operator-78cbb6b69f-v42cq\" (UID: \"71229745-aa94-4aa5-90c8-95d65fcca563\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v42cq" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.887817 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-6l9dt" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.897018 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4km5q\" (UniqueName: \"kubernetes.io/projected/197ee618-405f-4f94-a618-da74488f0d23-kube-api-access-4km5q\") pod \"route-controller-manager-6576b87f9c-fx652\" (UID: \"197ee618-405f-4f94-a618-da74488f0d23\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fx652" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.897287 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-97hmm" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.910693 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6e073151-939a-4209-8cd7-39116b0165f0-bound-sa-token\") pod \"image-registry-697d97f7c8-7rlhm\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.922264 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-b8mkn" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.927428 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fx652" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.927662 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdmrc\" (UniqueName: \"kubernetes.io/projected/ca0da215-5c31-4c91-939c-77e95ab4a568-kube-api-access-cdmrc\") pod \"router-default-5444994796-t668l\" (UID: \"ca0da215-5c31-4c91-939c-77e95ab4a568\") " pod="openshift-ingress/router-default-5444994796-t668l" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.931068 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 05:45:52 crc kubenswrapper[4733]: E1206 05:45:52.931212 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 05:45:53.431193753 +0000 UTC m=+137.296404863 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.931429 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7rlhm\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:45:52 crc kubenswrapper[4733]: E1206 05:45:52.931755 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 05:45:53.431747233 +0000 UTC m=+137.296958343 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7rlhm" (UID: "6e073151-939a-4209-8cd7-39116b0165f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.953798 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8snn\" (UniqueName: \"kubernetes.io/projected/34ef48cd-203c-41e4-99ce-64d24203d4c0-kube-api-access-s8snn\") pod \"dns-operator-744455d44c-ntght\" (UID: \"34ef48cd-203c-41e4-99ce-64d24203d4c0\") " pod="openshift-dns-operator/dns-operator-744455d44c-ntght" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.974796 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgv8w\" (UniqueName: \"kubernetes.io/projected/3641143a-a3c8-4ff3-8f5f-783d428411ae-kube-api-access-jgv8w\") pod \"etcd-operator-b45778765-m2w78\" (UID: \"3641143a-a3c8-4ff3-8f5f-783d428411ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-m2w78" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.985172 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-l8cj4" Dec 06 05:45:52 crc kubenswrapper[4733]: I1206 05:45:52.992701 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mgwdw" Dec 06 05:45:53 crc kubenswrapper[4733]: I1206 05:45:53.003656 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ht6sx\" (UniqueName: \"kubernetes.io/projected/b396c172-2cca-48e0-85bd-192fe03d0f93-kube-api-access-ht6sx\") pod \"machine-config-server-cdfjx\" (UID: \"b396c172-2cca-48e0-85bd-192fe03d0f93\") " pod="openshift-machine-config-operator/machine-config-server-cdfjx" Dec 06 05:45:53 crc kubenswrapper[4733]: I1206 05:45:53.009589 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgqwb\" (UniqueName: \"kubernetes.io/projected/aca17857-561a-4f8c-b778-dac5aec3f04f-kube-api-access-rgqwb\") pod \"olm-operator-6b444d44fb-mtlfg\" (UID: \"aca17857-561a-4f8c-b778-dac5aec3f04f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mtlfg" Dec 06 05:45:53 crc kubenswrapper[4733]: I1206 05:45:53.022160 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-ntght" Dec 06 05:45:53 crc kubenswrapper[4733]: I1206 05:45:53.026250 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkqlv\" (UniqueName: \"kubernetes.io/projected/36ff56af-0a08-47c1-acc9-699aad3cd439-kube-api-access-rkqlv\") pod \"multus-admission-controller-857f4d67dd-t8xrv\" (UID: \"36ff56af-0a08-47c1-acc9-699aad3cd439\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-t8xrv" Dec 06 05:45:53 crc kubenswrapper[4733]: I1206 05:45:53.032455 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 05:45:53 crc kubenswrapper[4733]: I1206 05:45:53.033291 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-m2w78" Dec 06 05:45:53 crc kubenswrapper[4733]: E1206 05:45:53.033859 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 05:45:53.533830064 +0000 UTC m=+137.399041175 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:53 crc kubenswrapper[4733]: I1206 05:45:53.038451 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-8fm22" event={"ID":"eb1a5702-c3b2-4f2f-997d-585725f89e4a","Type":"ContainerStarted","Data":"2076acacec9456d56d63b236ca051409fba1f5a52542d570ef5a5bbccb349b99"} Dec 06 05:45:53 crc kubenswrapper[4733]: I1206 05:45:53.038502 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-8fm22" event={"ID":"eb1a5702-c3b2-4f2f-997d-585725f89e4a","Type":"ContainerStarted","Data":"d03f99ad5dab834103393ced19d5553dadc0a281c7f7ea26bb51ac9884026011"} Dec 06 05:45:53 crc kubenswrapper[4733]: I1206 05:45:53.051630 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nc6wv\" (UniqueName: \"kubernetes.io/projected/b914e7ba-8f78-466b-81c1-7e4bca4c4f56-kube-api-access-nc6wv\") pod \"dns-default-rlgkt\" (UID: \"b914e7ba-8f78-466b-81c1-7e4bca4c4f56\") " pod="openshift-dns/dns-default-rlgkt" Dec 06 05:45:53 crc kubenswrapper[4733]: I1206 05:45:53.055325 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-t668l" Dec 06 05:45:53 crc kubenswrapper[4733]: I1206 05:45:53.055752 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-hxt8v" event={"ID":"4416c700-40b6-4e24-b003-6e503a8c8533","Type":"ContainerStarted","Data":"a5f17c507c99ffc97bfd6190737a49e5f95ddef52e213516942c0320c2379f20"} Dec 06 05:45:53 crc kubenswrapper[4733]: I1206 05:45:53.064007 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7njc6" Dec 06 05:45:53 crc kubenswrapper[4733]: I1206 05:45:53.069533 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kj8gq" event={"ID":"549f5614-6b98-454e-970a-e623fd4ec9a8","Type":"ContainerStarted","Data":"02303a77d7796685a0a830360ecbc1dd2d652f93d33b05adfd088b8a3c38ee4f"} Dec 06 05:45:53 crc kubenswrapper[4733]: I1206 05:45:53.069570 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kj8gq" event={"ID":"549f5614-6b98-454e-970a-e623fd4ec9a8","Type":"ContainerStarted","Data":"dfe032a0f0c187841b1db69668451717ca311700c105c6eee8c3d20ce6317bda"} Dec 06 05:45:53 crc kubenswrapper[4733]: I1206 05:45:53.070290 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ld49h\" (UniqueName: \"kubernetes.io/projected/1ed0db75-d198-42d8-ac27-91145205f42c-kube-api-access-ld49h\") pod \"collect-profiles-29416665-87xkl\" (UID: \"1ed0db75-d198-42d8-ac27-91145205f42c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416665-87xkl" Dec 06 05:45:53 crc kubenswrapper[4733]: I1206 05:45:53.074769 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-97hmm" event={"ID":"1646b426-e8f8-48bd-83b1-919eb5c8466f","Type":"ContainerStarted","Data":"a3d4c33dff5971dc4e1a290f916a9827678cf2e2319ee56ba3e75dc2d952c26a"} Dec 06 05:45:53 crc kubenswrapper[4733]: I1206 05:45:53.080942 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8mlc5"] Dec 06 05:45:53 crc kubenswrapper[4733]: I1206 05:45:53.083485 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v42cq" Dec 06 05:45:53 crc kubenswrapper[4733]: I1206 05:45:53.084002 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4799x" event={"ID":"a420a769-7492-4f06-ad4a-f4126e155429","Type":"ContainerStarted","Data":"ace3b210273da1975d5b8d4a05d4b5da930e68000f92f9dfe4d5de6fcef73b6a"} Dec 06 05:45:53 crc kubenswrapper[4733]: I1206 05:45:53.093994 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-dk7bw"] Dec 06 05:45:53 crc kubenswrapper[4733]: I1206 05:45:53.102499 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctz52\" (UniqueName: \"kubernetes.io/projected/e22cc297-d62d-4a07-8131-062668e5b69a-kube-api-access-ctz52\") pod \"package-server-manager-789f6589d5-52jkz\" (UID: \"e22cc297-d62d-4a07-8131-062668e5b69a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-52jkz" Dec 06 05:45:53 crc kubenswrapper[4733]: I1206 05:45:53.107610 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfpq6\" (UniqueName: \"kubernetes.io/projected/76185470-be08-49f9-ab30-59314702bc08-kube-api-access-gfpq6\") pod \"marketplace-operator-79b997595-fbnvh\" (UID: \"76185470-be08-49f9-ab30-59314702bc08\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-fbnvh" Dec 06 05:45:53 crc kubenswrapper[4733]: I1206 05:45:53.111973 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-t8xrv" Dec 06 05:45:53 crc kubenswrapper[4733]: I1206 05:45:53.124603 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416665-87xkl" Dec 06 05:45:53 crc kubenswrapper[4733]: I1206 05:45:53.129163 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-fbnvh" Dec 06 05:45:53 crc kubenswrapper[4733]: I1206 05:45:53.131358 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n78cv\" (UniqueName: \"kubernetes.io/projected/386f85c9-e984-405c-af7a-225fb5bcfcaf-kube-api-access-n78cv\") pod \"csi-hostpathplugin-bbq6p\" (UID: \"386f85c9-e984-405c-af7a-225fb5bcfcaf\") " pod="hostpath-provisioner/csi-hostpathplugin-bbq6p" Dec 06 05:45:53 crc kubenswrapper[4733]: I1206 05:45:53.137886 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7rlhm\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:45:53 crc kubenswrapper[4733]: E1206 05:45:53.142131 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 05:45:53.642110965 +0000 UTC m=+137.507322076 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7rlhm" (UID: "6e073151-939a-4209-8cd7-39116b0165f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:53 crc kubenswrapper[4733]: I1206 05:45:53.142773 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-tbqkj"] Dec 06 05:45:53 crc kubenswrapper[4733]: I1206 05:45:53.144347 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-5rtwt"] Dec 06 05:45:53 crc kubenswrapper[4733]: I1206 05:45:53.149628 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njdmd\" (UniqueName: \"kubernetes.io/projected/8b7a185f-09db-4aa3-9ece-15e0b7a21098-kube-api-access-njdmd\") pod \"catalog-operator-68c6474976-dlzz7\" (UID: \"8b7a185f-09db-4aa3-9ece-15e0b7a21098\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dlzz7" Dec 06 05:45:53 crc kubenswrapper[4733]: I1206 05:45:53.168103 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9tdz\" (UniqueName: \"kubernetes.io/projected/3d6b28d7-efae-441a-a3bd-2fb8a1f1561c-kube-api-access-k9tdz\") pod \"ingress-canary-mjw9v\" (UID: \"3d6b28d7-efae-441a-a3bd-2fb8a1f1561c\") " pod="openshift-ingress-canary/ingress-canary-mjw9v" Dec 06 05:45:53 crc kubenswrapper[4733]: W1206 05:45:53.169352 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda35279b0_48f4_49e5_af04_c471474695f1.slice/crio-9338d8e98c34b40c2045cd6c1a9154c625bca452549d88f399d279285dccaa1a WatchSource:0}: Error finding container 
9338d8e98c34b40c2045cd6c1a9154c625bca452549d88f399d279285dccaa1a: Status 404 returned error can't find the container with id 9338d8e98c34b40c2045cd6c1a9154c625bca452549d88f399d279285dccaa1a Dec 06 05:45:53 crc kubenswrapper[4733]: I1206 05:45:53.176074 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-bbq6p" Dec 06 05:45:53 crc kubenswrapper[4733]: I1206 05:45:53.178457 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mtlfg" Dec 06 05:45:53 crc kubenswrapper[4733]: I1206 05:45:53.183852 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-52jkz" Dec 06 05:45:53 crc kubenswrapper[4733]: I1206 05:45:53.190793 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-rlgkt" Dec 06 05:45:53 crc kubenswrapper[4733]: I1206 05:45:53.194187 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qnbt\" (UniqueName: \"kubernetes.io/projected/4e5e6ad5-8339-4e8c-b371-5d07e7aadb38-kube-api-access-4qnbt\") pod \"packageserver-d55dfcdfc-6kr5j\" (UID: \"4e5e6ad5-8339-4e8c-b371-5d07e7aadb38\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kr5j" Dec 06 05:45:53 crc kubenswrapper[4733]: I1206 05:45:53.195181 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-cdfjx" Dec 06 05:45:53 crc kubenswrapper[4733]: I1206 05:45:53.196419 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-mjw9v" Dec 06 05:45:53 crc kubenswrapper[4733]: I1206 05:45:53.209362 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5ae9085f-b1b1-4550-9edc-80ea6bd0ef9d-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xqz8g\" (UID: \"5ae9085f-b1b1-4550-9edc-80ea6bd0ef9d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xqz8g" Dec 06 05:45:53 crc kubenswrapper[4733]: W1206 05:45:53.217467 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod883d952f_b02d_4b4c_b686_f77b921c77ae.slice/crio-3c590877d014c4f4f3d0f8d7e9b1c53fe590e65a3629f1631524802d803cea48 WatchSource:0}: Error finding container 3c590877d014c4f4f3d0f8d7e9b1c53fe590e65a3629f1631524802d803cea48: Status 404 returned error can't find the container with id 3c590877d014c4f4f3d0f8d7e9b1c53fe590e65a3629f1631524802d803cea48 Dec 06 05:45:53 crc kubenswrapper[4733]: I1206 05:45:53.231907 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-jfdvl"] Dec 06 05:45:53 crc kubenswrapper[4733]: I1206 05:45:53.242387 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 05:45:53 crc kubenswrapper[4733]: E1206 05:45:53.243226 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-06 05:45:53.743208455 +0000 UTC m=+137.608419566 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:53 crc kubenswrapper[4733]: I1206 05:45:53.250375 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbvfs\" (UniqueName: \"kubernetes.io/projected/8af9e6ec-1eee-4d33-a4c3-efb55657ecf9-kube-api-access-wbvfs\") pod \"service-ca-9c57cc56f-tdjqn\" (UID: \"8af9e6ec-1eee-4d33-a4c3-efb55657ecf9\") " pod="openshift-service-ca/service-ca-9c57cc56f-tdjqn" Dec 06 05:45:53 crc kubenswrapper[4733]: I1206 05:45:53.270440 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-drkhf"] Dec 06 05:45:53 crc kubenswrapper[4733]: I1206 05:45:53.272086 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-5pgn9"] Dec 06 05:45:53 crc kubenswrapper[4733]: I1206 05:45:53.326978 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-x4xrx"] Dec 06 05:45:53 crc kubenswrapper[4733]: I1206 05:45:53.344973 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7rlhm\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:45:53 crc kubenswrapper[4733]: I1206 05:45:53.350267 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-kb6cg"] Dec 06 05:45:53 crc kubenswrapper[4733]: I1206 05:45:53.350318 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4cpb6"] Dec 06 05:45:53 crc kubenswrapper[4733]: E1206 05:45:53.356631 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 05:45:53.856596706 +0000 UTC m=+137.721807817 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7rlhm" (UID: "6e073151-939a-4209-8cd7-39116b0165f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:53 crc kubenswrapper[4733]: I1206 05:45:53.364974 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-m8ljh"] Dec 06 05:45:53 crc kubenswrapper[4733]: I1206 05:45:53.418133 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kr5j" Dec 06 05:45:53 crc kubenswrapper[4733]: I1206 05:45:53.435027 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dlzz7" Dec 06 05:45:53 crc kubenswrapper[4733]: I1206 05:45:53.440622 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xqz8g" Dec 06 05:45:53 crc kubenswrapper[4733]: I1206 05:45:53.447283 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-tdjqn" Dec 06 05:45:53 crc kubenswrapper[4733]: I1206 05:45:53.463331 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 05:45:53 crc kubenswrapper[4733]: E1206 05:45:53.463939 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 05:45:53.963921309 +0000 UTC m=+137.829132420 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:53 crc kubenswrapper[4733]: I1206 05:45:53.524108 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-jpw8l"] Dec 06 05:45:53 crc kubenswrapper[4733]: I1206 05:45:53.545761 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-7tsr2"] Dec 06 05:45:53 crc kubenswrapper[4733]: I1206 05:45:53.564680 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7rlhm\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:45:53 crc kubenswrapper[4733]: E1206 05:45:53.564992 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 05:45:54.064978383 +0000 UTC m=+137.930189494 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7rlhm" (UID: "6e073151-939a-4209-8cd7-39116b0165f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:53 crc kubenswrapper[4733]: I1206 05:45:53.668582 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 05:45:53 crc kubenswrapper[4733]: E1206 05:45:53.668898 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 05:45:54.168871767 +0000 UTC m=+138.034082879 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:53 crc kubenswrapper[4733]: I1206 05:45:53.669801 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7rlhm\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:45:53 crc kubenswrapper[4733]: E1206 05:45:53.671068 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 05:45:54.171044571 +0000 UTC m=+138.036255683 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7rlhm" (UID: "6e073151-939a-4209-8cd7-39116b0165f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:53 crc kubenswrapper[4733]: I1206 05:45:53.770979 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 05:45:53 crc kubenswrapper[4733]: E1206 05:45:53.771762 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 05:45:54.271724106 +0000 UTC m=+138.136935216 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:53 crc kubenswrapper[4733]: I1206 05:45:53.771945 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7rlhm\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:45:53 crc kubenswrapper[4733]: E1206 05:45:53.772479 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 05:45:54.272470538 +0000 UTC m=+138.137681649 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7rlhm" (UID: "6e073151-939a-4209-8cd7-39116b0165f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:53 crc kubenswrapper[4733]: I1206 05:45:53.875787 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 05:45:53 crc kubenswrapper[4733]: E1206 05:45:53.876584 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 05:45:54.37656491 +0000 UTC m=+138.241776012 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:53 crc kubenswrapper[4733]: I1206 05:45:53.981104 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7rlhm\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:45:53 crc kubenswrapper[4733]: E1206 05:45:53.981442 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 05:45:54.481428889 +0000 UTC m=+138.346640000 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7rlhm" (UID: "6e073151-939a-4209-8cd7-39116b0165f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.085899 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 05:45:54 crc kubenswrapper[4733]: E1206 05:45:54.086651 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 05:45:54.586627157 +0000 UTC m=+138.451838268 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.086926 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7rlhm\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:45:54 crc kubenswrapper[4733]: E1206 05:45:54.087226 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 05:45:54.587214691 +0000 UTC m=+138.452425801 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7rlhm" (UID: "6e073151-939a-4209-8cd7-39116b0165f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.100419 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-7tsr2" event={"ID":"92d3e212-f442-49de-af58-7a4efc70a68f","Type":"ContainerStarted","Data":"cf141981a7428fd1cdc65f81ab531f0f40fd6314b8ab7d94aa1710911155eab8"} Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.113006 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-kb6cg" event={"ID":"497bc6aa-1f54-4fd3-b2af-eb564609b96e","Type":"ContainerStarted","Data":"0dc35c82fcce07a987944a4e67cbb712e5dc2ba9e903ae37326422cd4f3ab421"} Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.113047 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-kb6cg" event={"ID":"497bc6aa-1f54-4fd3-b2af-eb564609b96e","Type":"ContainerStarted","Data":"a63bbd0707257b8ff9d4b0ad3dea95c2c6aa320f635e128f63cdf6d23d84722c"} Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.119883 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-x4xrx" event={"ID":"1fbc70fe-f193-4a2e-9a9f-2981b6c72a56","Type":"ContainerStarted","Data":"682cd9f14a30e89e5cb773940373d909c3d6124524c3d988fe8c388eb8686a1d"} Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.119924 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-x4xrx" event={"ID":"1fbc70fe-f193-4a2e-9a9f-2981b6c72a56","Type":"ContainerStarted","Data":"b5c802b604889ac93b6acaa95f2f20646fd63925b4f14bec88368bd22b6a4295"} Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.128025 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-drkhf" event={"ID":"612ba8e8-eee6-4b30-967b-d838fb05147e","Type":"ContainerStarted","Data":"6a8cba5a99bfb04b4c7babf7d40d2e374ad6d9527bedfed368d27991a577a069"} Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.128066 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-drkhf" event={"ID":"612ba8e8-eee6-4b30-967b-d838fb05147e","Type":"ContainerStarted","Data":"4b3831c4c4df80df7b5b7971a213c6c2c3a867c66373fe11126543c7958fc1b3"} Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.132451 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-m8ljh" event={"ID":"68e5cd26-b9f4-48c0-a6e1-53d27816aa67","Type":"ContainerStarted","Data":"24ae19bb68ffc61cd3d30fdc43faccee62933f606b122118e1f0d099b1fd4ff2"} Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.132487 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-m8ljh" event={"ID":"68e5cd26-b9f4-48c0-a6e1-53d27816aa67","Type":"ContainerStarted","Data":"e8e6a53d3ec4e88819e86871c0993f5835d6e4e40f4af538aec2fcbff93714a4"} Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.146970 4733 generic.go:334] "Generic (PLEG): container finished" podID="a35279b0-48f4-49e5-af04-c471474695f1" containerID="e7b3db8ac2ae4f488335a3f67e3c88b9c4ce1af97c89be3dcbab4a4f6abbe6bf" exitCode=0 Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.147034 4733 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dk7bw" event={"ID":"a35279b0-48f4-49e5-af04-c471474695f1","Type":"ContainerDied","Data":"e7b3db8ac2ae4f488335a3f67e3c88b9c4ce1af97c89be3dcbab4a4f6abbe6bf"} Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.147071 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dk7bw" event={"ID":"a35279b0-48f4-49e5-af04-c471474695f1","Type":"ContainerStarted","Data":"9338d8e98c34b40c2045cd6c1a9154c625bca452549d88f399d279285dccaa1a"} Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.157495 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4799x" event={"ID":"a420a769-7492-4f06-ad4a-f4126e155429","Type":"ContainerStarted","Data":"3769c175ec46c30695f4372150cc2973fae51c73289e1b3e22c860d55b5975bf"} Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.158991 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4cpb6" event={"ID":"d1690f8e-c151-4c06-b52e-b51e769af54f","Type":"ContainerStarted","Data":"41e1162aec6f1be075357002cafd4b476820e4b79d31d51b7efe746bb3d5fba7"} Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.160747 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-tbqkj" event={"ID":"17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832","Type":"ContainerStarted","Data":"3aeb61aef6c38a5adfd5e8a912e1b5c08eecd8198c12bb2fc69154b083c9fba8"} Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.160773 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-tbqkj" event={"ID":"17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832","Type":"ContainerStarted","Data":"5bdfd9c5bfd805594e5003a7d4a847af823720ca7c0543d466a08d1270d93b33"} Dec 06 05:45:54 crc kubenswrapper[4733]: 
I1206 05:45:54.164498 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-jpw8l" event={"ID":"bb0fb709-5a66-42a8-aad4-c405502ce542","Type":"ContainerStarted","Data":"f2df924a5b34baec6c589c50a5142bbdd939dc160b2c1446d9c8885693fe6924"} Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.165516 4733 generic.go:334] "Generic (PLEG): container finished" podID="4416c700-40b6-4e24-b003-6e503a8c8533" containerID="0ebd1e8c62b2db206b6944d597a458a8376794fa47990c92acc833c147e06642" exitCode=0 Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.165571 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-hxt8v" event={"ID":"4416c700-40b6-4e24-b003-6e503a8c8533","Type":"ContainerDied","Data":"0ebd1e8c62b2db206b6944d597a458a8376794fa47990c92acc833c147e06642"} Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.171247 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jfdvl" event={"ID":"1a84775b-e1f9-4699-af95-16a181527cf2","Type":"ContainerStarted","Data":"924382a0083e0bb68bc3f788f6fa07a2e5cd09ad5ff43d056579e680938a7d0d"} Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.171279 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jfdvl" event={"ID":"1a84775b-e1f9-4699-af95-16a181527cf2","Type":"ContainerStarted","Data":"a87bd152793d674d936f6dcb3fe052f33dbc4127c235244725c295931ddfda6f"} Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.173369 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-5rtwt" event={"ID":"6b7ac5ac-4296-4eb6-8eeb-f5978c268f2d","Type":"ContainerStarted","Data":"611252822b858e8f09908b018d2c65692e9e9dc227559262f39e1f6ce3f2fcbd"} Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.173425 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/downloads-7954f5f757-5rtwt" event={"ID":"6b7ac5ac-4296-4eb6-8eeb-f5978c268f2d","Type":"ContainerStarted","Data":"b65392158268458d00a1857787986b83942bad02987b9aee19cea4c5baf5c110"} Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.173745 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-5rtwt" Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.174833 4733 patch_prober.go:28] interesting pod/downloads-7954f5f757-5rtwt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.174891 4733 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5rtwt" podUID="6b7ac5ac-4296-4eb6-8eeb-f5978c268f2d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.176289 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-cdfjx" event={"ID":"b396c172-2cca-48e0-85bd-192fe03d0f93","Type":"ContainerStarted","Data":"cd06af8035ccccf59eac74a483f40ae3e73c1f50439fd808147d216ea9626766"} Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.176354 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-cdfjx" event={"ID":"b396c172-2cca-48e0-85bd-192fe03d0f93","Type":"ContainerStarted","Data":"24fc2062593a959786aae99df7ff8ad1cfd47262e422b66b22a13e43bde7a0fe"} Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.180379 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8mlc5" 
event={"ID":"883d952f-b02d-4b4c-b686-f77b921c77ae","Type":"ContainerStarted","Data":"a8943448cdb3bb9fd5a6a449b6c94d5584a179169c2695a5332d9ce80f1f0940"} Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.180612 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8mlc5" event={"ID":"883d952f-b02d-4b4c-b686-f77b921c77ae","Type":"ContainerStarted","Data":"3c590877d014c4f4f3d0f8d7e9b1c53fe590e65a3629f1631524802d803cea48"} Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.187606 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 05:45:54 crc kubenswrapper[4733]: E1206 05:45:54.188475 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 05:45:54.688459117 +0000 UTC m=+138.553670228 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.195877 4733 generic.go:334] "Generic (PLEG): container finished" podID="1ba66d7a-1fb5-4149-883b-b19428e7c2cb" containerID="d1e4b493f1cd73487ddcfef3f5eefe0ce130dad7dfdd523fee264f13b38c5d8f" exitCode=0 Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.197342 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5pgn9" event={"ID":"1ba66d7a-1fb5-4149-883b-b19428e7c2cb","Type":"ContainerDied","Data":"d1e4b493f1cd73487ddcfef3f5eefe0ce130dad7dfdd523fee264f13b38c5d8f"} Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.197410 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5pgn9" event={"ID":"1ba66d7a-1fb5-4149-883b-b19428e7c2cb","Type":"ContainerStarted","Data":"acf16385715550574be67cf47757e42f7fdc4137dd8bfd082902c08be1d812d2"} Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.206360 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-t668l" event={"ID":"ca0da215-5c31-4c91-939c-77e95ab4a568","Type":"ContainerStarted","Data":"96ebb522d7642e570f227802eae5b907a5b3ca6a93d431f963083071a6113bfe"} Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.206385 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-t668l" 
event={"ID":"ca0da215-5c31-4c91-939c-77e95ab4a568","Type":"ContainerStarted","Data":"cb46ce457681f3293fb425de18d6b07bec5441cb733f7fb46acd70d1b3cf3f8b"} Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.216387 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-ntght"] Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.232349 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-fx652"] Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.256562 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-b8mkn"] Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.266086 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-l8cj4"] Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.266743 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-6l9dt"] Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.273476 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-97hmm" event={"ID":"1646b426-e8f8-48bd-83b1-919eb5c8466f","Type":"ContainerStarted","Data":"beb22ab4f1627258bcd4808dead7e1085dcfa6be7268d43ce7c89bc83ab275e2"} Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.282491 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-mgwdw"] Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.292767 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7rlhm\" (UID: 
\"6e073151-939a-4209-8cd7-39116b0165f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:45:54 crc kubenswrapper[4733]: E1206 05:45:54.307599 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 05:45:54.807579772 +0000 UTC m=+138.672790883 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7rlhm" (UID: "6e073151-939a-4209-8cd7-39116b0165f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.398376 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 05:45:54 crc kubenswrapper[4733]: E1206 05:45:54.399860 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 05:45:54.89983952 +0000 UTC m=+138.765050631 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.422130 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-fbnvh"] Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.425482 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-m2w78"] Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.429884 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-bbq6p"] Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.438214 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7njc6"] Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.446429 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416665-87xkl"] Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.448580 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v42cq"] Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.452541 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-t8xrv"] Dec 06 05:45:54 crc kubenswrapper[4733]: W1206 05:45:54.475629 4733 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36ff56af_0a08_47c1_acc9_699aad3cd439.slice/crio-ca40e7f48255e4837c2886c27299cd94025968880033d105ceb08eac76f47539 WatchSource:0}: Error finding container ca40e7f48255e4837c2886c27299cd94025968880033d105ceb08eac76f47539: Status 404 returned error can't find the container with id ca40e7f48255e4837c2886c27299cd94025968880033d105ceb08eac76f47539 Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.500324 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7rlhm\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:45:54 crc kubenswrapper[4733]: E1206 05:45:54.500713 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 05:45:55.000698661 +0000 UTC m=+138.865909772 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7rlhm" (UID: "6e073151-939a-4209-8cd7-39116b0165f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.546218 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-mjw9v"] Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.559551 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-rlgkt"] Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.562019 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-52jkz"] Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.603257 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 05:45:54 crc kubenswrapper[4733]: E1206 05:45:54.603759 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 05:45:55.103735726 +0000 UTC m=+138.968946837 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.609608 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mtlfg"] Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.629766 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-kb6cg" podStartSLOduration=119.62974924 podStartE2EDuration="1m59.62974924s" podCreationTimestamp="2025-12-06 05:43:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:45:54.628784156 +0000 UTC m=+138.493995258" watchObservedRunningTime="2025-12-06 05:45:54.62974924 +0000 UTC m=+138.494960352" Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.704122 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-5rtwt" podStartSLOduration=120.704103414 podStartE2EDuration="2m0.704103414s" podCreationTimestamp="2025-12-06 05:43:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:45:54.701377841 +0000 UTC m=+138.566588973" watchObservedRunningTime="2025-12-06 05:45:54.704103414 +0000 UTC m=+138.569314525" Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.707951 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7rlhm\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:45:54 crc kubenswrapper[4733]: E1206 05:45:54.708344 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 05:45:55.208330218 +0000 UTC m=+139.073541330 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7rlhm" (UID: "6e073151-939a-4209-8cd7-39116b0165f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.752403 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-cdfjx" podStartSLOduration=4.752388539 podStartE2EDuration="4.752388539s" podCreationTimestamp="2025-12-06 05:45:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:45:54.751048901 +0000 UTC m=+138.616260013" watchObservedRunningTime="2025-12-06 05:45:54.752388539 +0000 UTC m=+138.617599650" Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.793329 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-drkhf" podStartSLOduration=119.793294776 
podStartE2EDuration="1m59.793294776s" podCreationTimestamp="2025-12-06 05:43:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:45:54.787943026 +0000 UTC m=+138.653154137" watchObservedRunningTime="2025-12-06 05:45:54.793294776 +0000 UTC m=+138.658505907" Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.808398 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-tdjqn"] Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.809755 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 05:45:54 crc kubenswrapper[4733]: E1206 05:45:54.811883 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 05:45:55.311843118 +0000 UTC m=+139.177054229 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.824774 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xqz8g"] Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.833879 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kr5j"] Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.836319 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-t668l" podStartSLOduration=119.836292702 podStartE2EDuration="1m59.836292702s" podCreationTimestamp="2025-12-06 05:43:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:45:54.823433893 +0000 UTC m=+138.688645014" watchObservedRunningTime="2025-12-06 05:45:54.836292702 +0000 UTC m=+138.701503813" Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.836910 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dlzz7"] Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.907765 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kj8gq" podStartSLOduration=120.907742508 podStartE2EDuration="2m0.907742508s" podCreationTimestamp="2025-12-06 05:43:54 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:45:54.90744083 +0000 UTC m=+138.772651941" watchObservedRunningTime="2025-12-06 05:45:54.907742508 +0000 UTC m=+138.772953609" Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.913115 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7rlhm\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:45:54 crc kubenswrapper[4733]: E1206 05:45:54.913736 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 05:45:55.413722387 +0000 UTC m=+139.278933498 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7rlhm" (UID: "6e073151-939a-4209-8cd7-39116b0165f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:54 crc kubenswrapper[4733]: W1206 05:45:54.942824 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e5e6ad5_8339_4e8c_b371_5d07e7aadb38.slice/crio-7e581d8d97daa12689e60da95a78541fe7e5e969f6c3522cacd34397dff9f369 WatchSource:0}: Error finding container 7e581d8d97daa12689e60da95a78541fe7e5e969f6c3522cacd34397dff9f369: Status 404 returned error can't find the container with id 7e581d8d97daa12689e60da95a78541fe7e5e969f6c3522cacd34397dff9f369 Dec 06 05:45:54 crc kubenswrapper[4733]: I1206 05:45:54.984801 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4799x" podStartSLOduration=120.984783381 podStartE2EDuration="2m0.984783381s" podCreationTimestamp="2025-12-06 05:43:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:45:54.983817436 +0000 UTC m=+138.849028546" watchObservedRunningTime="2025-12-06 05:45:54.984783381 +0000 UTC m=+138.849994492" Dec 06 05:45:55 crc kubenswrapper[4733]: I1206 05:45:55.013855 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " 
Dec 06 05:45:55 crc kubenswrapper[4733]: E1206 05:45:55.014277 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 05:45:55.514261838 +0000 UTC m=+139.379472949 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:55 crc kubenswrapper[4733]: I1206 05:45:55.039032 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-tbqkj" podStartSLOduration=121.03835389 podStartE2EDuration="2m1.03835389s" podCreationTimestamp="2025-12-06 05:43:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:45:55.038048486 +0000 UTC m=+138.903259597" watchObservedRunningTime="2025-12-06 05:45:55.03835389 +0000 UTC m=+138.903565002" Dec 06 05:45:55 crc kubenswrapper[4733]: I1206 05:45:55.058859 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-t668l" Dec 06 05:45:55 crc kubenswrapper[4733]: I1206 05:45:55.069140 4733 patch_prober.go:28] interesting pod/router-default-5444994796-t668l container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 06 05:45:55 crc kubenswrapper[4733]: [-]has-synced failed: reason 
withheld Dec 06 05:45:55 crc kubenswrapper[4733]: [+]process-running ok Dec 06 05:45:55 crc kubenswrapper[4733]: healthz check failed Dec 06 05:45:55 crc kubenswrapper[4733]: I1206 05:45:55.069196 4733 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t668l" podUID="ca0da215-5c31-4c91-939c-77e95ab4a568" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 05:45:55 crc kubenswrapper[4733]: I1206 05:45:55.073913 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-8fm22" podStartSLOduration=121.073875816 podStartE2EDuration="2m1.073875816s" podCreationTimestamp="2025-12-06 05:43:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:45:55.072373813 +0000 UTC m=+138.937584914" watchObservedRunningTime="2025-12-06 05:45:55.073875816 +0000 UTC m=+138.939086927" Dec 06 05:45:55 crc kubenswrapper[4733]: I1206 05:45:55.118076 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7rlhm\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:45:55 crc kubenswrapper[4733]: E1206 05:45:55.118861 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 05:45:55.61884586 +0000 UTC m=+139.484056971 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7rlhm" (UID: "6e073151-939a-4209-8cd7-39116b0165f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:55 crc kubenswrapper[4733]: I1206 05:45:55.152658 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8mlc5" podStartSLOduration=121.15264105 podStartE2EDuration="2m1.15264105s" podCreationTimestamp="2025-12-06 05:43:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:45:55.151546804 +0000 UTC m=+139.016757905" watchObservedRunningTime="2025-12-06 05:45:55.15264105 +0000 UTC m=+139.017852162" Dec 06 05:45:55 crc kubenswrapper[4733]: I1206 05:45:55.237774 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 05:45:55 crc kubenswrapper[4733]: E1206 05:45:55.238325 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 05:45:55.738291456 +0000 UTC m=+139.603502567 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:55 crc kubenswrapper[4733]: I1206 05:45:55.366220 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7rlhm\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:45:55 crc kubenswrapper[4733]: E1206 05:45:55.366658 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 05:45:55.866645967 +0000 UTC m=+139.731857078 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7rlhm" (UID: "6e073151-939a-4209-8cd7-39116b0165f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:55 crc kubenswrapper[4733]: I1206 05:45:55.375178 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rlgkt" event={"ID":"b914e7ba-8f78-466b-81c1-7e4bca4c4f56","Type":"ContainerStarted","Data":"b6fd8c2b078e0b5eb3e1056e864ab08199dcc9c7eaa1020bb325a27d739be345"} Dec 06 05:45:55 crc kubenswrapper[4733]: I1206 05:45:55.406998 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-7tsr2" event={"ID":"92d3e212-f442-49de-af58-7a4efc70a68f","Type":"ContainerStarted","Data":"4bc9c3a9ff1bf18a0509e92abc86835a1399647c9607a5931185994f2dd1b4ac"} Dec 06 05:45:55 crc kubenswrapper[4733]: I1206 05:45:55.407733 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-7tsr2" Dec 06 05:45:55 crc kubenswrapper[4733]: I1206 05:45:55.418207 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fx652" event={"ID":"197ee618-405f-4f94-a618-da74488f0d23","Type":"ContainerStarted","Data":"e294653a23e8d18b8ef6ae2c3e5f1be65b24de905209163449b9e78cc7b31e8b"} Dec 06 05:45:55 crc kubenswrapper[4733]: I1206 05:45:55.418477 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fx652" 
event={"ID":"197ee618-405f-4f94-a618-da74488f0d23","Type":"ContainerStarted","Data":"1179cb4b594b5ea8f8010b670e768c79f3a339a0fbd543b3caca6620a99711f2"} Dec 06 05:45:55 crc kubenswrapper[4733]: I1206 05:45:55.418799 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fx652" Dec 06 05:45:55 crc kubenswrapper[4733]: I1206 05:45:55.431207 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-hxt8v" event={"ID":"4416c700-40b6-4e24-b003-6e503a8c8533","Type":"ContainerStarted","Data":"c7647b474a602e521f47a7f982b4cbdec409eea51d8e1eb3d2534db467e3f89a"} Dec 06 05:45:55 crc kubenswrapper[4733]: I1206 05:45:55.473116 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 05:45:55 crc kubenswrapper[4733]: E1206 05:45:55.473599 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 05:45:55.973580227 +0000 UTC m=+139.838791327 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:55 crc kubenswrapper[4733]: I1206 05:45:55.486101 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-7tsr2" podStartSLOduration=121.486082977 podStartE2EDuration="2m1.486082977s" podCreationTimestamp="2025-12-06 05:43:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:45:55.460015361 +0000 UTC m=+139.325226472" watchObservedRunningTime="2025-12-06 05:45:55.486082977 +0000 UTC m=+139.351294078" Dec 06 05:45:55 crc kubenswrapper[4733]: I1206 05:45:55.487093 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fx652" podStartSLOduration=120.487087565 podStartE2EDuration="2m0.487087565s" podCreationTimestamp="2025-12-06 05:43:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:45:55.48571213 +0000 UTC m=+139.350923241" watchObservedRunningTime="2025-12-06 05:45:55.487087565 +0000 UTC m=+139.352298666" Dec 06 05:45:55 crc kubenswrapper[4733]: I1206 05:45:55.515121 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-mjw9v" event={"ID":"3d6b28d7-efae-441a-a3bd-2fb8a1f1561c","Type":"ContainerStarted","Data":"3b0883cb07f097348edd0eca4a7873457ccb1064cb5966d72e2ca572e2c41117"} Dec 06 05:45:55 crc 
kubenswrapper[4733]: I1206 05:45:55.541042 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mgwdw" event={"ID":"ba35ba23-fd5c-47ca-bf00-dd4dcede4997","Type":"ContainerStarted","Data":"a7b066ab6e843254ed9f20454d732127fb9554443540c418d7f305c719fbd5a7"} Dec 06 05:45:55 crc kubenswrapper[4733]: I1206 05:45:55.541090 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mgwdw" event={"ID":"ba35ba23-fd5c-47ca-bf00-dd4dcede4997","Type":"ContainerStarted","Data":"0fe0fde1df1fac850f3050e425e8650d228e8491af0e8b2f53cfa459c79dc90a"} Dec 06 05:45:55 crc kubenswrapper[4733]: I1206 05:45:55.562910 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kr5j" event={"ID":"4e5e6ad5-8339-4e8c-b371-5d07e7aadb38","Type":"ContainerStarted","Data":"7e581d8d97daa12689e60da95a78541fe7e5e969f6c3522cacd34397dff9f369"} Dec 06 05:45:55 crc kubenswrapper[4733]: I1206 05:45:55.571345 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v42cq" event={"ID":"71229745-aa94-4aa5-90c8-95d65fcca563","Type":"ContainerStarted","Data":"e1825d0882abd36b6e49d83062c5c98c15068ab63062b4bc288a08ee2fadb480"} Dec 06 05:45:55 crc kubenswrapper[4733]: I1206 05:45:55.571378 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v42cq" event={"ID":"71229745-aa94-4aa5-90c8-95d65fcca563","Type":"ContainerStarted","Data":"645c025aa6acfb280d01792aa5040bf05ba99081e133dddbd66056ed7e8e9155"} Dec 06 05:45:55 crc kubenswrapper[4733]: I1206 05:45:55.580114 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-7rlhm\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:45:55 crc kubenswrapper[4733]: E1206 05:45:55.582264 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 05:45:56.082251402 +0000 UTC m=+139.947462513 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7rlhm" (UID: "6e073151-939a-4209-8cd7-39116b0165f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:55 crc kubenswrapper[4733]: I1206 05:45:55.599634 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dlzz7" event={"ID":"8b7a185f-09db-4aa3-9ece-15e0b7a21098","Type":"ContainerStarted","Data":"7ea4963f77379f428349e322488daa8b3dad993d331c154007394c28371810e3"} Dec 06 05:45:55 crc kubenswrapper[4733]: I1206 05:45:55.613257 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v42cq" podStartSLOduration=120.61323071 podStartE2EDuration="2m0.61323071s" podCreationTimestamp="2025-12-06 05:43:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:45:55.611743875 +0000 UTC m=+139.476954986" watchObservedRunningTime="2025-12-06 05:45:55.61323071 +0000 UTC m=+139.478441821" Dec 06 05:45:55 crc kubenswrapper[4733]: I1206 05:45:55.614968 4733 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xqz8g" event={"ID":"5ae9085f-b1b1-4550-9edc-80ea6bd0ef9d","Type":"ContainerStarted","Data":"fb913b3a8f913d075749cb904d311b7b33516fb1f9c63d6e3bee70551ca9717e"} Dec 06 05:45:55 crc kubenswrapper[4733]: I1206 05:45:55.650521 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-jpw8l" event={"ID":"bb0fb709-5a66-42a8-aad4-c405502ce542","Type":"ContainerStarted","Data":"c273b5dfc7dc7e2158e274d68422ca3cfaeb85c6520c585738a6e88df138e8db"} Dec 06 05:45:55 crc kubenswrapper[4733]: I1206 05:45:55.651461 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-jpw8l" Dec 06 05:45:55 crc kubenswrapper[4733]: I1206 05:45:55.652286 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-7tsr2" Dec 06 05:45:55 crc kubenswrapper[4733]: I1206 05:45:55.708417 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-jpw8l" Dec 06 05:45:55 crc kubenswrapper[4733]: I1206 05:45:55.720260 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 05:45:55 crc kubenswrapper[4733]: E1206 05:45:55.726469 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-06 05:45:56.226440445 +0000 UTC m=+140.091651555 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:55 crc kubenswrapper[4733]: I1206 05:45:55.730695 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7rlhm\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:45:55 crc kubenswrapper[4733]: E1206 05:45:55.733010 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 05:45:56.232991998 +0000 UTC m=+140.098203109 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7rlhm" (UID: "6e073151-939a-4209-8cd7-39116b0165f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:55 crc kubenswrapper[4733]: I1206 05:45:55.733384 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-97hmm" event={"ID":"1646b426-e8f8-48bd-83b1-919eb5c8466f","Type":"ContainerStarted","Data":"a60cfb840dae73183ece76d72ac559d0f844351c5467d2056f1394228074be30"} Dec 06 05:45:55 crc kubenswrapper[4733]: I1206 05:45:55.757153 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-jpw8l" podStartSLOduration=121.757129026 podStartE2EDuration="2m1.757129026s" podCreationTimestamp="2025-12-06 05:43:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:45:55.704929152 +0000 UTC m=+139.570140263" watchObservedRunningTime="2025-12-06 05:45:55.757129026 +0000 UTC m=+139.622340137" Dec 06 05:45:55 crc kubenswrapper[4733]: I1206 05:45:55.769277 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4cpb6" event={"ID":"d1690f8e-c151-4c06-b52e-b51e769af54f","Type":"ContainerStarted","Data":"0266ed3d00f4e52db35f307f393fc89fda7fa6d67bd88d8eab9b50890b24d1da"} Dec 06 05:45:55 crc kubenswrapper[4733]: I1206 05:45:55.795764 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-52jkz" 
event={"ID":"e22cc297-d62d-4a07-8131-062668e5b69a","Type":"ContainerStarted","Data":"f1a83d1f961cc14fd513dd23acb68544482c64fe3d6e3fa82f2f22a91685d3a0"} Dec 06 05:45:55 crc kubenswrapper[4733]: I1206 05:45:55.795810 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-52jkz" event={"ID":"e22cc297-d62d-4a07-8131-062668e5b69a","Type":"ContainerStarted","Data":"c0e822765371c008c90a77c742815bc5c8a35fba84a47de06a71abbffd19f567"} Dec 06 05:45:55 crc kubenswrapper[4733]: I1206 05:45:55.814793 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-6l9dt" event={"ID":"d330f5cc-abab-4367-902f-97e41685007f","Type":"ContainerStarted","Data":"777c5dc86f84fb74b06dbed56cd1e1683a3b380234fff3fc500fd7d0afef9b04"} Dec 06 05:45:55 crc kubenswrapper[4733]: I1206 05:45:55.814826 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-6l9dt" event={"ID":"d330f5cc-abab-4367-902f-97e41685007f","Type":"ContainerStarted","Data":"f03d1d3932677ffcd4e352840b3a26433e35662ade64f18362c6dcda9b596103"} Dec 06 05:45:55 crc kubenswrapper[4733]: I1206 05:45:55.832296 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 05:45:55 crc kubenswrapper[4733]: E1206 05:45:55.833147 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 05:45:56.333131577 +0000 UTC m=+140.198342688 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:55 crc kubenswrapper[4733]: I1206 05:45:55.856801 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-97hmm" podStartSLOduration=121.856784425 podStartE2EDuration="2m1.856784425s" podCreationTimestamp="2025-12-06 05:43:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:45:55.82734915 +0000 UTC m=+139.692560260" watchObservedRunningTime="2025-12-06 05:45:55.856784425 +0000 UTC m=+139.721995536" Dec 06 05:45:55 crc kubenswrapper[4733]: I1206 05:45:55.856939 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4cpb6" podStartSLOduration=120.856935749 podStartE2EDuration="2m0.856935749s" podCreationTimestamp="2025-12-06 05:43:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:45:55.855636617 +0000 UTC m=+139.720847719" watchObservedRunningTime="2025-12-06 05:45:55.856935749 +0000 UTC m=+139.722146850" Dec 06 05:45:55 crc kubenswrapper[4733]: I1206 05:45:55.858071 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7njc6" 
event={"ID":"24c32dd0-469b-4cd4-9468-92604fbec4a1","Type":"ContainerStarted","Data":"3e5dd57f8d64a38a54d9664a6951cfeedf4adfc8ebfe16aaede15555d39c33ef"} Dec 06 05:45:55 crc kubenswrapper[4733]: I1206 05:45:55.858099 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7njc6" event={"ID":"24c32dd0-469b-4cd4-9468-92604fbec4a1","Type":"ContainerStarted","Data":"3fa8ff97178f7c9375269b1407c8a994866aa6f2b511dd4ab0bb99016f438407"} Dec 06 05:45:55 crc kubenswrapper[4733]: I1206 05:45:55.895420 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7njc6" podStartSLOduration=120.895402501 podStartE2EDuration="2m0.895402501s" podCreationTimestamp="2025-12-06 05:43:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:45:55.894437447 +0000 UTC m=+139.759648557" watchObservedRunningTime="2025-12-06 05:45:55.895402501 +0000 UTC m=+139.760613601" Dec 06 05:45:55 crc kubenswrapper[4733]: I1206 05:45:55.912466 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416665-87xkl" event={"ID":"1ed0db75-d198-42d8-ac27-91145205f42c","Type":"ContainerStarted","Data":"281433015c50230243bb4612f9bf0ac2baf19caea7ec2458143f93de4f94a72a"} Dec 06 05:45:55 crc kubenswrapper[4733]: I1206 05:45:55.912501 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416665-87xkl" event={"ID":"1ed0db75-d198-42d8-ac27-91145205f42c","Type":"ContainerStarted","Data":"f4eadaebd22b805003ad1c70423d05ca034e7ba345c315bf05d41b35d2bb1465"} Dec 06 05:45:55 crc kubenswrapper[4733]: I1206 05:45:55.921579 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fx652" Dec 06 05:45:55 crc kubenswrapper[4733]: I1206 05:45:55.937045 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7rlhm\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:45:55 crc kubenswrapper[4733]: E1206 05:45:55.938650 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 05:45:56.438635559 +0000 UTC m=+140.303846670 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7rlhm" (UID: "6e073151-939a-4209-8cd7-39116b0165f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:55 crc kubenswrapper[4733]: I1206 05:45:55.950961 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-m8ljh" event={"ID":"68e5cd26-b9f4-48c0-a6e1-53d27816aa67","Type":"ContainerStarted","Data":"3a60300b785bf9074f36d400323528038ccb126ce11dddd557bea36ec4fffc95"} Dec 06 05:45:55 crc kubenswrapper[4733]: I1206 05:45:55.964379 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29416665-87xkl" podStartSLOduration=55.964368476 podStartE2EDuration="55.964368476s" 
podCreationTimestamp="2025-12-06 05:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:45:55.951450265 +0000 UTC m=+139.816661376" watchObservedRunningTime="2025-12-06 05:45:55.964368476 +0000 UTC m=+139.829579587" Dec 06 05:45:55 crc kubenswrapper[4733]: I1206 05:45:55.964478 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-ntght" event={"ID":"34ef48cd-203c-41e4-99ce-64d24203d4c0","Type":"ContainerStarted","Data":"65c44e495283fa9a7a512f21811744f2426980f705809f901fb4593903a9c5a4"} Dec 06 05:45:55 crc kubenswrapper[4733]: I1206 05:45:55.964516 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-ntght" event={"ID":"34ef48cd-203c-41e4-99ce-64d24203d4c0","Type":"ContainerStarted","Data":"a891ae9a6b4197229bde521f745ae9bb9d1ab2012706fea99affe1250cd9b28a"} Dec 06 05:45:55 crc kubenswrapper[4733]: I1206 05:45:55.967003 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-l8cj4" event={"ID":"3cf2106f-7c73-4086-bc49-fa1b11f2e56f","Type":"ContainerStarted","Data":"05fde8fc82a8f83cfd8e76f124e6b7536ceac2288dd2ccfa0e4b58b6d32f2b95"} Dec 06 05:45:55 crc kubenswrapper[4733]: I1206 05:45:55.985754 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5pgn9" event={"ID":"1ba66d7a-1fb5-4149-883b-b19428e7c2cb","Type":"ContainerStarted","Data":"2ab4cbaba98eb73ffd8fe360a70c17da5eccdc62364219dc6851108f8e708829"} Dec 06 05:45:55 crc kubenswrapper[4733]: I1206 05:45:55.995401 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-bbq6p" event={"ID":"386f85c9-e984-405c-af7a-225fb5bcfcaf","Type":"ContainerStarted","Data":"34cb8c474aa7d79a0180703fb731c5f7ff3978a82eafb755f6ba9b238a8cfcfa"} Dec 06 05:45:56 crc 
kubenswrapper[4733]: I1206 05:45:56.011943 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-m8ljh" podStartSLOduration=121.011931775 podStartE2EDuration="2m1.011931775s" podCreationTimestamp="2025-12-06 05:43:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:45:56.009868076 +0000 UTC m=+139.875079187" watchObservedRunningTime="2025-12-06 05:45:56.011931775 +0000 UTC m=+139.877142886" Dec 06 05:45:56 crc kubenswrapper[4733]: I1206 05:45:56.044200 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 05:45:56 crc kubenswrapper[4733]: E1206 05:45:56.045291 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 05:45:56.545278292 +0000 UTC m=+140.410489403 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:56 crc kubenswrapper[4733]: I1206 05:45:56.059834 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mtlfg" event={"ID":"aca17857-561a-4f8c-b778-dac5aec3f04f","Type":"ContainerStarted","Data":"f951f69eccfdc104b6e85b3a869e50587eda5c75273132478cc37e0cd63fe400"} Dec 06 05:45:56 crc kubenswrapper[4733]: I1206 05:45:56.059878 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mtlfg" event={"ID":"aca17857-561a-4f8c-b778-dac5aec3f04f","Type":"ContainerStarted","Data":"1a8f45fd94cf52f27bade857d5f9960e2237c4e18b4270ebbe45b725b0c84126"} Dec 06 05:45:56 crc kubenswrapper[4733]: I1206 05:45:56.060807 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mtlfg" Dec 06 05:45:56 crc kubenswrapper[4733]: I1206 05:45:56.062347 4733 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-mtlfg container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.41:8443/healthz\": dial tcp 10.217.0.41:8443: connect: connection refused" start-of-body= Dec 06 05:45:56 crc kubenswrapper[4733]: I1206 05:45:56.062381 4733 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mtlfg" podUID="aca17857-561a-4f8c-b778-dac5aec3f04f" containerName="olm-operator" probeResult="failure" output="Get 
\"https://10.217.0.41:8443/healthz\": dial tcp 10.217.0.41:8443: connect: connection refused" Dec 06 05:45:56 crc kubenswrapper[4733]: I1206 05:45:56.077474 4733 patch_prober.go:28] interesting pod/router-default-5444994796-t668l container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 06 05:45:56 crc kubenswrapper[4733]: [-]has-synced failed: reason withheld Dec 06 05:45:56 crc kubenswrapper[4733]: [+]process-running ok Dec 06 05:45:56 crc kubenswrapper[4733]: healthz check failed Dec 06 05:45:56 crc kubenswrapper[4733]: I1206 05:45:56.077507 4733 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t668l" podUID="ca0da215-5c31-4c91-939c-77e95ab4a568" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 05:45:56 crc kubenswrapper[4733]: I1206 05:45:56.077857 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-fbnvh" event={"ID":"76185470-be08-49f9-ab30-59314702bc08","Type":"ContainerStarted","Data":"ce8efb373b2a9d6229dc40f7cfec2885a85447019407c767a164b7240d6d62dc"} Dec 06 05:45:56 crc kubenswrapper[4733]: I1206 05:45:56.077888 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-fbnvh" event={"ID":"76185470-be08-49f9-ab30-59314702bc08","Type":"ContainerStarted","Data":"61230c958dda2077fbe1385bc7e57e4f286bde59ae1dac7d2023649977e9a545"} Dec 06 05:45:56 crc kubenswrapper[4733]: I1206 05:45:56.078528 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-fbnvh" Dec 06 05:45:56 crc kubenswrapper[4733]: I1206 05:45:56.079369 4733 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-fbnvh container/marketplace-operator namespace/openshift-marketplace: 
Readiness probe status=failure output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" start-of-body= Dec 06 05:45:56 crc kubenswrapper[4733]: I1206 05:45:56.079393 4733 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-fbnvh" podUID="76185470-be08-49f9-ab30-59314702bc08" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" Dec 06 05:45:56 crc kubenswrapper[4733]: I1206 05:45:56.080784 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-b8mkn" event={"ID":"b75a142e-dce5-4bf9-83da-25f46752b08f","Type":"ContainerStarted","Data":"ee1c9f9c43f4ed31fb308c5c16eabd577785ae3757d7cb4084167c1bdb0d4427"} Dec 06 05:45:56 crc kubenswrapper[4733]: I1206 05:45:56.080814 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-b8mkn" event={"ID":"b75a142e-dce5-4bf9-83da-25f46752b08f","Type":"ContainerStarted","Data":"1df73f6e164656c56d07917cd99f59a936cdff8efa57ff0bb6ee75c73c61460c"} Dec 06 05:45:56 crc kubenswrapper[4733]: I1206 05:45:56.083141 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dk7bw" event={"ID":"a35279b0-48f4-49e5-af04-c471474695f1","Type":"ContainerStarted","Data":"1fcae3da5adb73f8b83bc396c0d38f3567ad3a30d819f6631ac28275ebb6bafc"} Dec 06 05:45:56 crc kubenswrapper[4733]: I1206 05:45:56.084012 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dk7bw" Dec 06 05:45:56 crc kubenswrapper[4733]: I1206 05:45:56.094821 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mtlfg" podStartSLOduration=121.094802435 podStartE2EDuration="2m1.094802435s" podCreationTimestamp="2025-12-06 05:43:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:45:56.093925798 +0000 UTC m=+139.959136908" watchObservedRunningTime="2025-12-06 05:45:56.094802435 +0000 UTC m=+139.960013537" Dec 06 05:45:56 crc kubenswrapper[4733]: I1206 05:45:56.095962 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5pgn9" podStartSLOduration=121.095954551 podStartE2EDuration="2m1.095954551s" podCreationTimestamp="2025-12-06 05:43:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:45:56.060699286 +0000 UTC m=+139.925910397" watchObservedRunningTime="2025-12-06 05:45:56.095954551 +0000 UTC m=+139.961165662" Dec 06 05:45:56 crc kubenswrapper[4733]: I1206 05:45:56.096238 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-m2w78" event={"ID":"3641143a-a3c8-4ff3-8f5f-783d428411ae","Type":"ContainerStarted","Data":"325fee4c9bf848b4060db151fe4be8eba8d653090c9e156db2f8ef7b5514edd4"} Dec 06 05:45:56 crc kubenswrapper[4733]: I1206 05:45:56.096268 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-m2w78" event={"ID":"3641143a-a3c8-4ff3-8f5f-783d428411ae","Type":"ContainerStarted","Data":"f4bbc2cee644e19472004930facdb7c393116378cbc64321a3bfd115e53e81c0"} Dec 06 05:45:56 crc kubenswrapper[4733]: I1206 05:45:56.100082 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jfdvl" 
event={"ID":"1a84775b-e1f9-4699-af95-16a181527cf2","Type":"ContainerStarted","Data":"7b76dbb9c8c8c1f3ebce327cd3404dc857c7d25541def703d47ecf721a8c4f49"} Dec 06 05:45:56 crc kubenswrapper[4733]: I1206 05:45:56.141471 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dk7bw" podStartSLOduration=122.141457537 podStartE2EDuration="2m2.141457537s" podCreationTimestamp="2025-12-06 05:43:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:45:56.140074998 +0000 UTC m=+140.005286109" watchObservedRunningTime="2025-12-06 05:45:56.141457537 +0000 UTC m=+140.006668647" Dec 06 05:45:56 crc kubenswrapper[4733]: I1206 05:45:56.148518 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-x4xrx" event={"ID":"1fbc70fe-f193-4a2e-9a9f-2981b6c72a56","Type":"ContainerStarted","Data":"2e74fc85239a7215e8bb16d3abc2e67ecb203c5e534faae849423749a735d1bb"} Dec 06 05:45:56 crc kubenswrapper[4733]: I1206 05:45:56.151783 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7rlhm\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:45:56 crc kubenswrapper[4733]: E1206 05:45:56.153116 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 05:45:56.653096273 +0000 UTC m=+140.518307384 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7rlhm" (UID: "6e073151-939a-4209-8cd7-39116b0165f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:56 crc kubenswrapper[4733]: I1206 05:45:56.201805 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-b8mkn" podStartSLOduration=122.201792681 podStartE2EDuration="2m2.201792681s" podCreationTimestamp="2025-12-06 05:43:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:45:56.200554775 +0000 UTC m=+140.065765885" watchObservedRunningTime="2025-12-06 05:45:56.201792681 +0000 UTC m=+140.067003792" Dec 06 05:45:56 crc kubenswrapper[4733]: I1206 05:45:56.202867 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-fbnvh" podStartSLOduration=121.202860768 podStartE2EDuration="2m1.202860768s" podCreationTimestamp="2025-12-06 05:43:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:45:56.161872918 +0000 UTC m=+140.027084030" watchObservedRunningTime="2025-12-06 05:45:56.202860768 +0000 UTC m=+140.068071879" Dec 06 05:45:56 crc kubenswrapper[4733]: I1206 05:45:56.228279 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-tdjqn" event={"ID":"8af9e6ec-1eee-4d33-a4c3-efb55657ecf9","Type":"ContainerStarted","Data":"b6f674d204f7919d1549862c07f829e4b1a33289926380962b657982f67a4931"} Dec 06 05:45:56 
crc kubenswrapper[4733]: I1206 05:45:56.252953 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 05:45:56 crc kubenswrapper[4733]: E1206 05:45:56.254131 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 05:45:56.754090919 +0000 UTC m=+140.619302019 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:56 crc kubenswrapper[4733]: I1206 05:45:56.262585 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-t8xrv" event={"ID":"36ff56af-0a08-47c1-acc9-699aad3cd439","Type":"ContainerStarted","Data":"ca40e7f48255e4837c2886c27299cd94025968880033d105ceb08eac76f47539"} Dec 06 05:45:56 crc kubenswrapper[4733]: I1206 05:45:56.265852 4733 patch_prober.go:28] interesting pod/downloads-7954f5f757-5rtwt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Dec 06 05:45:56 crc kubenswrapper[4733]: I1206 05:45:56.265978 4733 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5rtwt" podUID="6b7ac5ac-4296-4eb6-8eeb-f5978c268f2d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Dec 06 05:45:56 crc kubenswrapper[4733]: I1206 05:45:56.285855 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jfdvl" podStartSLOduration=121.285838991 podStartE2EDuration="2m1.285838991s" podCreationTimestamp="2025-12-06 05:43:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:45:56.284108318 +0000 UTC m=+140.149319429" watchObservedRunningTime="2025-12-06 05:45:56.285838991 +0000 UTC m=+140.151050102" Dec 06 05:45:56 crc kubenswrapper[4733]: I1206 05:45:56.311703 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-x4xrx" podStartSLOduration=121.311687765 podStartE2EDuration="2m1.311687765s" podCreationTimestamp="2025-12-06 05:43:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:45:56.311445009 +0000 UTC m=+140.176656121" watchObservedRunningTime="2025-12-06 05:45:56.311687765 +0000 UTC m=+140.176898876" Dec 06 05:45:56 crc kubenswrapper[4733]: I1206 05:45:56.356288 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7rlhm\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:45:56 crc kubenswrapper[4733]: E1206 05:45:56.358439 4733 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 05:45:56.858423789 +0000 UTC m=+140.723634900 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7rlhm" (UID: "6e073151-939a-4209-8cd7-39116b0165f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:56 crc kubenswrapper[4733]: I1206 05:45:56.361787 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-m2w78" podStartSLOduration=122.361774327 podStartE2EDuration="2m2.361774327s" podCreationTimestamp="2025-12-06 05:43:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:45:56.360420441 +0000 UTC m=+140.225631553" watchObservedRunningTime="2025-12-06 05:45:56.361774327 +0000 UTC m=+140.226985438" Dec 06 05:45:56 crc kubenswrapper[4733]: I1206 05:45:56.391762 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-tdjqn" podStartSLOduration=121.391749548 podStartE2EDuration="2m1.391749548s" podCreationTimestamp="2025-12-06 05:43:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:45:56.39019201 +0000 UTC m=+140.255403120" watchObservedRunningTime="2025-12-06 05:45:56.391749548 +0000 UTC m=+140.256960659" Dec 06 05:45:56 crc kubenswrapper[4733]: I1206 05:45:56.457884 4733 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 05:45:56 crc kubenswrapper[4733]: E1206 05:45:56.458622 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 05:45:56.958597201 +0000 UTC m=+140.823808313 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:56 crc kubenswrapper[4733]: I1206 05:45:56.560498 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7rlhm\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:45:56 crc kubenswrapper[4733]: E1206 05:45:56.560876 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 05:45:57.060861274 +0000 UTC m=+140.926072385 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7rlhm" (UID: "6e073151-939a-4209-8cd7-39116b0165f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:56 crc kubenswrapper[4733]: I1206 05:45:56.662809 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 05:45:56 crc kubenswrapper[4733]: E1206 05:45:56.663397 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 05:45:57.163378563 +0000 UTC m=+141.028589673 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:56 crc kubenswrapper[4733]: I1206 05:45:56.766257 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7rlhm\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:45:56 crc kubenswrapper[4733]: E1206 05:45:56.769352 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 05:45:57.269334783 +0000 UTC m=+141.134545885 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7rlhm" (UID: "6e073151-939a-4209-8cd7-39116b0165f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:56 crc kubenswrapper[4733]: I1206 05:45:56.870898 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 05:45:56 crc kubenswrapper[4733]: E1206 05:45:56.871586 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 05:45:57.371570022 +0000 UTC m=+141.236781133 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:56 crc kubenswrapper[4733]: I1206 05:45:56.972864 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7rlhm\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:45:56 crc kubenswrapper[4733]: E1206 05:45:56.973187 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 05:45:57.473170717 +0000 UTC m=+141.338381827 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7rlhm" (UID: "6e073151-939a-4209-8cd7-39116b0165f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:57 crc kubenswrapper[4733]: I1206 05:45:57.059576 4733 patch_prober.go:28] interesting pod/router-default-5444994796-t668l container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 06 05:45:57 crc kubenswrapper[4733]: [-]has-synced failed: reason withheld Dec 06 05:45:57 crc kubenswrapper[4733]: [+]process-running ok Dec 06 05:45:57 crc kubenswrapper[4733]: healthz check failed Dec 06 05:45:57 crc kubenswrapper[4733]: I1206 05:45:57.059653 4733 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t668l" podUID="ca0da215-5c31-4c91-939c-77e95ab4a568" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 05:45:57 crc kubenswrapper[4733]: I1206 05:45:57.073661 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 05:45:57 crc kubenswrapper[4733]: E1206 05:45:57.073818 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-06 05:45:57.573784347 +0000 UTC m=+141.438995457 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:57 crc kubenswrapper[4733]: I1206 05:45:57.073937 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7rlhm\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:45:57 crc kubenswrapper[4733]: E1206 05:45:57.074295 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 05:45:57.574288444 +0000 UTC m=+141.439499555 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7rlhm" (UID: "6e073151-939a-4209-8cd7-39116b0165f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:57 crc kubenswrapper[4733]: I1206 05:45:57.174955 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 05:45:57 crc kubenswrapper[4733]: E1206 05:45:57.175124 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 05:45:57.675104826 +0000 UTC m=+141.540315936 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:57 crc kubenswrapper[4733]: I1206 05:45:57.175231 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7rlhm\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:45:57 crc kubenswrapper[4733]: E1206 05:45:57.175549 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 05:45:57.675531147 +0000 UTC m=+141.540742258 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7rlhm" (UID: "6e073151-939a-4209-8cd7-39116b0165f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:57 crc kubenswrapper[4733]: I1206 05:45:57.275717 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 05:45:57 crc kubenswrapper[4733]: E1206 05:45:57.276101 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 05:45:57.776072561 +0000 UTC m=+141.641283671 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:57 crc kubenswrapper[4733]: I1206 05:45:57.276369 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7rlhm\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:45:57 crc kubenswrapper[4733]: E1206 05:45:57.276667 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 05:45:57.776655276 +0000 UTC m=+141.641866388 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7rlhm" (UID: "6e073151-939a-4209-8cd7-39116b0165f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:57 crc kubenswrapper[4733]: I1206 05:45:57.310767 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-52jkz" event={"ID":"e22cc297-d62d-4a07-8131-062668e5b69a","Type":"ContainerStarted","Data":"f5b4c65aac454235e58f989c87a1c738f0b4b1e2ba308931d89db83e0e36b076"} Dec 06 05:45:57 crc kubenswrapper[4733]: I1206 05:45:57.311062 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-52jkz" Dec 06 05:45:57 crc kubenswrapper[4733]: I1206 05:45:57.316227 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-6l9dt" event={"ID":"d330f5cc-abab-4367-902f-97e41685007f","Type":"ContainerStarted","Data":"c631b425db261b2021751e6a18de076ecb88459776fc4d146438ef7165c9bb73"} Dec 06 05:45:57 crc kubenswrapper[4733]: I1206 05:45:57.326140 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dlzz7" event={"ID":"8b7a185f-09db-4aa3-9ece-15e0b7a21098","Type":"ContainerStarted","Data":"ef1e65590c068b4f5ac042daa84f6554b12b16bb27bd01996660b09388619f28"} Dec 06 05:45:57 crc kubenswrapper[4733]: I1206 05:45:57.327026 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dlzz7" Dec 06 05:45:57 crc kubenswrapper[4733]: I1206 05:45:57.338223 4733 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-52jkz" podStartSLOduration=122.338197589 podStartE2EDuration="2m2.338197589s" podCreationTimestamp="2025-12-06 05:43:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:45:57.335342863 +0000 UTC m=+141.200553974" watchObservedRunningTime="2025-12-06 05:45:57.338197589 +0000 UTC m=+141.203408700" Dec 06 05:45:57 crc kubenswrapper[4733]: I1206 05:45:57.340554 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-mjw9v" event={"ID":"3d6b28d7-efae-441a-a3bd-2fb8a1f1561c","Type":"ContainerStarted","Data":"6fdaa3380281430da5fdcc1aa52c6f3761aa3cee97661b35a8f5070594251b60"} Dec 06 05:45:57 crc kubenswrapper[4733]: I1206 05:45:57.359591 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kr5j" event={"ID":"4e5e6ad5-8339-4e8c-b371-5d07e7aadb38","Type":"ContainerStarted","Data":"bc6321285c8c64e0d42abe8f98aa7dcc60014374ccb698925dd2c4d1c7669e4d"} Dec 06 05:45:57 crc kubenswrapper[4733]: I1206 05:45:57.360112 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kr5j" Dec 06 05:45:57 crc kubenswrapper[4733]: I1206 05:45:57.377024 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 05:45:57 crc kubenswrapper[4733]: E1206 05:45:57.378012 4733 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 05:45:57.878000401 +0000 UTC m=+141.743211513 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:57 crc kubenswrapper[4733]: I1206 05:45:57.378786 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-tdjqn" event={"ID":"8af9e6ec-1eee-4d33-a4c3-efb55657ecf9","Type":"ContainerStarted","Data":"85548128642e214a3c7e91fc334288fe239d883b45b1af95e159b07617e25bd8"} Dec 06 05:45:57 crc kubenswrapper[4733]: I1206 05:45:57.401109 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-6l9dt" podStartSLOduration=122.401091453 podStartE2EDuration="2m2.401091453s" podCreationTimestamp="2025-12-06 05:43:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:45:57.383652715 +0000 UTC m=+141.248863826" watchObservedRunningTime="2025-12-06 05:45:57.401091453 +0000 UTC m=+141.266302564" Dec 06 05:45:57 crc kubenswrapper[4733]: I1206 05:45:57.401272 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-hxt8v" event={"ID":"4416c700-40b6-4e24-b003-6e503a8c8533","Type":"ContainerStarted","Data":"30be7ac9c2307cb82aac5c4c871f175b99f7650d0a7b6cb830d3de269707692d"} Dec 06 05:45:57 crc 
kubenswrapper[4733]: I1206 05:45:57.401890 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dlzz7" podStartSLOduration=122.401886346 podStartE2EDuration="2m2.401886346s" podCreationTimestamp="2025-12-06 05:43:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:45:57.400945669 +0000 UTC m=+141.266156779" watchObservedRunningTime="2025-12-06 05:45:57.401886346 +0000 UTC m=+141.267097458" Dec 06 05:45:57 crc kubenswrapper[4733]: I1206 05:45:57.430908 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kr5j" podStartSLOduration=122.430897245 podStartE2EDuration="2m2.430897245s" podCreationTimestamp="2025-12-06 05:43:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:45:57.428564079 +0000 UTC m=+141.293775190" watchObservedRunningTime="2025-12-06 05:45:57.430897245 +0000 UTC m=+141.296108355" Dec 06 05:45:57 crc kubenswrapper[4733]: I1206 05:45:57.432484 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-b8mkn" event={"ID":"b75a142e-dce5-4bf9-83da-25f46752b08f","Type":"ContainerStarted","Data":"b7d5e52812be1c114527e5afb0cf2eadf74bb34898bb2a90f72b3168f9abf2cf"} Dec 06 05:45:57 crc kubenswrapper[4733]: I1206 05:45:57.454127 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xqz8g" event={"ID":"5ae9085f-b1b1-4550-9edc-80ea6bd0ef9d","Type":"ContainerStarted","Data":"c61d2d02436288c76311a57fac47ec1719de3e57bd755c8ac5cb52603f9a37fa"} Dec 06 05:45:57 crc kubenswrapper[4733]: I1206 05:45:57.463432 4733 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-t8xrv" event={"ID":"36ff56af-0a08-47c1-acc9-699aad3cd439","Type":"ContainerStarted","Data":"403dec41ac4f8f21e1a02670d83b91db81b4580910ade02657ec7f5e0788d264"} Dec 06 05:45:57 crc kubenswrapper[4733]: I1206 05:45:57.463469 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-t8xrv" event={"ID":"36ff56af-0a08-47c1-acc9-699aad3cd439","Type":"ContainerStarted","Data":"e85e05fe0b7eebf91f73ca7af5bdf4e80d4bffbc69761a93fccd8af3c6836036"} Dec 06 05:45:57 crc kubenswrapper[4733]: I1206 05:45:57.470908 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-bbq6p" event={"ID":"386f85c9-e984-405c-af7a-225fb5bcfcaf","Type":"ContainerStarted","Data":"74659b598ffe4b19ba6cc0c218e58c57b7d26fc0076336775886e09e9954c1a4"} Dec 06 05:45:57 crc kubenswrapper[4733]: I1206 05:45:57.470936 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-bbq6p" event={"ID":"386f85c9-e984-405c-af7a-225fb5bcfcaf","Type":"ContainerStarted","Data":"bf335b959486b766c453713820d62fb0c78389438b7aa8b1972d063cbf925e06"} Dec 06 05:45:57 crc kubenswrapper[4733]: I1206 05:45:57.472194 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-ntght" event={"ID":"34ef48cd-203c-41e4-99ce-64d24203d4c0","Type":"ContainerStarted","Data":"34e4e8788f824bc15dba0310e43181cd4f36d47172bbc59b19762739dc4896fd"} Dec 06 05:45:57 crc kubenswrapper[4733]: I1206 05:45:57.473805 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rlgkt" event={"ID":"b914e7ba-8f78-466b-81c1-7e4bca4c4f56","Type":"ContainerStarted","Data":"2d6edfaf75781b74108ee8520c6bc282f48621684509f7bf95c653d68a113186"} Dec 06 05:45:57 crc kubenswrapper[4733]: I1206 05:45:57.473831 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns/dns-default-rlgkt" event={"ID":"b914e7ba-8f78-466b-81c1-7e4bca4c4f56","Type":"ContainerStarted","Data":"279147d8670e05b9e1c4fd8efb110b5149f95cb03547ca2edc600870a7c7c6de"} Dec 06 05:45:57 crc kubenswrapper[4733]: I1206 05:45:57.474177 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-rlgkt" Dec 06 05:45:57 crc kubenswrapper[4733]: I1206 05:45:57.477683 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mgwdw" event={"ID":"ba35ba23-fd5c-47ca-bf00-dd4dcede4997","Type":"ContainerStarted","Data":"a2699b54c31683e55b748ab867d33fc66e9c7538999f9553bb28438527fa616f"} Dec 06 05:45:57 crc kubenswrapper[4733]: I1206 05:45:57.481123 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7rlhm\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:45:57 crc kubenswrapper[4733]: E1206 05:45:57.481399 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 05:45:57.981388116 +0000 UTC m=+141.846599227 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7rlhm" (UID: "6e073151-939a-4209-8cd7-39116b0165f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:57 crc kubenswrapper[4733]: I1206 05:45:57.482753 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-l8cj4" event={"ID":"3cf2106f-7c73-4086-bc49-fa1b11f2e56f","Type":"ContainerStarted","Data":"0db7972d7003eb14ebc2f889b7a66ee9c34c11513b62b24a6667c6621d1f43b4"} Dec 06 05:45:57 crc kubenswrapper[4733]: I1206 05:45:57.482791 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-l8cj4" Dec 06 05:45:57 crc kubenswrapper[4733]: I1206 05:45:57.486703 4733 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-fbnvh container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" start-of-body= Dec 06 05:45:57 crc kubenswrapper[4733]: I1206 05:45:57.486750 4733 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-fbnvh" podUID="76185470-be08-49f9-ab30-59314702bc08" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" Dec 06 05:45:57 crc kubenswrapper[4733]: I1206 05:45:57.496187 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dlzz7" Dec 06 05:45:57 crc kubenswrapper[4733]: I1206 
05:45:57.498544 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-mjw9v" podStartSLOduration=7.498533201 podStartE2EDuration="7.498533201s" podCreationTimestamp="2025-12-06 05:45:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:45:57.448909109 +0000 UTC m=+141.314120220" watchObservedRunningTime="2025-12-06 05:45:57.498533201 +0000 UTC m=+141.363744312" Dec 06 05:45:57 crc kubenswrapper[4733]: I1206 05:45:57.498736 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-hxt8v" podStartSLOduration=123.498730731 podStartE2EDuration="2m3.498730731s" podCreationTimestamp="2025-12-06 05:43:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:45:57.483462735 +0000 UTC m=+141.348673846" watchObservedRunningTime="2025-12-06 05:45:57.498730731 +0000 UTC m=+141.363941843" Dec 06 05:45:57 crc kubenswrapper[4733]: I1206 05:45:57.499529 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dk7bw" Dec 06 05:45:57 crc kubenswrapper[4733]: I1206 05:45:57.511504 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mtlfg" Dec 06 05:45:57 crc kubenswrapper[4733]: I1206 05:45:57.536618 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-hxt8v" Dec 06 05:45:57 crc kubenswrapper[4733]: I1206 05:45:57.540290 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-hxt8v" Dec 06 05:45:57 crc kubenswrapper[4733]: I1206 05:45:57.552616 4733 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-rlgkt" podStartSLOduration=7.552293086 podStartE2EDuration="7.552293086s" podCreationTimestamp="2025-12-06 05:45:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:45:57.525626785 +0000 UTC m=+141.390837897" watchObservedRunningTime="2025-12-06 05:45:57.552293086 +0000 UTC m=+141.417504197" Dec 06 05:45:57 crc kubenswrapper[4733]: I1206 05:45:57.553123 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xqz8g" podStartSLOduration=122.553118467 podStartE2EDuration="2m2.553118467s" podCreationTimestamp="2025-12-06 05:43:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:45:57.546603122 +0000 UTC m=+141.411814232" watchObservedRunningTime="2025-12-06 05:45:57.553118467 +0000 UTC m=+141.418329578" Dec 06 05:45:57 crc kubenswrapper[4733]: I1206 05:45:57.587828 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 05:45:57 crc kubenswrapper[4733]: E1206 05:45:57.589369 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 05:45:58.089353013 +0000 UTC m=+141.954564124 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:57 crc kubenswrapper[4733]: I1206 05:45:57.658280 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-ntght" podStartSLOduration=123.658264617 podStartE2EDuration="2m3.658264617s" podCreationTimestamp="2025-12-06 05:43:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:45:57.589902866 +0000 UTC m=+141.455113977" watchObservedRunningTime="2025-12-06 05:45:57.658264617 +0000 UTC m=+141.523475748" Dec 06 05:45:57 crc kubenswrapper[4733]: I1206 05:45:57.672238 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-l8cj4" podStartSLOduration=123.672214777 podStartE2EDuration="2m3.672214777s" podCreationTimestamp="2025-12-06 05:43:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:45:57.641721332 +0000 UTC m=+141.506932443" watchObservedRunningTime="2025-12-06 05:45:57.672214777 +0000 UTC m=+141.537425887" Dec 06 05:45:57 crc kubenswrapper[4733]: I1206 05:45:57.703176 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7rlhm\" (UID: 
\"6e073151-939a-4209-8cd7-39116b0165f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:45:57 crc kubenswrapper[4733]: E1206 05:45:57.703615 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 05:45:58.203602031 +0000 UTC m=+142.068813142 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7rlhm" (UID: "6e073151-939a-4209-8cd7-39116b0165f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:57 crc kubenswrapper[4733]: I1206 05:45:57.713599 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mgwdw" podStartSLOduration=123.713575357 podStartE2EDuration="2m3.713575357s" podCreationTimestamp="2025-12-06 05:43:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:45:57.704349005 +0000 UTC m=+141.569560116" watchObservedRunningTime="2025-12-06 05:45:57.713575357 +0000 UTC m=+141.578786457" Dec 06 05:45:57 crc kubenswrapper[4733]: I1206 05:45:57.716816 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kr5j" Dec 06 05:45:57 crc kubenswrapper[4733]: I1206 05:45:57.766383 4733 patch_prober.go:28] interesting pod/apiserver-76f77b778f-hxt8v container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[+]ping ok Dec 06 05:45:57 crc kubenswrapper[4733]: [+]log ok Dec 06 05:45:57 crc kubenswrapper[4733]: [+]etcd ok Dec 06 05:45:57 crc kubenswrapper[4733]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 06 05:45:57 crc kubenswrapper[4733]: [+]poststarthook/generic-apiserver-start-informers ok Dec 06 05:45:57 crc kubenswrapper[4733]: [+]poststarthook/max-in-flight-filter ok Dec 06 05:45:57 crc kubenswrapper[4733]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 06 05:45:57 crc kubenswrapper[4733]: [+]poststarthook/image.openshift.io-apiserver-caches ok Dec 06 05:45:57 crc kubenswrapper[4733]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Dec 06 05:45:57 crc kubenswrapper[4733]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Dec 06 05:45:57 crc kubenswrapper[4733]: [+]poststarthook/project.openshift.io-projectcache ok Dec 06 05:45:57 crc kubenswrapper[4733]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Dec 06 05:45:57 crc kubenswrapper[4733]: [+]poststarthook/openshift.io-startinformers ok Dec 06 05:45:57 crc kubenswrapper[4733]: [+]poststarthook/openshift.io-restmapperupdater ok Dec 06 05:45:57 crc kubenswrapper[4733]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Dec 06 05:45:57 crc kubenswrapper[4733]: livez check failed Dec 06 05:45:57 crc kubenswrapper[4733]: I1206 05:45:57.766455 4733 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-hxt8v" podUID="4416c700-40b6-4e24-b003-6e503a8c8533" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 05:45:57 crc kubenswrapper[4733]: I1206 05:45:57.806165 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 05:45:57 crc kubenswrapper[4733]: E1206 05:45:57.806707 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 05:45:58.30668882 +0000 UTC m=+142.171899931 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:57 crc kubenswrapper[4733]: I1206 05:45:57.823996 4733 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Dec 06 05:45:57 crc kubenswrapper[4733]: I1206 05:45:57.829553 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-t8xrv" podStartSLOduration=122.829521305 podStartE2EDuration="2m2.829521305s" podCreationTimestamp="2025-12-06 05:43:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:45:57.734543117 +0000 UTC m=+141.599754228" watchObservedRunningTime="2025-12-06 05:45:57.829521305 +0000 UTC m=+141.694732407" Dec 06 05:45:57 crc kubenswrapper[4733]: I1206 05:45:57.866251 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5pgn9" Dec 06 05:45:57 crc kubenswrapper[4733]: I1206 05:45:57.866298 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5pgn9" Dec 06 05:45:57 crc kubenswrapper[4733]: I1206 05:45:57.894471 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5pgn9" Dec 06 05:45:57 crc kubenswrapper[4733]: I1206 05:45:57.908653 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7rlhm\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:45:57 crc kubenswrapper[4733]: E1206 05:45:57.909026 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 05:45:58.409012063 +0000 UTC m=+142.274223164 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7rlhm" (UID: "6e073151-939a-4209-8cd7-39116b0165f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:57 crc kubenswrapper[4733]: I1206 05:45:57.920086 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-l8cj4" Dec 06 05:45:58 crc kubenswrapper[4733]: I1206 05:45:58.009506 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 05:45:58 crc kubenswrapper[4733]: E1206 05:45:58.009682 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 05:45:58.50965607 +0000 UTC m=+142.374867182 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:58 crc kubenswrapper[4733]: I1206 05:45:58.010114 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7rlhm\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:45:58 crc kubenswrapper[4733]: E1206 05:45:58.010473 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 05:45:58.510458638 +0000 UTC m=+142.375669750 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7rlhm" (UID: "6e073151-939a-4209-8cd7-39116b0165f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:58 crc kubenswrapper[4733]: I1206 05:45:58.064935 4733 patch_prober.go:28] interesting pod/router-default-5444994796-t668l container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 06 05:45:58 crc kubenswrapper[4733]: [-]has-synced failed: reason withheld Dec 06 05:45:58 crc kubenswrapper[4733]: [+]process-running ok Dec 06 05:45:58 crc kubenswrapper[4733]: healthz check failed Dec 06 05:45:58 crc kubenswrapper[4733]: I1206 05:45:58.064991 4733 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t668l" podUID="ca0da215-5c31-4c91-939c-77e95ab4a568" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 05:45:58 crc kubenswrapper[4733]: I1206 05:45:58.111661 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 05:45:58 crc kubenswrapper[4733]: E1206 05:45:58.111848 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-06 05:45:58.611824012 +0000 UTC m=+142.477035122 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:58 crc kubenswrapper[4733]: I1206 05:45:58.111945 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7rlhm\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:45:58 crc kubenswrapper[4733]: E1206 05:45:58.112288 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 05:45:58.612279448 +0000 UTC m=+142.477490559 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7rlhm" (UID: "6e073151-939a-4209-8cd7-39116b0165f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:58 crc kubenswrapper[4733]: I1206 05:45:58.213551 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 05:45:58 crc kubenswrapper[4733]: E1206 05:45:58.213720 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 05:45:58.713694204 +0000 UTC m=+142.578905314 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:58 crc kubenswrapper[4733]: I1206 05:45:58.214007 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7rlhm\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:45:58 crc kubenswrapper[4733]: E1206 05:45:58.214323 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 05:45:58.714296274 +0000 UTC m=+142.579507386 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7rlhm" (UID: "6e073151-939a-4209-8cd7-39116b0165f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:58 crc kubenswrapper[4733]: I1206 05:45:58.315436 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 05:45:58 crc kubenswrapper[4733]: E1206 05:45:58.315601 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 05:45:58.815582389 +0000 UTC m=+142.680793500 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:58 crc kubenswrapper[4733]: I1206 05:45:58.315688 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7rlhm\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:45:58 crc kubenswrapper[4733]: E1206 05:45:58.315980 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 05:45:58.815971591 +0000 UTC m=+142.681182702 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7rlhm" (UID: "6e073151-939a-4209-8cd7-39116b0165f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:58 crc kubenswrapper[4733]: I1206 05:45:58.416693 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 05:45:58 crc kubenswrapper[4733]: E1206 05:45:58.416884 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 05:45:58.916859997 +0000 UTC m=+142.782071109 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:58 crc kubenswrapper[4733]: I1206 05:45:58.417118 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7rlhm\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:45:58 crc kubenswrapper[4733]: E1206 05:45:58.417517 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 05:45:58.917507263 +0000 UTC m=+142.782718374 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7rlhm" (UID: "6e073151-939a-4209-8cd7-39116b0165f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:58 crc kubenswrapper[4733]: I1206 05:45:58.493489 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-bbq6p" event={"ID":"386f85c9-e984-405c-af7a-225fb5bcfcaf","Type":"ContainerStarted","Data":"ba439fae72968486ac485eac1c9279d0584758f54523f8283a0e92a9d4b19780"} Dec 06 05:45:58 crc kubenswrapper[4733]: I1206 05:45:58.493547 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-bbq6p" event={"ID":"386f85c9-e984-405c-af7a-225fb5bcfcaf","Type":"ContainerStarted","Data":"3327c676c914f4fc84acb7fe81d0176f23f5a824b15f7cc7a2debafc2527c2e9"} Dec 06 05:45:58 crc kubenswrapper[4733]: I1206 05:45:58.504708 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5pgn9" Dec 06 05:45:58 crc kubenswrapper[4733]: I1206 05:45:58.505143 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-fbnvh" Dec 06 05:45:58 crc kubenswrapper[4733]: I1206 05:45:58.518327 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 05:45:58 crc kubenswrapper[4733]: E1206 05:45:58.523975 4733 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 05:45:59.0239443 +0000 UTC m=+142.889155410 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:58 crc kubenswrapper[4733]: I1206 05:45:58.537877 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-bbq6p" podStartSLOduration=8.537858672 podStartE2EDuration="8.537858672s" podCreationTimestamp="2025-12-06 05:45:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:45:58.537265517 +0000 UTC m=+142.402476628" watchObservedRunningTime="2025-12-06 05:45:58.537858672 +0000 UTC m=+142.403069783" Dec 06 05:45:58 crc kubenswrapper[4733]: I1206 05:45:58.552209 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 06 05:45:58 crc kubenswrapper[4733]: I1206 05:45:58.553102 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 06 05:45:58 crc kubenswrapper[4733]: I1206 05:45:58.555386 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Dec 06 05:45:58 crc kubenswrapper[4733]: I1206 05:45:58.557921 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 06 05:45:58 crc kubenswrapper[4733]: I1206 05:45:58.586068 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 06 05:45:58 crc kubenswrapper[4733]: I1206 05:45:58.616429 4733 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-06T05:45:57.824022329Z","Handler":null,"Name":""} Dec 06 05:45:58 crc kubenswrapper[4733]: I1206 05:45:58.620956 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7rlhm\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:45:58 crc kubenswrapper[4733]: I1206 05:45:58.621030 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f72548c2-5259-4969-890c-89608e1a627d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"f72548c2-5259-4969-890c-89608e1a627d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 06 05:45:58 crc kubenswrapper[4733]: I1206 05:45:58.621067 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f72548c2-5259-4969-890c-89608e1a627d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"f72548c2-5259-4969-890c-89608e1a627d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 06 05:45:58 crc kubenswrapper[4733]: E1206 05:45:58.621380 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 05:45:59.121368284 +0000 UTC m=+142.986579395 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7rlhm" (UID: "6e073151-939a-4209-8cd7-39116b0165f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 05:45:58 crc kubenswrapper[4733]: I1206 05:45:58.674672 4733 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Dec 06 05:45:58 crc kubenswrapper[4733]: I1206 05:45:58.674720 4733 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Dec 06 05:45:58 crc kubenswrapper[4733]: I1206 05:45:58.721868 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 05:45:58 crc kubenswrapper[4733]: I1206 
05:45:58.722659 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f72548c2-5259-4969-890c-89608e1a627d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"f72548c2-5259-4969-890c-89608e1a627d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 06 05:45:58 crc kubenswrapper[4733]: I1206 05:45:58.722703 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f72548c2-5259-4969-890c-89608e1a627d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"f72548c2-5259-4969-890c-89608e1a627d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 06 05:45:58 crc kubenswrapper[4733]: I1206 05:45:58.722825 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f72548c2-5259-4969-890c-89608e1a627d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"f72548c2-5259-4969-890c-89608e1a627d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 06 05:45:58 crc kubenswrapper[4733]: I1206 05:45:58.744672 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jfh8w"] Dec 06 05:45:58 crc kubenswrapper[4733]: I1206 05:45:58.745759 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jfh8w" Dec 06 05:45:58 crc kubenswrapper[4733]: I1206 05:45:58.747086 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 06 05:45:58 crc kubenswrapper[4733]: I1206 05:45:58.752835 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f72548c2-5259-4969-890c-89608e1a627d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"f72548c2-5259-4969-890c-89608e1a627d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 06 05:45:58 crc kubenswrapper[4733]: I1206 05:45:58.764709 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jfh8w"] Dec 06 05:45:58 crc kubenswrapper[4733]: I1206 05:45:58.794472 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 06 05:45:58 crc kubenswrapper[4733]: I1206 05:45:58.823353 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71e51a04-4769-45b0-87b8-7292977ec73b-catalog-content\") pod \"community-operators-jfh8w\" (UID: \"71e51a04-4769-45b0-87b8-7292977ec73b\") " pod="openshift-marketplace/community-operators-jfh8w" Dec 06 05:45:58 crc kubenswrapper[4733]: I1206 05:45:58.823504 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71e51a04-4769-45b0-87b8-7292977ec73b-utilities\") pod \"community-operators-jfh8w\" (UID: \"71e51a04-4769-45b0-87b8-7292977ec73b\") " pod="openshift-marketplace/community-operators-jfh8w" Dec 06 05:45:58 crc kubenswrapper[4733]: I1206 05:45:58.823626 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2r5k\" (UniqueName: \"kubernetes.io/projected/71e51a04-4769-45b0-87b8-7292977ec73b-kube-api-access-f2r5k\") pod \"community-operators-jfh8w\" (UID: \"71e51a04-4769-45b0-87b8-7292977ec73b\") " pod="openshift-marketplace/community-operators-jfh8w" Dec 06 05:45:58 crc kubenswrapper[4733]: I1206 05:45:58.823833 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7rlhm\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:45:58 crc kubenswrapper[4733]: I1206 05:45:58.834277 4733 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 06 05:45:58 crc kubenswrapper[4733]: I1206 05:45:58.834335 4733 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7rlhm\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:45:58 crc kubenswrapper[4733]: I1206 05:45:58.856726 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7rlhm\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:45:58 crc kubenswrapper[4733]: I1206 05:45:58.882218 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 06 05:45:58 crc kubenswrapper[4733]: I1206 05:45:58.925175 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71e51a04-4769-45b0-87b8-7292977ec73b-catalog-content\") pod \"community-operators-jfh8w\" (UID: \"71e51a04-4769-45b0-87b8-7292977ec73b\") " pod="openshift-marketplace/community-operators-jfh8w" Dec 06 05:45:58 crc kubenswrapper[4733]: I1206 05:45:58.925255 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71e51a04-4769-45b0-87b8-7292977ec73b-utilities\") pod \"community-operators-jfh8w\" (UID: \"71e51a04-4769-45b0-87b8-7292977ec73b\") " pod="openshift-marketplace/community-operators-jfh8w" Dec 06 05:45:58 crc kubenswrapper[4733]: I1206 05:45:58.925293 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2r5k\" (UniqueName: \"kubernetes.io/projected/71e51a04-4769-45b0-87b8-7292977ec73b-kube-api-access-f2r5k\") pod \"community-operators-jfh8w\" (UID: \"71e51a04-4769-45b0-87b8-7292977ec73b\") " pod="openshift-marketplace/community-operators-jfh8w" Dec 06 05:45:58 crc kubenswrapper[4733]: I1206 05:45:58.925660 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71e51a04-4769-45b0-87b8-7292977ec73b-catalog-content\") pod \"community-operators-jfh8w\" (UID: \"71e51a04-4769-45b0-87b8-7292977ec73b\") " pod="openshift-marketplace/community-operators-jfh8w" Dec 06 05:45:58 crc kubenswrapper[4733]: I1206 05:45:58.925740 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71e51a04-4769-45b0-87b8-7292977ec73b-utilities\") pod \"community-operators-jfh8w\" (UID: \"71e51a04-4769-45b0-87b8-7292977ec73b\") " 
pod="openshift-marketplace/community-operators-jfh8w" Dec 06 05:45:58 crc kubenswrapper[4733]: I1206 05:45:58.934580 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lblzc"] Dec 06 05:45:58 crc kubenswrapper[4733]: I1206 05:45:58.935422 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lblzc" Dec 06 05:45:58 crc kubenswrapper[4733]: I1206 05:45:58.939188 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 06 05:45:58 crc kubenswrapper[4733]: I1206 05:45:58.940079 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2r5k\" (UniqueName: \"kubernetes.io/projected/71e51a04-4769-45b0-87b8-7292977ec73b-kube-api-access-f2r5k\") pod \"community-operators-jfh8w\" (UID: \"71e51a04-4769-45b0-87b8-7292977ec73b\") " pod="openshift-marketplace/community-operators-jfh8w" Dec 06 05:45:58 crc kubenswrapper[4733]: I1206 05:45:58.955700 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lblzc"] Dec 06 05:45:59 crc kubenswrapper[4733]: I1206 05:45:59.026135 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4bfb477-5389-4827-91d8-cfd61ad2d5f8-utilities\") pod \"certified-operators-lblzc\" (UID: \"b4bfb477-5389-4827-91d8-cfd61ad2d5f8\") " pod="openshift-marketplace/certified-operators-lblzc" Dec 06 05:45:59 crc kubenswrapper[4733]: I1206 05:45:59.026258 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4bfb477-5389-4827-91d8-cfd61ad2d5f8-catalog-content\") pod \"certified-operators-lblzc\" (UID: \"b4bfb477-5389-4827-91d8-cfd61ad2d5f8\") " 
pod="openshift-marketplace/certified-operators-lblzc" Dec 06 05:45:59 crc kubenswrapper[4733]: I1206 05:45:59.026317 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9279\" (UniqueName: \"kubernetes.io/projected/b4bfb477-5389-4827-91d8-cfd61ad2d5f8-kube-api-access-f9279\") pod \"certified-operators-lblzc\" (UID: \"b4bfb477-5389-4827-91d8-cfd61ad2d5f8\") " pod="openshift-marketplace/certified-operators-lblzc" Dec 06 05:45:59 crc kubenswrapper[4733]: I1206 05:45:59.043950 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:45:59 crc kubenswrapper[4733]: I1206 05:45:59.064044 4733 patch_prober.go:28] interesting pod/router-default-5444994796-t668l container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 06 05:45:59 crc kubenswrapper[4733]: [-]has-synced failed: reason withheld Dec 06 05:45:59 crc kubenswrapper[4733]: [+]process-running ok Dec 06 05:45:59 crc kubenswrapper[4733]: healthz check failed Dec 06 05:45:59 crc kubenswrapper[4733]: I1206 05:45:59.064110 4733 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t668l" podUID="ca0da215-5c31-4c91-939c-77e95ab4a568" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 05:45:59 crc kubenswrapper[4733]: I1206 05:45:59.064248 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 06 05:45:59 crc kubenswrapper[4733]: I1206 05:45:59.066295 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jfh8w" Dec 06 05:45:59 crc kubenswrapper[4733]: W1206 05:45:59.072569 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podf72548c2_5259_4969_890c_89608e1a627d.slice/crio-6d4605a02bbb9444c76c28f412cac1bf7139866238caa09d7c219c7081059a94 WatchSource:0}: Error finding container 6d4605a02bbb9444c76c28f412cac1bf7139866238caa09d7c219c7081059a94: Status 404 returned error can't find the container with id 6d4605a02bbb9444c76c28f412cac1bf7139866238caa09d7c219c7081059a94 Dec 06 05:45:59 crc kubenswrapper[4733]: I1206 05:45:59.128721 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4bfb477-5389-4827-91d8-cfd61ad2d5f8-utilities\") pod \"certified-operators-lblzc\" (UID: \"b4bfb477-5389-4827-91d8-cfd61ad2d5f8\") " pod="openshift-marketplace/certified-operators-lblzc" Dec 06 05:45:59 crc kubenswrapper[4733]: I1206 05:45:59.133348 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4bfb477-5389-4827-91d8-cfd61ad2d5f8-catalog-content\") pod \"certified-operators-lblzc\" (UID: \"b4bfb477-5389-4827-91d8-cfd61ad2d5f8\") " pod="openshift-marketplace/certified-operators-lblzc" Dec 06 05:45:59 crc kubenswrapper[4733]: I1206 05:45:59.133440 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9279\" (UniqueName: \"kubernetes.io/projected/b4bfb477-5389-4827-91d8-cfd61ad2d5f8-kube-api-access-f9279\") pod \"certified-operators-lblzc\" (UID: \"b4bfb477-5389-4827-91d8-cfd61ad2d5f8\") " pod="openshift-marketplace/certified-operators-lblzc" Dec 06 05:45:59 crc kubenswrapper[4733]: I1206 05:45:59.134165 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/b4bfb477-5389-4827-91d8-cfd61ad2d5f8-utilities\") pod \"certified-operators-lblzc\" (UID: \"b4bfb477-5389-4827-91d8-cfd61ad2d5f8\") " pod="openshift-marketplace/certified-operators-lblzc" Dec 06 05:45:59 crc kubenswrapper[4733]: I1206 05:45:59.134401 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4bfb477-5389-4827-91d8-cfd61ad2d5f8-catalog-content\") pod \"certified-operators-lblzc\" (UID: \"b4bfb477-5389-4827-91d8-cfd61ad2d5f8\") " pod="openshift-marketplace/certified-operators-lblzc" Dec 06 05:45:59 crc kubenswrapper[4733]: I1206 05:45:59.138010 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5vqm6"] Dec 06 05:45:59 crc kubenswrapper[4733]: I1206 05:45:59.138985 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5vqm6" Dec 06 05:45:59 crc kubenswrapper[4733]: I1206 05:45:59.151352 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5vqm6"] Dec 06 05:45:59 crc kubenswrapper[4733]: I1206 05:45:59.153550 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9279\" (UniqueName: \"kubernetes.io/projected/b4bfb477-5389-4827-91d8-cfd61ad2d5f8-kube-api-access-f9279\") pod \"certified-operators-lblzc\" (UID: \"b4bfb477-5389-4827-91d8-cfd61ad2d5f8\") " pod="openshift-marketplace/certified-operators-lblzc" Dec 06 05:45:59 crc kubenswrapper[4733]: I1206 05:45:59.234586 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c3721ae-c7d6-49d0-8488-e84f96e08faa-catalog-content\") pod \"community-operators-5vqm6\" (UID: \"0c3721ae-c7d6-49d0-8488-e84f96e08faa\") " pod="openshift-marketplace/community-operators-5vqm6" Dec 06 05:45:59 crc kubenswrapper[4733]: 
I1206 05:45:59.234780 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c3721ae-c7d6-49d0-8488-e84f96e08faa-utilities\") pod \"community-operators-5vqm6\" (UID: \"0c3721ae-c7d6-49d0-8488-e84f96e08faa\") " pod="openshift-marketplace/community-operators-5vqm6" Dec 06 05:45:59 crc kubenswrapper[4733]: I1206 05:45:59.235376 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvh9j\" (UniqueName: \"kubernetes.io/projected/0c3721ae-c7d6-49d0-8488-e84f96e08faa-kube-api-access-lvh9j\") pod \"community-operators-5vqm6\" (UID: \"0c3721ae-c7d6-49d0-8488-e84f96e08faa\") " pod="openshift-marketplace/community-operators-5vqm6" Dec 06 05:45:59 crc kubenswrapper[4733]: I1206 05:45:59.279107 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jfh8w"] Dec 06 05:45:59 crc kubenswrapper[4733]: I1206 05:45:59.290211 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lblzc" Dec 06 05:45:59 crc kubenswrapper[4733]: W1206 05:45:59.303690 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71e51a04_4769_45b0_87b8_7292977ec73b.slice/crio-709fc8cde417e90f16bfe3614835fd59368fb525a520a6bb2d0b1b1c1ba69892 WatchSource:0}: Error finding container 709fc8cde417e90f16bfe3614835fd59368fb525a520a6bb2d0b1b1c1ba69892: Status 404 returned error can't find the container with id 709fc8cde417e90f16bfe3614835fd59368fb525a520a6bb2d0b1b1c1ba69892 Dec 06 05:45:59 crc kubenswrapper[4733]: I1206 05:45:59.339509 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c3721ae-c7d6-49d0-8488-e84f96e08faa-catalog-content\") pod \"community-operators-5vqm6\" (UID: \"0c3721ae-c7d6-49d0-8488-e84f96e08faa\") " pod="openshift-marketplace/community-operators-5vqm6" Dec 06 05:45:59 crc kubenswrapper[4733]: I1206 05:45:59.339618 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c3721ae-c7d6-49d0-8488-e84f96e08faa-utilities\") pod \"community-operators-5vqm6\" (UID: \"0c3721ae-c7d6-49d0-8488-e84f96e08faa\") " pod="openshift-marketplace/community-operators-5vqm6" Dec 06 05:45:59 crc kubenswrapper[4733]: I1206 05:45:59.339748 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvh9j\" (UniqueName: \"kubernetes.io/projected/0c3721ae-c7d6-49d0-8488-e84f96e08faa-kube-api-access-lvh9j\") pod \"community-operators-5vqm6\" (UID: \"0c3721ae-c7d6-49d0-8488-e84f96e08faa\") " pod="openshift-marketplace/community-operators-5vqm6" Dec 06 05:45:59 crc kubenswrapper[4733]: I1206 05:45:59.341584 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/0c3721ae-c7d6-49d0-8488-e84f96e08faa-catalog-content\") pod \"community-operators-5vqm6\" (UID: \"0c3721ae-c7d6-49d0-8488-e84f96e08faa\") " pod="openshift-marketplace/community-operators-5vqm6" Dec 06 05:45:59 crc kubenswrapper[4733]: I1206 05:45:59.341865 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c3721ae-c7d6-49d0-8488-e84f96e08faa-utilities\") pod \"community-operators-5vqm6\" (UID: \"0c3721ae-c7d6-49d0-8488-e84f96e08faa\") " pod="openshift-marketplace/community-operators-5vqm6" Dec 06 05:45:59 crc kubenswrapper[4733]: I1206 05:45:59.346765 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qk6t8"] Dec 06 05:45:59 crc kubenswrapper[4733]: I1206 05:45:59.349643 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qk6t8" Dec 06 05:45:59 crc kubenswrapper[4733]: I1206 05:45:59.357030 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qk6t8"] Dec 06 05:45:59 crc kubenswrapper[4733]: I1206 05:45:59.358242 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvh9j\" (UniqueName: \"kubernetes.io/projected/0c3721ae-c7d6-49d0-8488-e84f96e08faa-kube-api-access-lvh9j\") pod \"community-operators-5vqm6\" (UID: \"0c3721ae-c7d6-49d0-8488-e84f96e08faa\") " pod="openshift-marketplace/community-operators-5vqm6" Dec 06 05:45:59 crc kubenswrapper[4733]: I1206 05:45:59.441091 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7da475c-bd42-421a-9aef-4eed3aacbe9a-catalog-content\") pod \"certified-operators-qk6t8\" (UID: \"e7da475c-bd42-421a-9aef-4eed3aacbe9a\") " pod="openshift-marketplace/certified-operators-qk6t8" Dec 06 05:45:59 crc kubenswrapper[4733]: I1206 
05:45:59.441560 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjpsg\" (UniqueName: \"kubernetes.io/projected/e7da475c-bd42-421a-9aef-4eed3aacbe9a-kube-api-access-vjpsg\") pod \"certified-operators-qk6t8\" (UID: \"e7da475c-bd42-421a-9aef-4eed3aacbe9a\") " pod="openshift-marketplace/certified-operators-qk6t8" Dec 06 05:45:59 crc kubenswrapper[4733]: I1206 05:45:59.441596 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7da475c-bd42-421a-9aef-4eed3aacbe9a-utilities\") pod \"certified-operators-qk6t8\" (UID: \"e7da475c-bd42-421a-9aef-4eed3aacbe9a\") " pod="openshift-marketplace/certified-operators-qk6t8" Dec 06 05:45:59 crc kubenswrapper[4733]: I1206 05:45:59.467013 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-7rlhm"] Dec 06 05:45:59 crc kubenswrapper[4733]: I1206 05:45:59.482329 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5vqm6" Dec 06 05:45:59 crc kubenswrapper[4733]: I1206 05:45:59.525436 4733 generic.go:334] "Generic (PLEG): container finished" podID="71e51a04-4769-45b0-87b8-7292977ec73b" containerID="64eea5f7e3630d6db27474607b2444bdf7ec0a67300a87ca213dd53f06a3a160" exitCode=0 Dec 06 05:45:59 crc kubenswrapper[4733]: I1206 05:45:59.525970 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jfh8w" event={"ID":"71e51a04-4769-45b0-87b8-7292977ec73b","Type":"ContainerDied","Data":"64eea5f7e3630d6db27474607b2444bdf7ec0a67300a87ca213dd53f06a3a160"} Dec 06 05:45:59 crc kubenswrapper[4733]: I1206 05:45:59.525999 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jfh8w" event={"ID":"71e51a04-4769-45b0-87b8-7292977ec73b","Type":"ContainerStarted","Data":"709fc8cde417e90f16bfe3614835fd59368fb525a520a6bb2d0b1b1c1ba69892"} Dec 06 05:45:59 crc kubenswrapper[4733]: I1206 05:45:59.533331 4733 generic.go:334] "Generic (PLEG): container finished" podID="1ed0db75-d198-42d8-ac27-91145205f42c" containerID="281433015c50230243bb4612f9bf0ac2baf19caea7ec2458143f93de4f94a72a" exitCode=0 Dec 06 05:45:59 crc kubenswrapper[4733]: I1206 05:45:59.533381 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416665-87xkl" event={"ID":"1ed0db75-d198-42d8-ac27-91145205f42c","Type":"ContainerDied","Data":"281433015c50230243bb4612f9bf0ac2baf19caea7ec2458143f93de4f94a72a"} Dec 06 05:45:59 crc kubenswrapper[4733]: I1206 05:45:59.535153 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"f72548c2-5259-4969-890c-89608e1a627d","Type":"ContainerStarted","Data":"1c46d9f9a1dc4a2d7d8ce47fbeba0aa84d4c21219dede6b98bc4524361734f60"} Dec 06 05:45:59 crc kubenswrapper[4733]: I1206 05:45:59.535180 4733 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"f72548c2-5259-4969-890c-89608e1a627d","Type":"ContainerStarted","Data":"6d4605a02bbb9444c76c28f412cac1bf7139866238caa09d7c219c7081059a94"} Dec 06 05:45:59 crc kubenswrapper[4733]: I1206 05:45:59.541369 4733 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 06 05:45:59 crc kubenswrapper[4733]: I1206 05:45:59.542722 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7da475c-bd42-421a-9aef-4eed3aacbe9a-catalog-content\") pod \"certified-operators-qk6t8\" (UID: \"e7da475c-bd42-421a-9aef-4eed3aacbe9a\") " pod="openshift-marketplace/certified-operators-qk6t8" Dec 06 05:45:59 crc kubenswrapper[4733]: I1206 05:45:59.542819 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjpsg\" (UniqueName: \"kubernetes.io/projected/e7da475c-bd42-421a-9aef-4eed3aacbe9a-kube-api-access-vjpsg\") pod \"certified-operators-qk6t8\" (UID: \"e7da475c-bd42-421a-9aef-4eed3aacbe9a\") " pod="openshift-marketplace/certified-operators-qk6t8" Dec 06 05:45:59 crc kubenswrapper[4733]: I1206 05:45:59.542878 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7da475c-bd42-421a-9aef-4eed3aacbe9a-utilities\") pod \"certified-operators-qk6t8\" (UID: \"e7da475c-bd42-421a-9aef-4eed3aacbe9a\") " pod="openshift-marketplace/certified-operators-qk6t8" Dec 06 05:45:59 crc kubenswrapper[4733]: I1206 05:45:59.543640 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7da475c-bd42-421a-9aef-4eed3aacbe9a-utilities\") pod \"certified-operators-qk6t8\" (UID: \"e7da475c-bd42-421a-9aef-4eed3aacbe9a\") " pod="openshift-marketplace/certified-operators-qk6t8" Dec 06 05:45:59 crc 
kubenswrapper[4733]: I1206 05:45:59.543869 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7da475c-bd42-421a-9aef-4eed3aacbe9a-catalog-content\") pod \"certified-operators-qk6t8\" (UID: \"e7da475c-bd42-421a-9aef-4eed3aacbe9a\") " pod="openshift-marketplace/certified-operators-qk6t8" Dec 06 05:45:59 crc kubenswrapper[4733]: I1206 05:45:59.577602 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lblzc"] Dec 06 05:45:59 crc kubenswrapper[4733]: I1206 05:45:59.580666 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjpsg\" (UniqueName: \"kubernetes.io/projected/e7da475c-bd42-421a-9aef-4eed3aacbe9a-kube-api-access-vjpsg\") pod \"certified-operators-qk6t8\" (UID: \"e7da475c-bd42-421a-9aef-4eed3aacbe9a\") " pod="openshift-marketplace/certified-operators-qk6t8" Dec 06 05:45:59 crc kubenswrapper[4733]: I1206 05:45:59.676611 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qk6t8" Dec 06 05:45:59 crc kubenswrapper[4733]: I1206 05:45:59.852456 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=1.852433343 podStartE2EDuration="1.852433343s" podCreationTimestamp="2025-12-06 05:45:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:45:59.59746898 +0000 UTC m=+143.462680090" watchObservedRunningTime="2025-12-06 05:45:59.852433343 +0000 UTC m=+143.717644454" Dec 06 05:45:59 crc kubenswrapper[4733]: I1206 05:45:59.852611 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qk6t8"] Dec 06 05:45:59 crc kubenswrapper[4733]: I1206 05:45:59.963206 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5vqm6"] Dec 06 05:45:59 crc kubenswrapper[4733]: W1206 05:45:59.969768 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c3721ae_c7d6_49d0_8488_e84f96e08faa.slice/crio-54de35c390f5f924dd237d13a53b91e20861deec429359bd4554d6ff56373268 WatchSource:0}: Error finding container 54de35c390f5f924dd237d13a53b91e20861deec429359bd4554d6ff56373268: Status 404 returned error can't find the container with id 54de35c390f5f924dd237d13a53b91e20861deec429359bd4554d6ff56373268 Dec 06 05:46:00 crc kubenswrapper[4733]: I1206 05:46:00.058893 4733 patch_prober.go:28] interesting pod/router-default-5444994796-t668l container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 06 05:46:00 crc kubenswrapper[4733]: [-]has-synced failed: reason withheld Dec 06 05:46:00 crc kubenswrapper[4733]: [+]process-running 
ok Dec 06 05:46:00 crc kubenswrapper[4733]: healthz check failed Dec 06 05:46:00 crc kubenswrapper[4733]: I1206 05:46:00.058966 4733 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t668l" podUID="ca0da215-5c31-4c91-939c-77e95ab4a568" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 05:46:00 crc kubenswrapper[4733]: I1206 05:46:00.495532 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Dec 06 05:46:00 crc kubenswrapper[4733]: I1206 05:46:00.545994 4733 generic.go:334] "Generic (PLEG): container finished" podID="e7da475c-bd42-421a-9aef-4eed3aacbe9a" containerID="c47593b63b29c97fa85c5e024a82f9ce35bff9b240eb731ab3c1d7f06d266345" exitCode=0 Dec 06 05:46:00 crc kubenswrapper[4733]: I1206 05:46:00.546091 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qk6t8" event={"ID":"e7da475c-bd42-421a-9aef-4eed3aacbe9a","Type":"ContainerDied","Data":"c47593b63b29c97fa85c5e024a82f9ce35bff9b240eb731ab3c1d7f06d266345"} Dec 06 05:46:00 crc kubenswrapper[4733]: I1206 05:46:00.546174 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qk6t8" event={"ID":"e7da475c-bd42-421a-9aef-4eed3aacbe9a","Type":"ContainerStarted","Data":"5ae99f3e548d1f60aa4f13443eb8f4ffd8b4170427bdb77774c2363952405a3a"} Dec 06 05:46:00 crc kubenswrapper[4733]: I1206 05:46:00.550700 4733 generic.go:334] "Generic (PLEG): container finished" podID="f72548c2-5259-4969-890c-89608e1a627d" containerID="1c46d9f9a1dc4a2d7d8ce47fbeba0aa84d4c21219dede6b98bc4524361734f60" exitCode=0 Dec 06 05:46:00 crc kubenswrapper[4733]: I1206 05:46:00.550785 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" 
event={"ID":"f72548c2-5259-4969-890c-89608e1a627d","Type":"ContainerDied","Data":"1c46d9f9a1dc4a2d7d8ce47fbeba0aa84d4c21219dede6b98bc4524361734f60"} Dec 06 05:46:00 crc kubenswrapper[4733]: I1206 05:46:00.557077 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" event={"ID":"6e073151-939a-4209-8cd7-39116b0165f0","Type":"ContainerStarted","Data":"aa6bd577f5de0d6f277a4191e9a13947bcd2c89d33e8fd1c39b6b31b217e0d31"} Dec 06 05:46:00 crc kubenswrapper[4733]: I1206 05:46:00.557107 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" event={"ID":"6e073151-939a-4209-8cd7-39116b0165f0","Type":"ContainerStarted","Data":"b3c48e833667e6d8679fd24078e942bc2036bb8847660c7821bd4ef37bcf2cb1"} Dec 06 05:46:00 crc kubenswrapper[4733]: I1206 05:46:00.557421 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:46:00 crc kubenswrapper[4733]: I1206 05:46:00.558935 4733 generic.go:334] "Generic (PLEG): container finished" podID="b4bfb477-5389-4827-91d8-cfd61ad2d5f8" containerID="29fc6177faf3e1fdea9aa9761b1ec80081f7806be6fc060615acca600a2dd1dc" exitCode=0 Dec 06 05:46:00 crc kubenswrapper[4733]: I1206 05:46:00.558997 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lblzc" event={"ID":"b4bfb477-5389-4827-91d8-cfd61ad2d5f8","Type":"ContainerDied","Data":"29fc6177faf3e1fdea9aa9761b1ec80081f7806be6fc060615acca600a2dd1dc"} Dec 06 05:46:00 crc kubenswrapper[4733]: I1206 05:46:00.559015 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lblzc" event={"ID":"b4bfb477-5389-4827-91d8-cfd61ad2d5f8","Type":"ContainerStarted","Data":"d5079c758c9c0a1b0000c3217e8d6ad1459847f8480bd77d83682d52fc15683d"} Dec 06 05:46:00 crc kubenswrapper[4733]: I1206 05:46:00.565334 4733 generic.go:334] 
"Generic (PLEG): container finished" podID="0c3721ae-c7d6-49d0-8488-e84f96e08faa" containerID="408c7c6ce499a3022e481b09a8e9d2f3c5e650b5286454160a5dcecf6957f7bd" exitCode=0 Dec 06 05:46:00 crc kubenswrapper[4733]: I1206 05:46:00.565769 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5vqm6" event={"ID":"0c3721ae-c7d6-49d0-8488-e84f96e08faa","Type":"ContainerDied","Data":"408c7c6ce499a3022e481b09a8e9d2f3c5e650b5286454160a5dcecf6957f7bd"} Dec 06 05:46:00 crc kubenswrapper[4733]: I1206 05:46:00.565836 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5vqm6" event={"ID":"0c3721ae-c7d6-49d0-8488-e84f96e08faa","Type":"ContainerStarted","Data":"54de35c390f5f924dd237d13a53b91e20861deec429359bd4554d6ff56373268"} Dec 06 05:46:00 crc kubenswrapper[4733]: I1206 05:46:00.580396 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" podStartSLOduration=126.580375498 podStartE2EDuration="2m6.580375498s" podCreationTimestamp="2025-12-06 05:43:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:46:00.577524529 +0000 UTC m=+144.442735640" watchObservedRunningTime="2025-12-06 05:46:00.580375498 +0000 UTC m=+144.445586609" Dec 06 05:46:00 crc kubenswrapper[4733]: I1206 05:46:00.735933 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416665-87xkl" Dec 06 05:46:00 crc kubenswrapper[4733]: I1206 05:46:00.775971 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1ed0db75-d198-42d8-ac27-91145205f42c-config-volume\") pod \"1ed0db75-d198-42d8-ac27-91145205f42c\" (UID: \"1ed0db75-d198-42d8-ac27-91145205f42c\") " Dec 06 05:46:00 crc kubenswrapper[4733]: I1206 05:46:00.776140 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ld49h\" (UniqueName: \"kubernetes.io/projected/1ed0db75-d198-42d8-ac27-91145205f42c-kube-api-access-ld49h\") pod \"1ed0db75-d198-42d8-ac27-91145205f42c\" (UID: \"1ed0db75-d198-42d8-ac27-91145205f42c\") " Dec 06 05:46:00 crc kubenswrapper[4733]: I1206 05:46:00.776175 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1ed0db75-d198-42d8-ac27-91145205f42c-secret-volume\") pod \"1ed0db75-d198-42d8-ac27-91145205f42c\" (UID: \"1ed0db75-d198-42d8-ac27-91145205f42c\") " Dec 06 05:46:00 crc kubenswrapper[4733]: I1206 05:46:00.777479 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ed0db75-d198-42d8-ac27-91145205f42c-config-volume" (OuterVolumeSpecName: "config-volume") pod "1ed0db75-d198-42d8-ac27-91145205f42c" (UID: "1ed0db75-d198-42d8-ac27-91145205f42c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:46:00 crc kubenswrapper[4733]: I1206 05:46:00.781811 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ed0db75-d198-42d8-ac27-91145205f42c-kube-api-access-ld49h" (OuterVolumeSpecName: "kube-api-access-ld49h") pod "1ed0db75-d198-42d8-ac27-91145205f42c" (UID: "1ed0db75-d198-42d8-ac27-91145205f42c"). 
InnerVolumeSpecName "kube-api-access-ld49h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:46:00 crc kubenswrapper[4733]: I1206 05:46:00.781850 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ed0db75-d198-42d8-ac27-91145205f42c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1ed0db75-d198-42d8-ac27-91145205f42c" (UID: "1ed0db75-d198-42d8-ac27-91145205f42c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:46:00 crc kubenswrapper[4733]: I1206 05:46:00.878460 4733 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1ed0db75-d198-42d8-ac27-91145205f42c-config-volume\") on node \"crc\" DevicePath \"\"" Dec 06 05:46:00 crc kubenswrapper[4733]: I1206 05:46:00.878497 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ld49h\" (UniqueName: \"kubernetes.io/projected/1ed0db75-d198-42d8-ac27-91145205f42c-kube-api-access-ld49h\") on node \"crc\" DevicePath \"\"" Dec 06 05:46:00 crc kubenswrapper[4733]: I1206 05:46:00.878507 4733 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1ed0db75-d198-42d8-ac27-91145205f42c-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 06 05:46:00 crc kubenswrapper[4733]: I1206 05:46:00.942364 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tzdr7"] Dec 06 05:46:00 crc kubenswrapper[4733]: E1206 05:46:00.944509 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ed0db75-d198-42d8-ac27-91145205f42c" containerName="collect-profiles" Dec 06 05:46:00 crc kubenswrapper[4733]: I1206 05:46:00.945199 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ed0db75-d198-42d8-ac27-91145205f42c" containerName="collect-profiles" Dec 06 05:46:00 crc kubenswrapper[4733]: I1206 05:46:00.945432 4733 
memory_manager.go:354] "RemoveStaleState removing state" podUID="1ed0db75-d198-42d8-ac27-91145205f42c" containerName="collect-profiles" Dec 06 05:46:00 crc kubenswrapper[4733]: I1206 05:46:00.946663 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tzdr7" Dec 06 05:46:00 crc kubenswrapper[4733]: I1206 05:46:00.949439 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 06 05:46:00 crc kubenswrapper[4733]: I1206 05:46:00.950427 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tzdr7"] Dec 06 05:46:00 crc kubenswrapper[4733]: I1206 05:46:00.980245 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/139eeaf9-21f6-4032-9b24-73534a803ca5-catalog-content\") pod \"redhat-marketplace-tzdr7\" (UID: \"139eeaf9-21f6-4032-9b24-73534a803ca5\") " pod="openshift-marketplace/redhat-marketplace-tzdr7" Dec 06 05:46:00 crc kubenswrapper[4733]: I1206 05:46:00.980470 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/139eeaf9-21f6-4032-9b24-73534a803ca5-utilities\") pod \"redhat-marketplace-tzdr7\" (UID: \"139eeaf9-21f6-4032-9b24-73534a803ca5\") " pod="openshift-marketplace/redhat-marketplace-tzdr7" Dec 06 05:46:00 crc kubenswrapper[4733]: I1206 05:46:00.980643 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7z7ft\" (UniqueName: \"kubernetes.io/projected/139eeaf9-21f6-4032-9b24-73534a803ca5-kube-api-access-7z7ft\") pod \"redhat-marketplace-tzdr7\" (UID: \"139eeaf9-21f6-4032-9b24-73534a803ca5\") " pod="openshift-marketplace/redhat-marketplace-tzdr7" Dec 06 05:46:01 crc kubenswrapper[4733]: I1206 05:46:01.061913 
4733 patch_prober.go:28] interesting pod/router-default-5444994796-t668l container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 06 05:46:01 crc kubenswrapper[4733]: [-]has-synced failed: reason withheld Dec 06 05:46:01 crc kubenswrapper[4733]: [+]process-running ok Dec 06 05:46:01 crc kubenswrapper[4733]: healthz check failed Dec 06 05:46:01 crc kubenswrapper[4733]: I1206 05:46:01.061966 4733 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t668l" podUID="ca0da215-5c31-4c91-939c-77e95ab4a568" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 05:46:01 crc kubenswrapper[4733]: I1206 05:46:01.084326 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/139eeaf9-21f6-4032-9b24-73534a803ca5-utilities\") pod \"redhat-marketplace-tzdr7\" (UID: \"139eeaf9-21f6-4032-9b24-73534a803ca5\") " pod="openshift-marketplace/redhat-marketplace-tzdr7" Dec 06 05:46:01 crc kubenswrapper[4733]: I1206 05:46:01.084820 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7z7ft\" (UniqueName: \"kubernetes.io/projected/139eeaf9-21f6-4032-9b24-73534a803ca5-kube-api-access-7z7ft\") pod \"redhat-marketplace-tzdr7\" (UID: \"139eeaf9-21f6-4032-9b24-73534a803ca5\") " pod="openshift-marketplace/redhat-marketplace-tzdr7" Dec 06 05:46:01 crc kubenswrapper[4733]: I1206 05:46:01.084895 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/139eeaf9-21f6-4032-9b24-73534a803ca5-catalog-content\") pod \"redhat-marketplace-tzdr7\" (UID: \"139eeaf9-21f6-4032-9b24-73534a803ca5\") " pod="openshift-marketplace/redhat-marketplace-tzdr7" Dec 06 05:46:01 crc kubenswrapper[4733]: I1206 
05:46:01.087557 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/139eeaf9-21f6-4032-9b24-73534a803ca5-utilities\") pod \"redhat-marketplace-tzdr7\" (UID: \"139eeaf9-21f6-4032-9b24-73534a803ca5\") " pod="openshift-marketplace/redhat-marketplace-tzdr7" Dec 06 05:46:01 crc kubenswrapper[4733]: I1206 05:46:01.089216 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/139eeaf9-21f6-4032-9b24-73534a803ca5-catalog-content\") pod \"redhat-marketplace-tzdr7\" (UID: \"139eeaf9-21f6-4032-9b24-73534a803ca5\") " pod="openshift-marketplace/redhat-marketplace-tzdr7" Dec 06 05:46:01 crc kubenswrapper[4733]: I1206 05:46:01.110348 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7z7ft\" (UniqueName: \"kubernetes.io/projected/139eeaf9-21f6-4032-9b24-73534a803ca5-kube-api-access-7z7ft\") pod \"redhat-marketplace-tzdr7\" (UID: \"139eeaf9-21f6-4032-9b24-73534a803ca5\") " pod="openshift-marketplace/redhat-marketplace-tzdr7" Dec 06 05:46:01 crc kubenswrapper[4733]: I1206 05:46:01.276226 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tzdr7" Dec 06 05:46:01 crc kubenswrapper[4733]: I1206 05:46:01.339278 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kqg5z"] Dec 06 05:46:01 crc kubenswrapper[4733]: I1206 05:46:01.340582 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kqg5z" Dec 06 05:46:01 crc kubenswrapper[4733]: I1206 05:46:01.351537 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kqg5z"] Dec 06 05:46:01 crc kubenswrapper[4733]: I1206 05:46:01.395738 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59nr9\" (UniqueName: \"kubernetes.io/projected/755fa11c-904b-49e4-928c-da2935842c80-kube-api-access-59nr9\") pod \"redhat-marketplace-kqg5z\" (UID: \"755fa11c-904b-49e4-928c-da2935842c80\") " pod="openshift-marketplace/redhat-marketplace-kqg5z" Dec 06 05:46:01 crc kubenswrapper[4733]: I1206 05:46:01.396034 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/755fa11c-904b-49e4-928c-da2935842c80-catalog-content\") pod \"redhat-marketplace-kqg5z\" (UID: \"755fa11c-904b-49e4-928c-da2935842c80\") " pod="openshift-marketplace/redhat-marketplace-kqg5z" Dec 06 05:46:01 crc kubenswrapper[4733]: I1206 05:46:01.396127 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/755fa11c-904b-49e4-928c-da2935842c80-utilities\") pod \"redhat-marketplace-kqg5z\" (UID: \"755fa11c-904b-49e4-928c-da2935842c80\") " pod="openshift-marketplace/redhat-marketplace-kqg5z" Dec 06 05:46:01 crc kubenswrapper[4733]: I1206 05:46:01.466791 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tzdr7"] Dec 06 05:46:01 crc kubenswrapper[4733]: I1206 05:46:01.497351 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/755fa11c-904b-49e4-928c-da2935842c80-catalog-content\") pod \"redhat-marketplace-kqg5z\" (UID: 
\"755fa11c-904b-49e4-928c-da2935842c80\") " pod="openshift-marketplace/redhat-marketplace-kqg5z" Dec 06 05:46:01 crc kubenswrapper[4733]: I1206 05:46:01.497511 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/755fa11c-904b-49e4-928c-da2935842c80-utilities\") pod \"redhat-marketplace-kqg5z\" (UID: \"755fa11c-904b-49e4-928c-da2935842c80\") " pod="openshift-marketplace/redhat-marketplace-kqg5z" Dec 06 05:46:01 crc kubenswrapper[4733]: I1206 05:46:01.497623 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59nr9\" (UniqueName: \"kubernetes.io/projected/755fa11c-904b-49e4-928c-da2935842c80-kube-api-access-59nr9\") pod \"redhat-marketplace-kqg5z\" (UID: \"755fa11c-904b-49e4-928c-da2935842c80\") " pod="openshift-marketplace/redhat-marketplace-kqg5z" Dec 06 05:46:01 crc kubenswrapper[4733]: I1206 05:46:01.497855 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/755fa11c-904b-49e4-928c-da2935842c80-utilities\") pod \"redhat-marketplace-kqg5z\" (UID: \"755fa11c-904b-49e4-928c-da2935842c80\") " pod="openshift-marketplace/redhat-marketplace-kqg5z" Dec 06 05:46:01 crc kubenswrapper[4733]: I1206 05:46:01.498087 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/755fa11c-904b-49e4-928c-da2935842c80-catalog-content\") pod \"redhat-marketplace-kqg5z\" (UID: \"755fa11c-904b-49e4-928c-da2935842c80\") " pod="openshift-marketplace/redhat-marketplace-kqg5z" Dec 06 05:46:01 crc kubenswrapper[4733]: I1206 05:46:01.514244 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59nr9\" (UniqueName: \"kubernetes.io/projected/755fa11c-904b-49e4-928c-da2935842c80-kube-api-access-59nr9\") pod \"redhat-marketplace-kqg5z\" (UID: \"755fa11c-904b-49e4-928c-da2935842c80\") " 
pod="openshift-marketplace/redhat-marketplace-kqg5z" Dec 06 05:46:01 crc kubenswrapper[4733]: I1206 05:46:01.571680 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tzdr7" event={"ID":"139eeaf9-21f6-4032-9b24-73534a803ca5","Type":"ContainerStarted","Data":"ea0550ea68b5e6dcde33b35239b23e680a7bcb93d9274d0e8eb04d106478a0d8"} Dec 06 05:46:01 crc kubenswrapper[4733]: I1206 05:46:01.583519 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416665-87xkl" Dec 06 05:46:01 crc kubenswrapper[4733]: I1206 05:46:01.589381 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416665-87xkl" event={"ID":"1ed0db75-d198-42d8-ac27-91145205f42c","Type":"ContainerDied","Data":"f4eadaebd22b805003ad1c70423d05ca034e7ba345c315bf05d41b35d2bb1465"} Dec 06 05:46:01 crc kubenswrapper[4733]: I1206 05:46:01.589405 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4eadaebd22b805003ad1c70423d05ca034e7ba345c315bf05d41b35d2bb1465" Dec 06 05:46:01 crc kubenswrapper[4733]: I1206 05:46:01.675250 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kqg5z" Dec 06 05:46:01 crc kubenswrapper[4733]: I1206 05:46:01.794003 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 06 05:46:01 crc kubenswrapper[4733]: I1206 05:46:01.907043 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f72548c2-5259-4969-890c-89608e1a627d-kube-api-access\") pod \"f72548c2-5259-4969-890c-89608e1a627d\" (UID: \"f72548c2-5259-4969-890c-89608e1a627d\") " Dec 06 05:46:01 crc kubenswrapper[4733]: I1206 05:46:01.907195 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f72548c2-5259-4969-890c-89608e1a627d-kubelet-dir\") pod \"f72548c2-5259-4969-890c-89608e1a627d\" (UID: \"f72548c2-5259-4969-890c-89608e1a627d\") " Dec 06 05:46:01 crc kubenswrapper[4733]: I1206 05:46:01.907328 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f72548c2-5259-4969-890c-89608e1a627d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "f72548c2-5259-4969-890c-89608e1a627d" (UID: "f72548c2-5259-4969-890c-89608e1a627d"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 05:46:01 crc kubenswrapper[4733]: I1206 05:46:01.907617 4733 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f72548c2-5259-4969-890c-89608e1a627d-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 06 05:46:01 crc kubenswrapper[4733]: I1206 05:46:01.912554 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f72548c2-5259-4969-890c-89608e1a627d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "f72548c2-5259-4969-890c-89608e1a627d" (UID: "f72548c2-5259-4969-890c-89608e1a627d"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:46:01 crc kubenswrapper[4733]: I1206 05:46:01.935975 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bvjmk"] Dec 06 05:46:01 crc kubenswrapper[4733]: E1206 05:46:01.936189 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f72548c2-5259-4969-890c-89608e1a627d" containerName="pruner" Dec 06 05:46:01 crc kubenswrapper[4733]: I1206 05:46:01.936208 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="f72548c2-5259-4969-890c-89608e1a627d" containerName="pruner" Dec 06 05:46:01 crc kubenswrapper[4733]: I1206 05:46:01.936317 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="f72548c2-5259-4969-890c-89608e1a627d" containerName="pruner" Dec 06 05:46:01 crc kubenswrapper[4733]: I1206 05:46:01.936933 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bvjmk" Dec 06 05:46:01 crc kubenswrapper[4733]: I1206 05:46:01.941975 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 06 05:46:01 crc kubenswrapper[4733]: I1206 05:46:01.949353 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bvjmk"] Dec 06 05:46:02 crc kubenswrapper[4733]: I1206 05:46:02.008671 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/938df03e-d0d5-4b93-9a31-061262420f18-utilities\") pod \"redhat-operators-bvjmk\" (UID: \"938df03e-d0d5-4b93-9a31-061262420f18\") " pod="openshift-marketplace/redhat-operators-bvjmk" Dec 06 05:46:02 crc kubenswrapper[4733]: I1206 05:46:02.008737 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/938df03e-d0d5-4b93-9a31-061262420f18-catalog-content\") pod \"redhat-operators-bvjmk\" (UID: \"938df03e-d0d5-4b93-9a31-061262420f18\") " pod="openshift-marketplace/redhat-operators-bvjmk" Dec 06 05:46:02 crc kubenswrapper[4733]: I1206 05:46:02.008866 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7tmx\" (UniqueName: \"kubernetes.io/projected/938df03e-d0d5-4b93-9a31-061262420f18-kube-api-access-j7tmx\") pod \"redhat-operators-bvjmk\" (UID: \"938df03e-d0d5-4b93-9a31-061262420f18\") " pod="openshift-marketplace/redhat-operators-bvjmk" Dec 06 05:46:02 crc kubenswrapper[4733]: I1206 05:46:02.008913 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f72548c2-5259-4969-890c-89608e1a627d-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 06 05:46:02 crc kubenswrapper[4733]: I1206 05:46:02.059821 4733 patch_prober.go:28] interesting pod/router-default-5444994796-t668l container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 06 05:46:02 crc kubenswrapper[4733]: [-]has-synced failed: reason withheld Dec 06 05:46:02 crc kubenswrapper[4733]: [+]process-running ok Dec 06 05:46:02 crc kubenswrapper[4733]: healthz check failed Dec 06 05:46:02 crc kubenswrapper[4733]: I1206 05:46:02.059882 4733 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t668l" podUID="ca0da215-5c31-4c91-939c-77e95ab4a568" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 05:46:02 crc kubenswrapper[4733]: I1206 05:46:02.110449 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/938df03e-d0d5-4b93-9a31-061262420f18-utilities\") pod 
\"redhat-operators-bvjmk\" (UID: \"938df03e-d0d5-4b93-9a31-061262420f18\") " pod="openshift-marketplace/redhat-operators-bvjmk" Dec 06 05:46:02 crc kubenswrapper[4733]: I1206 05:46:02.110516 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/938df03e-d0d5-4b93-9a31-061262420f18-catalog-content\") pod \"redhat-operators-bvjmk\" (UID: \"938df03e-d0d5-4b93-9a31-061262420f18\") " pod="openshift-marketplace/redhat-operators-bvjmk" Dec 06 05:46:02 crc kubenswrapper[4733]: I1206 05:46:02.110610 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7tmx\" (UniqueName: \"kubernetes.io/projected/938df03e-d0d5-4b93-9a31-061262420f18-kube-api-access-j7tmx\") pod \"redhat-operators-bvjmk\" (UID: \"938df03e-d0d5-4b93-9a31-061262420f18\") " pod="openshift-marketplace/redhat-operators-bvjmk" Dec 06 05:46:02 crc kubenswrapper[4733]: I1206 05:46:02.110960 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/938df03e-d0d5-4b93-9a31-061262420f18-catalog-content\") pod \"redhat-operators-bvjmk\" (UID: \"938df03e-d0d5-4b93-9a31-061262420f18\") " pod="openshift-marketplace/redhat-operators-bvjmk" Dec 06 05:46:02 crc kubenswrapper[4733]: I1206 05:46:02.110993 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/938df03e-d0d5-4b93-9a31-061262420f18-utilities\") pod \"redhat-operators-bvjmk\" (UID: \"938df03e-d0d5-4b93-9a31-061262420f18\") " pod="openshift-marketplace/redhat-operators-bvjmk" Dec 06 05:46:02 crc kubenswrapper[4733]: I1206 05:46:02.117874 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kqg5z"] Dec 06 05:46:02 crc kubenswrapper[4733]: I1206 05:46:02.125197 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-j7tmx\" (UniqueName: \"kubernetes.io/projected/938df03e-d0d5-4b93-9a31-061262420f18-kube-api-access-j7tmx\") pod \"redhat-operators-bvjmk\" (UID: \"938df03e-d0d5-4b93-9a31-061262420f18\") " pod="openshift-marketplace/redhat-operators-bvjmk" Dec 06 05:46:02 crc kubenswrapper[4733]: I1206 05:46:02.258266 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bvjmk" Dec 06 05:46:02 crc kubenswrapper[4733]: I1206 05:46:02.340453 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-77dc2"] Dec 06 05:46:02 crc kubenswrapper[4733]: I1206 05:46:02.341386 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-77dc2" Dec 06 05:46:02 crc kubenswrapper[4733]: I1206 05:46:02.358478 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-77dc2"] Dec 06 05:46:02 crc kubenswrapper[4733]: I1206 05:46:02.414921 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:46:02 crc kubenswrapper[4733]: I1206 05:46:02.415032 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:46:02 crc kubenswrapper[4733]: I1206 05:46:02.415073 4733 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fc6d80b-251a-4d93-91f1-59adbba493f7-utilities\") pod \"redhat-operators-77dc2\" (UID: \"9fc6d80b-251a-4d93-91f1-59adbba493f7\") " pod="openshift-marketplace/redhat-operators-77dc2" Dec 06 05:46:02 crc kubenswrapper[4733]: I1206 05:46:02.415135 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fc6d80b-251a-4d93-91f1-59adbba493f7-catalog-content\") pod \"redhat-operators-77dc2\" (UID: \"9fc6d80b-251a-4d93-91f1-59adbba493f7\") " pod="openshift-marketplace/redhat-operators-77dc2" Dec 06 05:46:02 crc kubenswrapper[4733]: I1206 05:46:02.415183 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfgfq\" (UniqueName: \"kubernetes.io/projected/9fc6d80b-251a-4d93-91f1-59adbba493f7-kube-api-access-rfgfq\") pod \"redhat-operators-77dc2\" (UID: \"9fc6d80b-251a-4d93-91f1-59adbba493f7\") " pod="openshift-marketplace/redhat-operators-77dc2" Dec 06 05:46:02 crc kubenswrapper[4733]: I1206 05:46:02.416187 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:46:02 crc kubenswrapper[4733]: I1206 05:46:02.418821 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:46:02 crc kubenswrapper[4733]: I1206 05:46:02.516292 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfgfq\" (UniqueName: \"kubernetes.io/projected/9fc6d80b-251a-4d93-91f1-59adbba493f7-kube-api-access-rfgfq\") pod \"redhat-operators-77dc2\" (UID: \"9fc6d80b-251a-4d93-91f1-59adbba493f7\") " pod="openshift-marketplace/redhat-operators-77dc2" Dec 06 05:46:02 crc kubenswrapper[4733]: I1206 05:46:02.516431 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fc6d80b-251a-4d93-91f1-59adbba493f7-utilities\") pod \"redhat-operators-77dc2\" (UID: \"9fc6d80b-251a-4d93-91f1-59adbba493f7\") " pod="openshift-marketplace/redhat-operators-77dc2" Dec 06 05:46:02 crc kubenswrapper[4733]: I1206 05:46:02.516478 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 05:46:02 crc kubenswrapper[4733]: I1206 05:46:02.516518 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fc6d80b-251a-4d93-91f1-59adbba493f7-catalog-content\") pod \"redhat-operators-77dc2\" (UID: \"9fc6d80b-251a-4d93-91f1-59adbba493f7\") " pod="openshift-marketplace/redhat-operators-77dc2" Dec 06 05:46:02 crc kubenswrapper[4733]: I1206 05:46:02.516542 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" 
(UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:46:02 crc kubenswrapper[4733]: I1206 05:46:02.517506 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fc6d80b-251a-4d93-91f1-59adbba493f7-catalog-content\") pod \"redhat-operators-77dc2\" (UID: \"9fc6d80b-251a-4d93-91f1-59adbba493f7\") " pod="openshift-marketplace/redhat-operators-77dc2" Dec 06 05:46:02 crc kubenswrapper[4733]: I1206 05:46:02.517819 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fc6d80b-251a-4d93-91f1-59adbba493f7-utilities\") pod \"redhat-operators-77dc2\" (UID: \"9fc6d80b-251a-4d93-91f1-59adbba493f7\") " pod="openshift-marketplace/redhat-operators-77dc2" Dec 06 05:46:02 crc kubenswrapper[4733]: I1206 05:46:02.520328 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:46:02 crc kubenswrapper[4733]: I1206 05:46:02.520806 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 05:46:02 crc kubenswrapper[4733]: I1206 05:46:02.541615 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfgfq\" (UniqueName: \"kubernetes.io/projected/9fc6d80b-251a-4d93-91f1-59adbba493f7-kube-api-access-rfgfq\") pod 
\"redhat-operators-77dc2\" (UID: \"9fc6d80b-251a-4d93-91f1-59adbba493f7\") " pod="openshift-marketplace/redhat-operators-77dc2" Dec 06 05:46:02 crc kubenswrapper[4733]: I1206 05:46:02.541662 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-hxt8v" Dec 06 05:46:02 crc kubenswrapper[4733]: I1206 05:46:02.545106 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-hxt8v" Dec 06 05:46:02 crc kubenswrapper[4733]: I1206 05:46:02.597846 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 05:46:02 crc kubenswrapper[4733]: I1206 05:46:02.600685 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:46:02 crc kubenswrapper[4733]: I1206 05:46:02.605385 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 05:46:02 crc kubenswrapper[4733]: I1206 05:46:02.649537 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"f72548c2-5259-4969-890c-89608e1a627d","Type":"ContainerDied","Data":"6d4605a02bbb9444c76c28f412cac1bf7139866238caa09d7c219c7081059a94"} Dec 06 05:46:02 crc kubenswrapper[4733]: I1206 05:46:02.649586 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d4605a02bbb9444c76c28f412cac1bf7139866238caa09d7c219c7081059a94" Dec 06 05:46:02 crc kubenswrapper[4733]: I1206 05:46:02.649604 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 06 05:46:02 crc kubenswrapper[4733]: I1206 05:46:02.657909 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-77dc2" Dec 06 05:46:02 crc kubenswrapper[4733]: I1206 05:46:02.660929 4733 generic.go:334] "Generic (PLEG): container finished" podID="755fa11c-904b-49e4-928c-da2935842c80" containerID="d5c7700b629d94813acd3b5e5120a2efbcf13a7a61aba2af1f001f91dca04ade" exitCode=0 Dec 06 05:46:02 crc kubenswrapper[4733]: I1206 05:46:02.661211 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kqg5z" event={"ID":"755fa11c-904b-49e4-928c-da2935842c80","Type":"ContainerDied","Data":"d5c7700b629d94813acd3b5e5120a2efbcf13a7a61aba2af1f001f91dca04ade"} Dec 06 05:46:02 crc kubenswrapper[4733]: I1206 05:46:02.661289 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kqg5z" event={"ID":"755fa11c-904b-49e4-928c-da2935842c80","Type":"ContainerStarted","Data":"98e2ff6c34c144a95b2b27c91608f22314fd1c24a3f230a1a09ce08eacb57a9a"} Dec 06 05:46:02 crc kubenswrapper[4733]: I1206 05:46:02.663156 4733 generic.go:334] "Generic (PLEG): container finished" podID="139eeaf9-21f6-4032-9b24-73534a803ca5" containerID="6f6dc328fdf6d56cdc5dfdada512ef0e2e95387ddd035d396b6ae5bc23f8af80" exitCode=0 Dec 06 05:46:02 crc kubenswrapper[4733]: I1206 05:46:02.665199 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tzdr7" event={"ID":"139eeaf9-21f6-4032-9b24-73534a803ca5","Type":"ContainerDied","Data":"6f6dc328fdf6d56cdc5dfdada512ef0e2e95387ddd035d396b6ae5bc23f8af80"} Dec 06 05:46:02 crc kubenswrapper[4733]: I1206 05:46:02.665795 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-tbqkj" Dec 06 05:46:02 crc kubenswrapper[4733]: I1206 
05:46:02.665837 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-tbqkj" Dec 06 05:46:02 crc kubenswrapper[4733]: I1206 05:46:02.667674 4733 patch_prober.go:28] interesting pod/console-f9d7485db-tbqkj container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.30:8443/health\": dial tcp 10.217.0.30:8443: connect: connection refused" start-of-body= Dec 06 05:46:02 crc kubenswrapper[4733]: I1206 05:46:02.667714 4733 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-tbqkj" podUID="17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832" containerName="console" probeResult="failure" output="Get \"https://10.217.0.30:8443/health\": dial tcp 10.217.0.30:8443: connect: connection refused" Dec 06 05:46:02 crc kubenswrapper[4733]: I1206 05:46:02.685450 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bvjmk"] Dec 06 05:46:02 crc kubenswrapper[4733]: I1206 05:46:02.725184 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-5rtwt" Dec 06 05:46:02 crc kubenswrapper[4733]: W1206 05:46:02.753862 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod938df03e_d0d5_4b93_9a31_061262420f18.slice/crio-07a1c791b6d356abda526945329581d4524e86d4900eea0983558c6e64756095 WatchSource:0}: Error finding container 07a1c791b6d356abda526945329581d4524e86d4900eea0983558c6e64756095: Status 404 returned error can't find the container with id 07a1c791b6d356abda526945329581d4524e86d4900eea0983558c6e64756095 Dec 06 05:46:02 crc kubenswrapper[4733]: I1206 05:46:02.786241 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 06 05:46:02 crc kubenswrapper[4733]: I1206 05:46:02.787146 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 06 05:46:02 crc kubenswrapper[4733]: I1206 05:46:02.789537 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 06 05:46:02 crc kubenswrapper[4733]: I1206 05:46:02.789931 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 06 05:46:02 crc kubenswrapper[4733]: I1206 05:46:02.801290 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 06 05:46:02 crc kubenswrapper[4733]: I1206 05:46:02.824818 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/58095812-8308-43c6-a12c-305a33525986-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"58095812-8308-43c6-a12c-305a33525986\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 06 05:46:02 crc kubenswrapper[4733]: I1206 05:46:02.824855 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/58095812-8308-43c6-a12c-305a33525986-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"58095812-8308-43c6-a12c-305a33525986\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 06 05:46:02 crc kubenswrapper[4733]: W1206 05:46:02.860459 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-9ef2dfffa84336a8396d825fc5fb3ed05c76b6276fd9733d863c05f00786ca7a WatchSource:0}: Error finding container 9ef2dfffa84336a8396d825fc5fb3ed05c76b6276fd9733d863c05f00786ca7a: Status 404 returned error can't find the container with id 9ef2dfffa84336a8396d825fc5fb3ed05c76b6276fd9733d863c05f00786ca7a Dec 06 05:46:02 crc 
kubenswrapper[4733]: I1206 05:46:02.926956 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/58095812-8308-43c6-a12c-305a33525986-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"58095812-8308-43c6-a12c-305a33525986\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 06 05:46:02 crc kubenswrapper[4733]: I1206 05:46:02.927216 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/58095812-8308-43c6-a12c-305a33525986-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"58095812-8308-43c6-a12c-305a33525986\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 06 05:46:02 crc kubenswrapper[4733]: I1206 05:46:02.927609 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/58095812-8308-43c6-a12c-305a33525986-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"58095812-8308-43c6-a12c-305a33525986\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 06 05:46:02 crc kubenswrapper[4733]: I1206 05:46:02.948723 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/58095812-8308-43c6-a12c-305a33525986-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"58095812-8308-43c6-a12c-305a33525986\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 06 05:46:03 crc kubenswrapper[4733]: I1206 05:46:03.059554 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-t668l" Dec 06 05:46:03 crc kubenswrapper[4733]: I1206 05:46:03.064317 4733 patch_prober.go:28] interesting pod/router-default-5444994796-t668l container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http 
failed: reason withheld Dec 06 05:46:03 crc kubenswrapper[4733]: [-]has-synced failed: reason withheld Dec 06 05:46:03 crc kubenswrapper[4733]: [+]process-running ok Dec 06 05:46:03 crc kubenswrapper[4733]: healthz check failed Dec 06 05:46:03 crc kubenswrapper[4733]: I1206 05:46:03.064373 4733 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t668l" podUID="ca0da215-5c31-4c91-939c-77e95ab4a568" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 05:46:03 crc kubenswrapper[4733]: W1206 05:46:03.075109 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-f3831091e3262082671cc6b833f4983a3d4a53a38ebae4936214094cc0a7d469 WatchSource:0}: Error finding container f3831091e3262082671cc6b833f4983a3d4a53a38ebae4936214094cc0a7d469: Status 404 returned error can't find the container with id f3831091e3262082671cc6b833f4983a3d4a53a38ebae4936214094cc0a7d469 Dec 06 05:46:03 crc kubenswrapper[4733]: W1206 05:46:03.113900 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-3e9c791d68e9066e265a31e296aabc33fa32028cff58a311d5ce674a83494956 WatchSource:0}: Error finding container 3e9c791d68e9066e265a31e296aabc33fa32028cff58a311d5ce674a83494956: Status 404 returned error can't find the container with id 3e9c791d68e9066e265a31e296aabc33fa32028cff58a311d5ce674a83494956 Dec 06 05:46:03 crc kubenswrapper[4733]: I1206 05:46:03.115944 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 06 05:46:03 crc kubenswrapper[4733]: I1206 05:46:03.123908 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-77dc2"] Dec 06 05:46:03 crc kubenswrapper[4733]: I1206 05:46:03.367005 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 06 05:46:03 crc kubenswrapper[4733]: I1206 05:46:03.674527 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"58095812-8308-43c6-a12c-305a33525986","Type":"ContainerStarted","Data":"51b3da9459c327bd35dea9e044124377064ce4fda112305a831bbaac5fc5cee0"} Dec 06 05:46:03 crc kubenswrapper[4733]: I1206 05:46:03.682756 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"fc69392b3d3e2654fd7ce7c41aa487dbe0d00626b3079e4f4fa6e6838a826f1c"} Dec 06 05:46:03 crc kubenswrapper[4733]: I1206 05:46:03.682839 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"f3831091e3262082671cc6b833f4983a3d4a53a38ebae4936214094cc0a7d469"} Dec 06 05:46:03 crc kubenswrapper[4733]: I1206 05:46:03.700783 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"f843a16fb02fa8724509996915b22ae139a229a84a7b5a13f4f94fb933dabe73"} Dec 06 05:46:03 crc kubenswrapper[4733]: I1206 05:46:03.700898 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"3e9c791d68e9066e265a31e296aabc33fa32028cff58a311d5ce674a83494956"} Dec 06 05:46:03 crc kubenswrapper[4733]: I1206 05:46:03.702451 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:46:03 crc kubenswrapper[4733]: I1206 05:46:03.707796 4733 generic.go:334] "Generic (PLEG): container finished" podID="938df03e-d0d5-4b93-9a31-061262420f18" containerID="a1aefe6069022502fe7dce9396c1bfa0706d1383349f3d35288077a28446b16c" exitCode=0 Dec 06 05:46:03 crc kubenswrapper[4733]: I1206 05:46:03.707870 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bvjmk" event={"ID":"938df03e-d0d5-4b93-9a31-061262420f18","Type":"ContainerDied","Data":"a1aefe6069022502fe7dce9396c1bfa0706d1383349f3d35288077a28446b16c"} Dec 06 05:46:03 crc kubenswrapper[4733]: I1206 05:46:03.707896 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bvjmk" event={"ID":"938df03e-d0d5-4b93-9a31-061262420f18","Type":"ContainerStarted","Data":"07a1c791b6d356abda526945329581d4524e86d4900eea0983558c6e64756095"} Dec 06 05:46:03 crc kubenswrapper[4733]: I1206 05:46:03.711270 4733 generic.go:334] "Generic (PLEG): container finished" podID="9fc6d80b-251a-4d93-91f1-59adbba493f7" containerID="3f244a3a76647660283fb0e418edfb39d8dcb7a2b7a08babde56b865efe4d88d" exitCode=0 Dec 06 05:46:03 crc kubenswrapper[4733]: I1206 05:46:03.711357 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-77dc2" event={"ID":"9fc6d80b-251a-4d93-91f1-59adbba493f7","Type":"ContainerDied","Data":"3f244a3a76647660283fb0e418edfb39d8dcb7a2b7a08babde56b865efe4d88d"} Dec 06 05:46:03 crc kubenswrapper[4733]: I1206 05:46:03.711378 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-77dc2" 
event={"ID":"9fc6d80b-251a-4d93-91f1-59adbba493f7","Type":"ContainerStarted","Data":"4a0df095e072d3d2629f8e26e2259b367b7ab70db27557cf213ef921f5d642b0"} Dec 06 05:46:03 crc kubenswrapper[4733]: I1206 05:46:03.734234 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"313f7d0699f42d8c846ba7ec22bf9f3b87042e5a94d49b45736c0e5fd210b222"} Dec 06 05:46:03 crc kubenswrapper[4733]: I1206 05:46:03.734375 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"9ef2dfffa84336a8396d825fc5fb3ed05c76b6276fd9733d863c05f00786ca7a"} Dec 06 05:46:04 crc kubenswrapper[4733]: I1206 05:46:04.059717 4733 patch_prober.go:28] interesting pod/router-default-5444994796-t668l container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 06 05:46:04 crc kubenswrapper[4733]: [-]has-synced failed: reason withheld Dec 06 05:46:04 crc kubenswrapper[4733]: [+]process-running ok Dec 06 05:46:04 crc kubenswrapper[4733]: healthz check failed Dec 06 05:46:04 crc kubenswrapper[4733]: I1206 05:46:04.059801 4733 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t668l" podUID="ca0da215-5c31-4c91-939c-77e95ab4a568" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 05:46:04 crc kubenswrapper[4733]: I1206 05:46:04.776486 4733 generic.go:334] "Generic (PLEG): container finished" podID="58095812-8308-43c6-a12c-305a33525986" containerID="4a0dbf4bf63039af067f3ebad8793d67905b3f4c218e7c39e41fc83eb914108c" exitCode=0 Dec 06 05:46:04 crc kubenswrapper[4733]: I1206 05:46:04.776598 4733 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"58095812-8308-43c6-a12c-305a33525986","Type":"ContainerDied","Data":"4a0dbf4bf63039af067f3ebad8793d67905b3f4c218e7c39e41fc83eb914108c"} Dec 06 05:46:05 crc kubenswrapper[4733]: I1206 05:46:05.058586 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-t668l" Dec 06 05:46:05 crc kubenswrapper[4733]: I1206 05:46:05.061992 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-t668l" Dec 06 05:46:05 crc kubenswrapper[4733]: I1206 05:46:05.193606 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-rlgkt" Dec 06 05:46:07 crc kubenswrapper[4733]: I1206 05:46:07.655652 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 06 05:46:07 crc kubenswrapper[4733]: I1206 05:46:07.697163 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/58095812-8308-43c6-a12c-305a33525986-kubelet-dir\") pod \"58095812-8308-43c6-a12c-305a33525986\" (UID: \"58095812-8308-43c6-a12c-305a33525986\") " Dec 06 05:46:07 crc kubenswrapper[4733]: I1206 05:46:07.697220 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/58095812-8308-43c6-a12c-305a33525986-kube-api-access\") pod \"58095812-8308-43c6-a12c-305a33525986\" (UID: \"58095812-8308-43c6-a12c-305a33525986\") " Dec 06 05:46:07 crc kubenswrapper[4733]: I1206 05:46:07.698182 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/58095812-8308-43c6-a12c-305a33525986-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod 
"58095812-8308-43c6-a12c-305a33525986" (UID: "58095812-8308-43c6-a12c-305a33525986"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 05:46:07 crc kubenswrapper[4733]: I1206 05:46:07.702877 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58095812-8308-43c6-a12c-305a33525986-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "58095812-8308-43c6-a12c-305a33525986" (UID: "58095812-8308-43c6-a12c-305a33525986"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:46:07 crc kubenswrapper[4733]: I1206 05:46:07.798507 4733 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/58095812-8308-43c6-a12c-305a33525986-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 06 05:46:07 crc kubenswrapper[4733]: I1206 05:46:07.798532 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/58095812-8308-43c6-a12c-305a33525986-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 06 05:46:07 crc kubenswrapper[4733]: I1206 05:46:07.807754 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"58095812-8308-43c6-a12c-305a33525986","Type":"ContainerDied","Data":"51b3da9459c327bd35dea9e044124377064ce4fda112305a831bbaac5fc5cee0"} Dec 06 05:46:07 crc kubenswrapper[4733]: I1206 05:46:07.807788 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51b3da9459c327bd35dea9e044124377064ce4fda112305a831bbaac5fc5cee0" Dec 06 05:46:07 crc kubenswrapper[4733]: I1206 05:46:07.807825 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 06 05:46:12 crc kubenswrapper[4733]: I1206 05:46:12.670619 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-tbqkj" Dec 06 05:46:12 crc kubenswrapper[4733]: I1206 05:46:12.674475 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-tbqkj" Dec 06 05:46:12 crc kubenswrapper[4733]: I1206 05:46:12.989320 4733 patch_prober.go:28] interesting pod/machine-config-daemon-g7qjx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 05:46:12 crc kubenswrapper[4733]: I1206 05:46:12.989586 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 05:46:16 crc kubenswrapper[4733]: I1206 05:46:16.619827 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7e8909c1-5ab7-4c3f-aba1-436c64849e8a-metrics-certs\") pod \"network-metrics-daemon-8fw28\" (UID: \"7e8909c1-5ab7-4c3f-aba1-436c64849e8a\") " pod="openshift-multus/network-metrics-daemon-8fw28" Dec 06 05:46:16 crc kubenswrapper[4733]: I1206 05:46:16.628649 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7e8909c1-5ab7-4c3f-aba1-436c64849e8a-metrics-certs\") pod \"network-metrics-daemon-8fw28\" (UID: \"7e8909c1-5ab7-4c3f-aba1-436c64849e8a\") " pod="openshift-multus/network-metrics-daemon-8fw28" Dec 06 05:46:16 crc 
kubenswrapper[4733]: I1206 05:46:16.895189 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8fw28" Dec 06 05:46:19 crc kubenswrapper[4733]: I1206 05:46:19.049997 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:46:21 crc kubenswrapper[4733]: I1206 05:46:21.910359 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-77dc2" event={"ID":"9fc6d80b-251a-4d93-91f1-59adbba493f7","Type":"ContainerStarted","Data":"a7d0b05920a1d4d3187590867c23d4600e1b4488b30ea8fcee739df66490de1a"} Dec 06 05:46:22 crc kubenswrapper[4733]: I1206 05:46:22.110151 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8fw28"] Dec 06 05:46:22 crc kubenswrapper[4733]: W1206 05:46:22.124006 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e8909c1_5ab7_4c3f_aba1_436c64849e8a.slice/crio-09b5bde34d89001017bc115acdb92250559639217946fba63a9f307a982d464d WatchSource:0}: Error finding container 09b5bde34d89001017bc115acdb92250559639217946fba63a9f307a982d464d: Status 404 returned error can't find the container with id 09b5bde34d89001017bc115acdb92250559639217946fba63a9f307a982d464d Dec 06 05:46:22 crc kubenswrapper[4733]: I1206 05:46:22.924875 4733 generic.go:334] "Generic (PLEG): container finished" podID="139eeaf9-21f6-4032-9b24-73534a803ca5" containerID="99b96d833959bc8fc53eb6cb831d9ca359e10ca3d0e954298ff3cddabe32af9d" exitCode=0 Dec 06 05:46:22 crc kubenswrapper[4733]: I1206 05:46:22.924981 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tzdr7" event={"ID":"139eeaf9-21f6-4032-9b24-73534a803ca5","Type":"ContainerDied","Data":"99b96d833959bc8fc53eb6cb831d9ca359e10ca3d0e954298ff3cddabe32af9d"} Dec 06 05:46:22 crc 
kubenswrapper[4733]: I1206 05:46:22.929539 4733 generic.go:334] "Generic (PLEG): container finished" podID="938df03e-d0d5-4b93-9a31-061262420f18" containerID="41ce4cd04f112951baf91e6df3377f73f481e6681d4e6b232af29bce708fad52" exitCode=0 Dec 06 05:46:22 crc kubenswrapper[4733]: I1206 05:46:22.929642 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bvjmk" event={"ID":"938df03e-d0d5-4b93-9a31-061262420f18","Type":"ContainerDied","Data":"41ce4cd04f112951baf91e6df3377f73f481e6681d4e6b232af29bce708fad52"} Dec 06 05:46:22 crc kubenswrapper[4733]: I1206 05:46:22.934125 4733 generic.go:334] "Generic (PLEG): container finished" podID="b4bfb477-5389-4827-91d8-cfd61ad2d5f8" containerID="24ef956cf799d8ecb458474e0015925506bdab2f4ca6b7088f4a851418c3bd40" exitCode=0 Dec 06 05:46:22 crc kubenswrapper[4733]: I1206 05:46:22.934236 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lblzc" event={"ID":"b4bfb477-5389-4827-91d8-cfd61ad2d5f8","Type":"ContainerDied","Data":"24ef956cf799d8ecb458474e0015925506bdab2f4ca6b7088f4a851418c3bd40"} Dec 06 05:46:22 crc kubenswrapper[4733]: I1206 05:46:22.939781 4733 generic.go:334] "Generic (PLEG): container finished" podID="71e51a04-4769-45b0-87b8-7292977ec73b" containerID="089490b3f20e36751dcc011def022641839f2dffca2ab773267a8c05afbf5344" exitCode=0 Dec 06 05:46:22 crc kubenswrapper[4733]: I1206 05:46:22.939866 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jfh8w" event={"ID":"71e51a04-4769-45b0-87b8-7292977ec73b","Type":"ContainerDied","Data":"089490b3f20e36751dcc011def022641839f2dffca2ab773267a8c05afbf5344"} Dec 06 05:46:22 crc kubenswrapper[4733]: I1206 05:46:22.942653 4733 generic.go:334] "Generic (PLEG): container finished" podID="755fa11c-904b-49e4-928c-da2935842c80" containerID="1c81a682724848312ec18654cb7fe1a4fecd3a745f90ac2bdd4c6f327497cb2f" exitCode=0 Dec 06 05:46:22 crc 
kubenswrapper[4733]: I1206 05:46:22.942747 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kqg5z" event={"ID":"755fa11c-904b-49e4-928c-da2935842c80","Type":"ContainerDied","Data":"1c81a682724848312ec18654cb7fe1a4fecd3a745f90ac2bdd4c6f327497cb2f"} Dec 06 05:46:22 crc kubenswrapper[4733]: I1206 05:46:22.953488 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8fw28" event={"ID":"7e8909c1-5ab7-4c3f-aba1-436c64849e8a","Type":"ContainerStarted","Data":"9521a832095fb1e67241b025fe5cbd96d12010ce46aeee6bee0617f0f3f852bf"} Dec 06 05:46:22 crc kubenswrapper[4733]: I1206 05:46:22.953560 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8fw28" event={"ID":"7e8909c1-5ab7-4c3f-aba1-436c64849e8a","Type":"ContainerStarted","Data":"5c11513ffb1291dd1b8e98a5da01a8f89faaa2f56a98d366baf7422a570a1d48"} Dec 06 05:46:22 crc kubenswrapper[4733]: I1206 05:46:22.953570 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8fw28" event={"ID":"7e8909c1-5ab7-4c3f-aba1-436c64849e8a","Type":"ContainerStarted","Data":"09b5bde34d89001017bc115acdb92250559639217946fba63a9f307a982d464d"} Dec 06 05:46:22 crc kubenswrapper[4733]: I1206 05:46:22.955770 4733 generic.go:334] "Generic (PLEG): container finished" podID="9fc6d80b-251a-4d93-91f1-59adbba493f7" containerID="a7d0b05920a1d4d3187590867c23d4600e1b4488b30ea8fcee739df66490de1a" exitCode=0 Dec 06 05:46:22 crc kubenswrapper[4733]: I1206 05:46:22.955814 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-77dc2" event={"ID":"9fc6d80b-251a-4d93-91f1-59adbba493f7","Type":"ContainerDied","Data":"a7d0b05920a1d4d3187590867c23d4600e1b4488b30ea8fcee739df66490de1a"} Dec 06 05:46:22 crc kubenswrapper[4733]: I1206 05:46:22.958581 4733 generic.go:334] "Generic (PLEG): container finished" 
podID="0c3721ae-c7d6-49d0-8488-e84f96e08faa" containerID="50f2b029fad3b1023c0ff2283c0cc4d40708c94f99715ec6390d9604caf5f862" exitCode=0 Dec 06 05:46:22 crc kubenswrapper[4733]: I1206 05:46:22.958659 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5vqm6" event={"ID":"0c3721ae-c7d6-49d0-8488-e84f96e08faa","Type":"ContainerDied","Data":"50f2b029fad3b1023c0ff2283c0cc4d40708c94f99715ec6390d9604caf5f862"} Dec 06 05:46:22 crc kubenswrapper[4733]: I1206 05:46:22.960977 4733 generic.go:334] "Generic (PLEG): container finished" podID="e7da475c-bd42-421a-9aef-4eed3aacbe9a" containerID="12da77c295a6061d401ac8bd5fc0c45e6de15e2fb7a08981cc6e8485ff62fd0b" exitCode=0 Dec 06 05:46:22 crc kubenswrapper[4733]: I1206 05:46:22.961007 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qk6t8" event={"ID":"e7da475c-bd42-421a-9aef-4eed3aacbe9a","Type":"ContainerDied","Data":"12da77c295a6061d401ac8bd5fc0c45e6de15e2fb7a08981cc6e8485ff62fd0b"} Dec 06 05:46:23 crc kubenswrapper[4733]: I1206 05:46:23.061441 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-8fw28" podStartSLOduration=149.061420185 podStartE2EDuration="2m29.061420185s" podCreationTimestamp="2025-12-06 05:43:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:46:23.060284441 +0000 UTC m=+166.925495552" watchObservedRunningTime="2025-12-06 05:46:23.061420185 +0000 UTC m=+166.926631296" Dec 06 05:46:23 crc kubenswrapper[4733]: I1206 05:46:23.971473 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bvjmk" event={"ID":"938df03e-d0d5-4b93-9a31-061262420f18","Type":"ContainerStarted","Data":"0be243c1fa38fcb2b039712c90d832a0632685afac32c9281d832d627166860f"} Dec 06 05:46:23 crc kubenswrapper[4733]: I1206 05:46:23.975829 
4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tzdr7" event={"ID":"139eeaf9-21f6-4032-9b24-73534a803ca5","Type":"ContainerStarted","Data":"b1b08ade88ae154f120a0004457affe6f9f7c2afd960019a49a9127c647ba8ba"} Dec 06 05:46:23 crc kubenswrapper[4733]: I1206 05:46:23.978745 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-77dc2" event={"ID":"9fc6d80b-251a-4d93-91f1-59adbba493f7","Type":"ContainerStarted","Data":"23b3e0ba2bde54a244d18f8fb82ae54e5cdb1f89c039d36089d348e811581e5b"} Dec 06 05:46:23 crc kubenswrapper[4733]: I1206 05:46:23.982128 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lblzc" event={"ID":"b4bfb477-5389-4827-91d8-cfd61ad2d5f8","Type":"ContainerStarted","Data":"072a74830979f13631869cb072cf4c3477922a1d292421e95083e90aa5ce3552"} Dec 06 05:46:23 crc kubenswrapper[4733]: I1206 05:46:23.984720 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jfh8w" event={"ID":"71e51a04-4769-45b0-87b8-7292977ec73b","Type":"ContainerStarted","Data":"7a89af00601f2e8eaa39ceb69fc5035df1f45a4bcd8a7c93323a6bb802deeb9e"} Dec 06 05:46:23 crc kubenswrapper[4733]: I1206 05:46:23.987420 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5vqm6" event={"ID":"0c3721ae-c7d6-49d0-8488-e84f96e08faa","Type":"ContainerStarted","Data":"7142d972103bf50a94b303191950b06a611d9c8f93c325a052279ad5a981d6a5"} Dec 06 05:46:23 crc kubenswrapper[4733]: I1206 05:46:23.989920 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qk6t8" event={"ID":"e7da475c-bd42-421a-9aef-4eed3aacbe9a","Type":"ContainerStarted","Data":"deebff28db04bff95025b6c0ac3c83ab8559aaa646034289ca51633990e9cb44"} Dec 06 05:46:23 crc kubenswrapper[4733]: I1206 05:46:23.994539 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-kqg5z" event={"ID":"755fa11c-904b-49e4-928c-da2935842c80","Type":"ContainerStarted","Data":"f83592503e7bc13bbcfdbcda1774fb88e5e26637f97876686cd49a5064ee9f90"} Dec 06 05:46:24 crc kubenswrapper[4733]: I1206 05:46:24.001246 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bvjmk" podStartSLOduration=3.316344861 podStartE2EDuration="23.001230612s" podCreationTimestamp="2025-12-06 05:46:01 +0000 UTC" firstStartedPulling="2025-12-06 05:46:03.71055568 +0000 UTC m=+147.575766791" lastFinishedPulling="2025-12-06 05:46:23.395441431 +0000 UTC m=+167.260652542" observedRunningTime="2025-12-06 05:46:23.998955977 +0000 UTC m=+167.864167078" watchObservedRunningTime="2025-12-06 05:46:24.001230612 +0000 UTC m=+167.866441723" Dec 06 05:46:24 crc kubenswrapper[4733]: I1206 05:46:24.015797 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jfh8w" podStartSLOduration=2.107860435 podStartE2EDuration="26.01578618s" podCreationTimestamp="2025-12-06 05:45:58 +0000 UTC" firstStartedPulling="2025-12-06 05:45:59.540983221 +0000 UTC m=+143.406194333" lastFinishedPulling="2025-12-06 05:46:23.448908976 +0000 UTC m=+167.314120078" observedRunningTime="2025-12-06 05:46:24.012875659 +0000 UTC m=+167.878086771" watchObservedRunningTime="2025-12-06 05:46:24.01578618 +0000 UTC m=+167.880997291" Dec 06 05:46:24 crc kubenswrapper[4733]: I1206 05:46:24.026914 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lblzc" podStartSLOduration=3.101010597 podStartE2EDuration="26.026891063s" podCreationTimestamp="2025-12-06 05:45:58 +0000 UTC" firstStartedPulling="2025-12-06 05:46:00.567734839 +0000 UTC m=+144.432945951" lastFinishedPulling="2025-12-06 05:46:23.493615305 +0000 UTC m=+167.358826417" observedRunningTime="2025-12-06 05:46:24.026021298 +0000 UTC 
m=+167.891232409" watchObservedRunningTime="2025-12-06 05:46:24.026891063 +0000 UTC m=+167.892102163" Dec 06 05:46:24 crc kubenswrapper[4733]: I1206 05:46:24.043676 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-77dc2" podStartSLOduration=2.322704424 podStartE2EDuration="22.043664579s" podCreationTimestamp="2025-12-06 05:46:02 +0000 UTC" firstStartedPulling="2025-12-06 05:46:03.714420545 +0000 UTC m=+147.579631655" lastFinishedPulling="2025-12-06 05:46:23.435380699 +0000 UTC m=+167.300591810" observedRunningTime="2025-12-06 05:46:24.042930549 +0000 UTC m=+167.908141661" watchObservedRunningTime="2025-12-06 05:46:24.043664579 +0000 UTC m=+167.908875690" Dec 06 05:46:24 crc kubenswrapper[4733]: I1206 05:46:24.056437 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qk6t8" podStartSLOduration=2.180313752 podStartE2EDuration="25.056429131s" podCreationTimestamp="2025-12-06 05:45:59 +0000 UTC" firstStartedPulling="2025-12-06 05:46:00.551515865 +0000 UTC m=+144.416726976" lastFinishedPulling="2025-12-06 05:46:23.427631244 +0000 UTC m=+167.292842355" observedRunningTime="2025-12-06 05:46:24.055469908 +0000 UTC m=+167.920681019" watchObservedRunningTime="2025-12-06 05:46:24.056429131 +0000 UTC m=+167.921640242" Dec 06 05:46:24 crc kubenswrapper[4733]: I1206 05:46:24.078114 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5vqm6" podStartSLOduration=2.200404523 podStartE2EDuration="25.078107006s" podCreationTimestamp="2025-12-06 05:45:59 +0000 UTC" firstStartedPulling="2025-12-06 05:46:00.568633108 +0000 UTC m=+144.433844218" lastFinishedPulling="2025-12-06 05:46:23.44633559 +0000 UTC m=+167.311546701" observedRunningTime="2025-12-06 05:46:24.074062023 +0000 UTC m=+167.939273135" watchObservedRunningTime="2025-12-06 05:46:24.078107006 +0000 UTC m=+167.943318117" Dec 
06 05:46:24 crc kubenswrapper[4733]: I1206 05:46:24.095433 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kqg5z" podStartSLOduration=2.3850816200000002 podStartE2EDuration="23.095411492s" podCreationTimestamp="2025-12-06 05:46:01 +0000 UTC" firstStartedPulling="2025-12-06 05:46:02.701585593 +0000 UTC m=+146.566796704" lastFinishedPulling="2025-12-06 05:46:23.411915465 +0000 UTC m=+167.277126576" observedRunningTime="2025-12-06 05:46:24.092422583 +0000 UTC m=+167.957633694" watchObservedRunningTime="2025-12-06 05:46:24.095411492 +0000 UTC m=+167.960622602" Dec 06 05:46:24 crc kubenswrapper[4733]: I1206 05:46:24.110439 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tzdr7" podStartSLOduration=3.302690148 podStartE2EDuration="24.11042007s" podCreationTimestamp="2025-12-06 05:46:00 +0000 UTC" firstStartedPulling="2025-12-06 05:46:02.701858407 +0000 UTC m=+146.567069517" lastFinishedPulling="2025-12-06 05:46:23.509588328 +0000 UTC m=+167.374799439" observedRunningTime="2025-12-06 05:46:24.109108045 +0000 UTC m=+167.974319156" watchObservedRunningTime="2025-12-06 05:46:24.11042007 +0000 UTC m=+167.975631182" Dec 06 05:46:29 crc kubenswrapper[4733]: I1206 05:46:29.067598 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jfh8w" Dec 06 05:46:29 crc kubenswrapper[4733]: I1206 05:46:29.068205 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jfh8w" Dec 06 05:46:29 crc kubenswrapper[4733]: I1206 05:46:29.146117 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jfh8w" Dec 06 05:46:29 crc kubenswrapper[4733]: I1206 05:46:29.290871 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-lblzc" Dec 06 05:46:29 crc kubenswrapper[4733]: I1206 05:46:29.291188 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lblzc" Dec 06 05:46:29 crc kubenswrapper[4733]: I1206 05:46:29.327699 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lblzc" Dec 06 05:46:29 crc kubenswrapper[4733]: I1206 05:46:29.483419 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5vqm6" Dec 06 05:46:29 crc kubenswrapper[4733]: I1206 05:46:29.483463 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5vqm6" Dec 06 05:46:29 crc kubenswrapper[4733]: I1206 05:46:29.516044 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5vqm6" Dec 06 05:46:29 crc kubenswrapper[4733]: I1206 05:46:29.676896 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qk6t8" Dec 06 05:46:29 crc kubenswrapper[4733]: I1206 05:46:29.677662 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qk6t8" Dec 06 05:46:29 crc kubenswrapper[4733]: I1206 05:46:29.716864 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qk6t8" Dec 06 05:46:30 crc kubenswrapper[4733]: I1206 05:46:30.061412 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jfh8w" Dec 06 05:46:30 crc kubenswrapper[4733]: I1206 05:46:30.062422 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lblzc" Dec 06 05:46:30 crc 
kubenswrapper[4733]: I1206 05:46:30.063116 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5vqm6" Dec 06 05:46:30 crc kubenswrapper[4733]: I1206 05:46:30.063294 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qk6t8" Dec 06 05:46:30 crc kubenswrapper[4733]: I1206 05:46:30.946244 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5vqm6"] Dec 06 05:46:31 crc kubenswrapper[4733]: I1206 05:46:31.276614 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tzdr7" Dec 06 05:46:31 crc kubenswrapper[4733]: I1206 05:46:31.276688 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tzdr7" Dec 06 05:46:31 crc kubenswrapper[4733]: I1206 05:46:31.315241 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tzdr7" Dec 06 05:46:31 crc kubenswrapper[4733]: I1206 05:46:31.542184 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qk6t8"] Dec 06 05:46:31 crc kubenswrapper[4733]: I1206 05:46:31.675868 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kqg5z" Dec 06 05:46:31 crc kubenswrapper[4733]: I1206 05:46:31.675921 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kqg5z" Dec 06 05:46:31 crc kubenswrapper[4733]: I1206 05:46:31.717387 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kqg5z" Dec 06 05:46:32 crc kubenswrapper[4733]: I1206 05:46:32.053358 4733 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/certified-operators-qk6t8" podUID="e7da475c-bd42-421a-9aef-4eed3aacbe9a" containerName="registry-server" containerID="cri-o://deebff28db04bff95025b6c0ac3c83ab8559aaa646034289ca51633990e9cb44" gracePeriod=2 Dec 06 05:46:32 crc kubenswrapper[4733]: I1206 05:46:32.053499 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5vqm6" podUID="0c3721ae-c7d6-49d0-8488-e84f96e08faa" containerName="registry-server" containerID="cri-o://7142d972103bf50a94b303191950b06a611d9c8f93c325a052279ad5a981d6a5" gracePeriod=2 Dec 06 05:46:32 crc kubenswrapper[4733]: I1206 05:46:32.089175 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kqg5z" Dec 06 05:46:32 crc kubenswrapper[4733]: I1206 05:46:32.091502 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tzdr7" Dec 06 05:46:32 crc kubenswrapper[4733]: I1206 05:46:32.259100 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bvjmk" Dec 06 05:46:32 crc kubenswrapper[4733]: I1206 05:46:32.259143 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bvjmk" Dec 06 05:46:32 crc kubenswrapper[4733]: I1206 05:46:32.334890 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bvjmk" Dec 06 05:46:32 crc kubenswrapper[4733]: I1206 05:46:32.456406 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qk6t8" Dec 06 05:46:32 crc kubenswrapper[4733]: I1206 05:46:32.461545 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5vqm6" Dec 06 05:46:32 crc kubenswrapper[4733]: I1206 05:46:32.640213 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7da475c-bd42-421a-9aef-4eed3aacbe9a-utilities\") pod \"e7da475c-bd42-421a-9aef-4eed3aacbe9a\" (UID: \"e7da475c-bd42-421a-9aef-4eed3aacbe9a\") " Dec 06 05:46:32 crc kubenswrapper[4733]: I1206 05:46:32.640263 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c3721ae-c7d6-49d0-8488-e84f96e08faa-utilities\") pod \"0c3721ae-c7d6-49d0-8488-e84f96e08faa\" (UID: \"0c3721ae-c7d6-49d0-8488-e84f96e08faa\") " Dec 06 05:46:32 crc kubenswrapper[4733]: I1206 05:46:32.640350 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c3721ae-c7d6-49d0-8488-e84f96e08faa-catalog-content\") pod \"0c3721ae-c7d6-49d0-8488-e84f96e08faa\" (UID: \"0c3721ae-c7d6-49d0-8488-e84f96e08faa\") " Dec 06 05:46:32 crc kubenswrapper[4733]: I1206 05:46:32.640476 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjpsg\" (UniqueName: \"kubernetes.io/projected/e7da475c-bd42-421a-9aef-4eed3aacbe9a-kube-api-access-vjpsg\") pod \"e7da475c-bd42-421a-9aef-4eed3aacbe9a\" (UID: \"e7da475c-bd42-421a-9aef-4eed3aacbe9a\") " Dec 06 05:46:32 crc kubenswrapper[4733]: I1206 05:46:32.640512 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvh9j\" (UniqueName: \"kubernetes.io/projected/0c3721ae-c7d6-49d0-8488-e84f96e08faa-kube-api-access-lvh9j\") pod \"0c3721ae-c7d6-49d0-8488-e84f96e08faa\" (UID: \"0c3721ae-c7d6-49d0-8488-e84f96e08faa\") " Dec 06 05:46:32 crc kubenswrapper[4733]: I1206 05:46:32.640546 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7da475c-bd42-421a-9aef-4eed3aacbe9a-catalog-content\") pod \"e7da475c-bd42-421a-9aef-4eed3aacbe9a\" (UID: \"e7da475c-bd42-421a-9aef-4eed3aacbe9a\") " Dec 06 05:46:32 crc kubenswrapper[4733]: I1206 05:46:32.640792 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7da475c-bd42-421a-9aef-4eed3aacbe9a-utilities" (OuterVolumeSpecName: "utilities") pod "e7da475c-bd42-421a-9aef-4eed3aacbe9a" (UID: "e7da475c-bd42-421a-9aef-4eed3aacbe9a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 05:46:32 crc kubenswrapper[4733]: I1206 05:46:32.640879 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c3721ae-c7d6-49d0-8488-e84f96e08faa-utilities" (OuterVolumeSpecName: "utilities") pod "0c3721ae-c7d6-49d0-8488-e84f96e08faa" (UID: "0c3721ae-c7d6-49d0-8488-e84f96e08faa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 05:46:32 crc kubenswrapper[4733]: I1206 05:46:32.655985 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c3721ae-c7d6-49d0-8488-e84f96e08faa-kube-api-access-lvh9j" (OuterVolumeSpecName: "kube-api-access-lvh9j") pod "0c3721ae-c7d6-49d0-8488-e84f96e08faa" (UID: "0c3721ae-c7d6-49d0-8488-e84f96e08faa"). InnerVolumeSpecName "kube-api-access-lvh9j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:46:32 crc kubenswrapper[4733]: I1206 05:46:32.656020 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7da475c-bd42-421a-9aef-4eed3aacbe9a-kube-api-access-vjpsg" (OuterVolumeSpecName: "kube-api-access-vjpsg") pod "e7da475c-bd42-421a-9aef-4eed3aacbe9a" (UID: "e7da475c-bd42-421a-9aef-4eed3aacbe9a"). InnerVolumeSpecName "kube-api-access-vjpsg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:46:32 crc kubenswrapper[4733]: I1206 05:46:32.660429 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-77dc2" Dec 06 05:46:32 crc kubenswrapper[4733]: I1206 05:46:32.660478 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-77dc2" Dec 06 05:46:32 crc kubenswrapper[4733]: I1206 05:46:32.685326 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c3721ae-c7d6-49d0-8488-e84f96e08faa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0c3721ae-c7d6-49d0-8488-e84f96e08faa" (UID: "0c3721ae-c7d6-49d0-8488-e84f96e08faa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 05:46:32 crc kubenswrapper[4733]: I1206 05:46:32.685507 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7da475c-bd42-421a-9aef-4eed3aacbe9a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e7da475c-bd42-421a-9aef-4eed3aacbe9a" (UID: "e7da475c-bd42-421a-9aef-4eed3aacbe9a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 05:46:32 crc kubenswrapper[4733]: I1206 05:46:32.694675 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-77dc2" Dec 06 05:46:32 crc kubenswrapper[4733]: I1206 05:46:32.743082 4733 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7da475c-bd42-421a-9aef-4eed3aacbe9a-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 05:46:32 crc kubenswrapper[4733]: I1206 05:46:32.743115 4733 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c3721ae-c7d6-49d0-8488-e84f96e08faa-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 05:46:32 crc kubenswrapper[4733]: I1206 05:46:32.743126 4733 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c3721ae-c7d6-49d0-8488-e84f96e08faa-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 05:46:32 crc kubenswrapper[4733]: I1206 05:46:32.743137 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjpsg\" (UniqueName: \"kubernetes.io/projected/e7da475c-bd42-421a-9aef-4eed3aacbe9a-kube-api-access-vjpsg\") on node \"crc\" DevicePath \"\"" Dec 06 05:46:32 crc kubenswrapper[4733]: I1206 05:46:32.743154 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvh9j\" (UniqueName: \"kubernetes.io/projected/0c3721ae-c7d6-49d0-8488-e84f96e08faa-kube-api-access-lvh9j\") on node \"crc\" DevicePath \"\"" Dec 06 05:46:32 crc kubenswrapper[4733]: I1206 05:46:32.743164 4733 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7da475c-bd42-421a-9aef-4eed3aacbe9a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 05:46:33 crc kubenswrapper[4733]: I1206 05:46:33.062848 4733 generic.go:334] "Generic (PLEG): container finished" 
podID="e7da475c-bd42-421a-9aef-4eed3aacbe9a" containerID="deebff28db04bff95025b6c0ac3c83ab8559aaa646034289ca51633990e9cb44" exitCode=0 Dec 06 05:46:33 crc kubenswrapper[4733]: I1206 05:46:33.062915 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qk6t8" Dec 06 05:46:33 crc kubenswrapper[4733]: I1206 05:46:33.062936 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qk6t8" event={"ID":"e7da475c-bd42-421a-9aef-4eed3aacbe9a","Type":"ContainerDied","Data":"deebff28db04bff95025b6c0ac3c83ab8559aaa646034289ca51633990e9cb44"} Dec 06 05:46:33 crc kubenswrapper[4733]: I1206 05:46:33.062988 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qk6t8" event={"ID":"e7da475c-bd42-421a-9aef-4eed3aacbe9a","Type":"ContainerDied","Data":"5ae99f3e548d1f60aa4f13443eb8f4ffd8b4170427bdb77774c2363952405a3a"} Dec 06 05:46:33 crc kubenswrapper[4733]: I1206 05:46:33.063009 4733 scope.go:117] "RemoveContainer" containerID="deebff28db04bff95025b6c0ac3c83ab8559aaa646034289ca51633990e9cb44" Dec 06 05:46:33 crc kubenswrapper[4733]: I1206 05:46:33.065918 4733 generic.go:334] "Generic (PLEG): container finished" podID="0c3721ae-c7d6-49d0-8488-e84f96e08faa" containerID="7142d972103bf50a94b303191950b06a611d9c8f93c325a052279ad5a981d6a5" exitCode=0 Dec 06 05:46:33 crc kubenswrapper[4733]: I1206 05:46:33.066461 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5vqm6" event={"ID":"0c3721ae-c7d6-49d0-8488-e84f96e08faa","Type":"ContainerDied","Data":"7142d972103bf50a94b303191950b06a611d9c8f93c325a052279ad5a981d6a5"} Dec 06 05:46:33 crc kubenswrapper[4733]: I1206 05:46:33.066491 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5vqm6" 
event={"ID":"0c3721ae-c7d6-49d0-8488-e84f96e08faa","Type":"ContainerDied","Data":"54de35c390f5f924dd237d13a53b91e20861deec429359bd4554d6ff56373268"} Dec 06 05:46:33 crc kubenswrapper[4733]: I1206 05:46:33.066561 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5vqm6" Dec 06 05:46:33 crc kubenswrapper[4733]: I1206 05:46:33.085136 4733 scope.go:117] "RemoveContainer" containerID="12da77c295a6061d401ac8bd5fc0c45e6de15e2fb7a08981cc6e8485ff62fd0b" Dec 06 05:46:33 crc kubenswrapper[4733]: I1206 05:46:33.097460 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qk6t8"] Dec 06 05:46:33 crc kubenswrapper[4733]: I1206 05:46:33.100844 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qk6t8"] Dec 06 05:46:33 crc kubenswrapper[4733]: I1206 05:46:33.105297 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bvjmk" Dec 06 05:46:33 crc kubenswrapper[4733]: I1206 05:46:33.109743 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5vqm6"] Dec 06 05:46:33 crc kubenswrapper[4733]: I1206 05:46:33.118604 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5vqm6"] Dec 06 05:46:33 crc kubenswrapper[4733]: I1206 05:46:33.122524 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-77dc2" Dec 06 05:46:33 crc kubenswrapper[4733]: I1206 05:46:33.123639 4733 scope.go:117] "RemoveContainer" containerID="c47593b63b29c97fa85c5e024a82f9ce35bff9b240eb731ab3c1d7f06d266345" Dec 06 05:46:33 crc kubenswrapper[4733]: I1206 05:46:33.146679 4733 scope.go:117] "RemoveContainer" containerID="deebff28db04bff95025b6c0ac3c83ab8559aaa646034289ca51633990e9cb44" Dec 06 05:46:33 crc kubenswrapper[4733]: E1206 
05:46:33.149344 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"deebff28db04bff95025b6c0ac3c83ab8559aaa646034289ca51633990e9cb44\": container with ID starting with deebff28db04bff95025b6c0ac3c83ab8559aaa646034289ca51633990e9cb44 not found: ID does not exist" containerID="deebff28db04bff95025b6c0ac3c83ab8559aaa646034289ca51633990e9cb44" Dec 06 05:46:33 crc kubenswrapper[4733]: I1206 05:46:33.149396 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"deebff28db04bff95025b6c0ac3c83ab8559aaa646034289ca51633990e9cb44"} err="failed to get container status \"deebff28db04bff95025b6c0ac3c83ab8559aaa646034289ca51633990e9cb44\": rpc error: code = NotFound desc = could not find container \"deebff28db04bff95025b6c0ac3c83ab8559aaa646034289ca51633990e9cb44\": container with ID starting with deebff28db04bff95025b6c0ac3c83ab8559aaa646034289ca51633990e9cb44 not found: ID does not exist" Dec 06 05:46:33 crc kubenswrapper[4733]: I1206 05:46:33.149479 4733 scope.go:117] "RemoveContainer" containerID="12da77c295a6061d401ac8bd5fc0c45e6de15e2fb7a08981cc6e8485ff62fd0b" Dec 06 05:46:33 crc kubenswrapper[4733]: E1206 05:46:33.150227 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12da77c295a6061d401ac8bd5fc0c45e6de15e2fb7a08981cc6e8485ff62fd0b\": container with ID starting with 12da77c295a6061d401ac8bd5fc0c45e6de15e2fb7a08981cc6e8485ff62fd0b not found: ID does not exist" containerID="12da77c295a6061d401ac8bd5fc0c45e6de15e2fb7a08981cc6e8485ff62fd0b" Dec 06 05:46:33 crc kubenswrapper[4733]: I1206 05:46:33.150281 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12da77c295a6061d401ac8bd5fc0c45e6de15e2fb7a08981cc6e8485ff62fd0b"} err="failed to get container status \"12da77c295a6061d401ac8bd5fc0c45e6de15e2fb7a08981cc6e8485ff62fd0b\": rpc 
error: code = NotFound desc = could not find container \"12da77c295a6061d401ac8bd5fc0c45e6de15e2fb7a08981cc6e8485ff62fd0b\": container with ID starting with 12da77c295a6061d401ac8bd5fc0c45e6de15e2fb7a08981cc6e8485ff62fd0b not found: ID does not exist" Dec 06 05:46:33 crc kubenswrapper[4733]: I1206 05:46:33.150340 4733 scope.go:117] "RemoveContainer" containerID="c47593b63b29c97fa85c5e024a82f9ce35bff9b240eb731ab3c1d7f06d266345" Dec 06 05:46:33 crc kubenswrapper[4733]: E1206 05:46:33.150658 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c47593b63b29c97fa85c5e024a82f9ce35bff9b240eb731ab3c1d7f06d266345\": container with ID starting with c47593b63b29c97fa85c5e024a82f9ce35bff9b240eb731ab3c1d7f06d266345 not found: ID does not exist" containerID="c47593b63b29c97fa85c5e024a82f9ce35bff9b240eb731ab3c1d7f06d266345" Dec 06 05:46:33 crc kubenswrapper[4733]: I1206 05:46:33.150699 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c47593b63b29c97fa85c5e024a82f9ce35bff9b240eb731ab3c1d7f06d266345"} err="failed to get container status \"c47593b63b29c97fa85c5e024a82f9ce35bff9b240eb731ab3c1d7f06d266345\": rpc error: code = NotFound desc = could not find container \"c47593b63b29c97fa85c5e024a82f9ce35bff9b240eb731ab3c1d7f06d266345\": container with ID starting with c47593b63b29c97fa85c5e024a82f9ce35bff9b240eb731ab3c1d7f06d266345 not found: ID does not exist" Dec 06 05:46:33 crc kubenswrapper[4733]: I1206 05:46:33.150725 4733 scope.go:117] "RemoveContainer" containerID="7142d972103bf50a94b303191950b06a611d9c8f93c325a052279ad5a981d6a5" Dec 06 05:46:33 crc kubenswrapper[4733]: I1206 05:46:33.162315 4733 scope.go:117] "RemoveContainer" containerID="50f2b029fad3b1023c0ff2283c0cc4d40708c94f99715ec6390d9604caf5f862" Dec 06 05:46:33 crc kubenswrapper[4733]: I1206 05:46:33.182497 4733 scope.go:117] "RemoveContainer" 
containerID="408c7c6ce499a3022e481b09a8e9d2f3c5e650b5286454160a5dcecf6957f7bd" Dec 06 05:46:33 crc kubenswrapper[4733]: I1206 05:46:33.189953 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-52jkz" Dec 06 05:46:33 crc kubenswrapper[4733]: I1206 05:46:33.196511 4733 scope.go:117] "RemoveContainer" containerID="7142d972103bf50a94b303191950b06a611d9c8f93c325a052279ad5a981d6a5" Dec 06 05:46:33 crc kubenswrapper[4733]: E1206 05:46:33.197052 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7142d972103bf50a94b303191950b06a611d9c8f93c325a052279ad5a981d6a5\": container with ID starting with 7142d972103bf50a94b303191950b06a611d9c8f93c325a052279ad5a981d6a5 not found: ID does not exist" containerID="7142d972103bf50a94b303191950b06a611d9c8f93c325a052279ad5a981d6a5" Dec 06 05:46:33 crc kubenswrapper[4733]: I1206 05:46:33.197082 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7142d972103bf50a94b303191950b06a611d9c8f93c325a052279ad5a981d6a5"} err="failed to get container status \"7142d972103bf50a94b303191950b06a611d9c8f93c325a052279ad5a981d6a5\": rpc error: code = NotFound desc = could not find container \"7142d972103bf50a94b303191950b06a611d9c8f93c325a052279ad5a981d6a5\": container with ID starting with 7142d972103bf50a94b303191950b06a611d9c8f93c325a052279ad5a981d6a5 not found: ID does not exist" Dec 06 05:46:33 crc kubenswrapper[4733]: I1206 05:46:33.197102 4733 scope.go:117] "RemoveContainer" containerID="50f2b029fad3b1023c0ff2283c0cc4d40708c94f99715ec6390d9604caf5f862" Dec 06 05:46:33 crc kubenswrapper[4733]: E1206 05:46:33.197518 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50f2b029fad3b1023c0ff2283c0cc4d40708c94f99715ec6390d9604caf5f862\": container with ID 
starting with 50f2b029fad3b1023c0ff2283c0cc4d40708c94f99715ec6390d9604caf5f862 not found: ID does not exist" containerID="50f2b029fad3b1023c0ff2283c0cc4d40708c94f99715ec6390d9604caf5f862" Dec 06 05:46:33 crc kubenswrapper[4733]: I1206 05:46:33.197549 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50f2b029fad3b1023c0ff2283c0cc4d40708c94f99715ec6390d9604caf5f862"} err="failed to get container status \"50f2b029fad3b1023c0ff2283c0cc4d40708c94f99715ec6390d9604caf5f862\": rpc error: code = NotFound desc = could not find container \"50f2b029fad3b1023c0ff2283c0cc4d40708c94f99715ec6390d9604caf5f862\": container with ID starting with 50f2b029fad3b1023c0ff2283c0cc4d40708c94f99715ec6390d9604caf5f862 not found: ID does not exist" Dec 06 05:46:33 crc kubenswrapper[4733]: I1206 05:46:33.197577 4733 scope.go:117] "RemoveContainer" containerID="408c7c6ce499a3022e481b09a8e9d2f3c5e650b5286454160a5dcecf6957f7bd" Dec 06 05:46:33 crc kubenswrapper[4733]: E1206 05:46:33.198164 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"408c7c6ce499a3022e481b09a8e9d2f3c5e650b5286454160a5dcecf6957f7bd\": container with ID starting with 408c7c6ce499a3022e481b09a8e9d2f3c5e650b5286454160a5dcecf6957f7bd not found: ID does not exist" containerID="408c7c6ce499a3022e481b09a8e9d2f3c5e650b5286454160a5dcecf6957f7bd" Dec 06 05:46:33 crc kubenswrapper[4733]: I1206 05:46:33.198196 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"408c7c6ce499a3022e481b09a8e9d2f3c5e650b5286454160a5dcecf6957f7bd"} err="failed to get container status \"408c7c6ce499a3022e481b09a8e9d2f3c5e650b5286454160a5dcecf6957f7bd\": rpc error: code = NotFound desc = could not find container \"408c7c6ce499a3022e481b09a8e9d2f3c5e650b5286454160a5dcecf6957f7bd\": container with ID starting with 408c7c6ce499a3022e481b09a8e9d2f3c5e650b5286454160a5dcecf6957f7bd not found: 
ID does not exist" Dec 06 05:46:33 crc kubenswrapper[4733]: I1206 05:46:33.945732 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kqg5z"] Dec 06 05:46:34 crc kubenswrapper[4733]: I1206 05:46:34.081329 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kqg5z" podUID="755fa11c-904b-49e4-928c-da2935842c80" containerName="registry-server" containerID="cri-o://f83592503e7bc13bbcfdbcda1774fb88e5e26637f97876686cd49a5064ee9f90" gracePeriod=2 Dec 06 05:46:34 crc kubenswrapper[4733]: I1206 05:46:34.384591 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kqg5z" Dec 06 05:46:34 crc kubenswrapper[4733]: I1206 05:46:34.492553 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c3721ae-c7d6-49d0-8488-e84f96e08faa" path="/var/lib/kubelet/pods/0c3721ae-c7d6-49d0-8488-e84f96e08faa/volumes" Dec 06 05:46:34 crc kubenswrapper[4733]: I1206 05:46:34.493380 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7da475c-bd42-421a-9aef-4eed3aacbe9a" path="/var/lib/kubelet/pods/e7da475c-bd42-421a-9aef-4eed3aacbe9a/volumes" Dec 06 05:46:34 crc kubenswrapper[4733]: I1206 05:46:34.565147 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/755fa11c-904b-49e4-928c-da2935842c80-utilities\") pod \"755fa11c-904b-49e4-928c-da2935842c80\" (UID: \"755fa11c-904b-49e4-928c-da2935842c80\") " Dec 06 05:46:34 crc kubenswrapper[4733]: I1206 05:46:34.565225 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/755fa11c-904b-49e4-928c-da2935842c80-catalog-content\") pod \"755fa11c-904b-49e4-928c-da2935842c80\" (UID: \"755fa11c-904b-49e4-928c-da2935842c80\") " Dec 06 05:46:34 crc kubenswrapper[4733]: 
I1206 05:46:34.566236 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59nr9\" (UniqueName: \"kubernetes.io/projected/755fa11c-904b-49e4-928c-da2935842c80-kube-api-access-59nr9\") pod \"755fa11c-904b-49e4-928c-da2935842c80\" (UID: \"755fa11c-904b-49e4-928c-da2935842c80\") " Dec 06 05:46:34 crc kubenswrapper[4733]: I1206 05:46:34.565763 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/755fa11c-904b-49e4-928c-da2935842c80-utilities" (OuterVolumeSpecName: "utilities") pod "755fa11c-904b-49e4-928c-da2935842c80" (UID: "755fa11c-904b-49e4-928c-da2935842c80"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 05:46:34 crc kubenswrapper[4733]: I1206 05:46:34.566724 4733 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/755fa11c-904b-49e4-928c-da2935842c80-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 05:46:34 crc kubenswrapper[4733]: I1206 05:46:34.571268 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/755fa11c-904b-49e4-928c-da2935842c80-kube-api-access-59nr9" (OuterVolumeSpecName: "kube-api-access-59nr9") pod "755fa11c-904b-49e4-928c-da2935842c80" (UID: "755fa11c-904b-49e4-928c-da2935842c80"). InnerVolumeSpecName "kube-api-access-59nr9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:46:34 crc kubenswrapper[4733]: I1206 05:46:34.580487 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/755fa11c-904b-49e4-928c-da2935842c80-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "755fa11c-904b-49e4-928c-da2935842c80" (UID: "755fa11c-904b-49e4-928c-da2935842c80"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 05:46:34 crc kubenswrapper[4733]: I1206 05:46:34.667489 4733 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/755fa11c-904b-49e4-928c-da2935842c80-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 05:46:34 crc kubenswrapper[4733]: I1206 05:46:34.667639 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59nr9\" (UniqueName: \"kubernetes.io/projected/755fa11c-904b-49e4-928c-da2935842c80-kube-api-access-59nr9\") on node \"crc\" DevicePath \"\"" Dec 06 05:46:35 crc kubenswrapper[4733]: I1206 05:46:35.094996 4733 generic.go:334] "Generic (PLEG): container finished" podID="755fa11c-904b-49e4-928c-da2935842c80" containerID="f83592503e7bc13bbcfdbcda1774fb88e5e26637f97876686cd49a5064ee9f90" exitCode=0 Dec 06 05:46:35 crc kubenswrapper[4733]: I1206 05:46:35.095229 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kqg5z" Dec 06 05:46:35 crc kubenswrapper[4733]: I1206 05:46:35.095221 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kqg5z" event={"ID":"755fa11c-904b-49e4-928c-da2935842c80","Type":"ContainerDied","Data":"f83592503e7bc13bbcfdbcda1774fb88e5e26637f97876686cd49a5064ee9f90"} Dec 06 05:46:35 crc kubenswrapper[4733]: I1206 05:46:35.096500 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kqg5z" event={"ID":"755fa11c-904b-49e4-928c-da2935842c80","Type":"ContainerDied","Data":"98e2ff6c34c144a95b2b27c91608f22314fd1c24a3f230a1a09ce08eacb57a9a"} Dec 06 05:46:35 crc kubenswrapper[4733]: I1206 05:46:35.096532 4733 scope.go:117] "RemoveContainer" containerID="f83592503e7bc13bbcfdbcda1774fb88e5e26637f97876686cd49a5064ee9f90" Dec 06 05:46:35 crc kubenswrapper[4733]: I1206 05:46:35.117555 4733 scope.go:117] "RemoveContainer" 
containerID="1c81a682724848312ec18654cb7fe1a4fecd3a745f90ac2bdd4c6f327497cb2f" Dec 06 05:46:35 crc kubenswrapper[4733]: I1206 05:46:35.120569 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kqg5z"] Dec 06 05:46:35 crc kubenswrapper[4733]: I1206 05:46:35.133009 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kqg5z"] Dec 06 05:46:35 crc kubenswrapper[4733]: I1206 05:46:35.140110 4733 scope.go:117] "RemoveContainer" containerID="d5c7700b629d94813acd3b5e5120a2efbcf13a7a61aba2af1f001f91dca04ade" Dec 06 05:46:35 crc kubenswrapper[4733]: I1206 05:46:35.159742 4733 scope.go:117] "RemoveContainer" containerID="f83592503e7bc13bbcfdbcda1774fb88e5e26637f97876686cd49a5064ee9f90" Dec 06 05:46:35 crc kubenswrapper[4733]: E1206 05:46:35.160056 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f83592503e7bc13bbcfdbcda1774fb88e5e26637f97876686cd49a5064ee9f90\": container with ID starting with f83592503e7bc13bbcfdbcda1774fb88e5e26637f97876686cd49a5064ee9f90 not found: ID does not exist" containerID="f83592503e7bc13bbcfdbcda1774fb88e5e26637f97876686cd49a5064ee9f90" Dec 06 05:46:35 crc kubenswrapper[4733]: I1206 05:46:35.160089 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f83592503e7bc13bbcfdbcda1774fb88e5e26637f97876686cd49a5064ee9f90"} err="failed to get container status \"f83592503e7bc13bbcfdbcda1774fb88e5e26637f97876686cd49a5064ee9f90\": rpc error: code = NotFound desc = could not find container \"f83592503e7bc13bbcfdbcda1774fb88e5e26637f97876686cd49a5064ee9f90\": container with ID starting with f83592503e7bc13bbcfdbcda1774fb88e5e26637f97876686cd49a5064ee9f90 not found: ID does not exist" Dec 06 05:46:35 crc kubenswrapper[4733]: I1206 05:46:35.160112 4733 scope.go:117] "RemoveContainer" 
containerID="1c81a682724848312ec18654cb7fe1a4fecd3a745f90ac2bdd4c6f327497cb2f" Dec 06 05:46:35 crc kubenswrapper[4733]: E1206 05:46:35.160355 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c81a682724848312ec18654cb7fe1a4fecd3a745f90ac2bdd4c6f327497cb2f\": container with ID starting with 1c81a682724848312ec18654cb7fe1a4fecd3a745f90ac2bdd4c6f327497cb2f not found: ID does not exist" containerID="1c81a682724848312ec18654cb7fe1a4fecd3a745f90ac2bdd4c6f327497cb2f" Dec 06 05:46:35 crc kubenswrapper[4733]: I1206 05:46:35.160377 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c81a682724848312ec18654cb7fe1a4fecd3a745f90ac2bdd4c6f327497cb2f"} err="failed to get container status \"1c81a682724848312ec18654cb7fe1a4fecd3a745f90ac2bdd4c6f327497cb2f\": rpc error: code = NotFound desc = could not find container \"1c81a682724848312ec18654cb7fe1a4fecd3a745f90ac2bdd4c6f327497cb2f\": container with ID starting with 1c81a682724848312ec18654cb7fe1a4fecd3a745f90ac2bdd4c6f327497cb2f not found: ID does not exist" Dec 06 05:46:35 crc kubenswrapper[4733]: I1206 05:46:35.160390 4733 scope.go:117] "RemoveContainer" containerID="d5c7700b629d94813acd3b5e5120a2efbcf13a7a61aba2af1f001f91dca04ade" Dec 06 05:46:35 crc kubenswrapper[4733]: E1206 05:46:35.160634 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5c7700b629d94813acd3b5e5120a2efbcf13a7a61aba2af1f001f91dca04ade\": container with ID starting with d5c7700b629d94813acd3b5e5120a2efbcf13a7a61aba2af1f001f91dca04ade not found: ID does not exist" containerID="d5c7700b629d94813acd3b5e5120a2efbcf13a7a61aba2af1f001f91dca04ade" Dec 06 05:46:35 crc kubenswrapper[4733]: I1206 05:46:35.160652 4733 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d5c7700b629d94813acd3b5e5120a2efbcf13a7a61aba2af1f001f91dca04ade"} err="failed to get container status \"d5c7700b629d94813acd3b5e5120a2efbcf13a7a61aba2af1f001f91dca04ade\": rpc error: code = NotFound desc = could not find container \"d5c7700b629d94813acd3b5e5120a2efbcf13a7a61aba2af1f001f91dca04ade\": container with ID starting with d5c7700b629d94813acd3b5e5120a2efbcf13a7a61aba2af1f001f91dca04ade not found: ID does not exist" Dec 06 05:46:35 crc kubenswrapper[4733]: I1206 05:46:35.563448 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 06 05:46:35 crc kubenswrapper[4733]: E1206 05:46:35.563715 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58095812-8308-43c6-a12c-305a33525986" containerName="pruner" Dec 06 05:46:35 crc kubenswrapper[4733]: I1206 05:46:35.563735 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="58095812-8308-43c6-a12c-305a33525986" containerName="pruner" Dec 06 05:46:35 crc kubenswrapper[4733]: E1206 05:46:35.563751 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c3721ae-c7d6-49d0-8488-e84f96e08faa" containerName="registry-server" Dec 06 05:46:35 crc kubenswrapper[4733]: I1206 05:46:35.563759 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c3721ae-c7d6-49d0-8488-e84f96e08faa" containerName="registry-server" Dec 06 05:46:35 crc kubenswrapper[4733]: E1206 05:46:35.563775 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="755fa11c-904b-49e4-928c-da2935842c80" containerName="registry-server" Dec 06 05:46:35 crc kubenswrapper[4733]: I1206 05:46:35.563789 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="755fa11c-904b-49e4-928c-da2935842c80" containerName="registry-server" Dec 06 05:46:35 crc kubenswrapper[4733]: E1206 05:46:35.563802 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7da475c-bd42-421a-9aef-4eed3aacbe9a" 
containerName="extract-utilities" Dec 06 05:46:35 crc kubenswrapper[4733]: I1206 05:46:35.563809 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7da475c-bd42-421a-9aef-4eed3aacbe9a" containerName="extract-utilities" Dec 06 05:46:35 crc kubenswrapper[4733]: E1206 05:46:35.563821 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7da475c-bd42-421a-9aef-4eed3aacbe9a" containerName="registry-server" Dec 06 05:46:35 crc kubenswrapper[4733]: I1206 05:46:35.563828 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7da475c-bd42-421a-9aef-4eed3aacbe9a" containerName="registry-server" Dec 06 05:46:35 crc kubenswrapper[4733]: E1206 05:46:35.563837 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="755fa11c-904b-49e4-928c-da2935842c80" containerName="extract-content" Dec 06 05:46:35 crc kubenswrapper[4733]: I1206 05:46:35.563846 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="755fa11c-904b-49e4-928c-da2935842c80" containerName="extract-content" Dec 06 05:46:35 crc kubenswrapper[4733]: E1206 05:46:35.563855 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c3721ae-c7d6-49d0-8488-e84f96e08faa" containerName="extract-utilities" Dec 06 05:46:35 crc kubenswrapper[4733]: I1206 05:46:35.563862 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c3721ae-c7d6-49d0-8488-e84f96e08faa" containerName="extract-utilities" Dec 06 05:46:35 crc kubenswrapper[4733]: E1206 05:46:35.563869 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7da475c-bd42-421a-9aef-4eed3aacbe9a" containerName="extract-content" Dec 06 05:46:35 crc kubenswrapper[4733]: I1206 05:46:35.563875 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7da475c-bd42-421a-9aef-4eed3aacbe9a" containerName="extract-content" Dec 06 05:46:35 crc kubenswrapper[4733]: E1206 05:46:35.563889 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c3721ae-c7d6-49d0-8488-e84f96e08faa" 
containerName="extract-content" Dec 06 05:46:35 crc kubenswrapper[4733]: I1206 05:46:35.563895 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c3721ae-c7d6-49d0-8488-e84f96e08faa" containerName="extract-content" Dec 06 05:46:35 crc kubenswrapper[4733]: E1206 05:46:35.563903 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="755fa11c-904b-49e4-928c-da2935842c80" containerName="extract-utilities" Dec 06 05:46:35 crc kubenswrapper[4733]: I1206 05:46:35.563911 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="755fa11c-904b-49e4-928c-da2935842c80" containerName="extract-utilities" Dec 06 05:46:35 crc kubenswrapper[4733]: I1206 05:46:35.564056 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="58095812-8308-43c6-a12c-305a33525986" containerName="pruner" Dec 06 05:46:35 crc kubenswrapper[4733]: I1206 05:46:35.564066 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="755fa11c-904b-49e4-928c-da2935842c80" containerName="registry-server" Dec 06 05:46:35 crc kubenswrapper[4733]: I1206 05:46:35.564078 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7da475c-bd42-421a-9aef-4eed3aacbe9a" containerName="registry-server" Dec 06 05:46:35 crc kubenswrapper[4733]: I1206 05:46:35.564091 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c3721ae-c7d6-49d0-8488-e84f96e08faa" containerName="registry-server" Dec 06 05:46:35 crc kubenswrapper[4733]: I1206 05:46:35.564582 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 06 05:46:35 crc kubenswrapper[4733]: I1206 05:46:35.566297 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 06 05:46:35 crc kubenswrapper[4733]: I1206 05:46:35.566386 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 06 05:46:35 crc kubenswrapper[4733]: I1206 05:46:35.572507 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 06 05:46:35 crc kubenswrapper[4733]: I1206 05:46:35.576407 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6f289b10-26a8-4f0e-bb39-ef7f47b3d64f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"6f289b10-26a8-4f0e-bb39-ef7f47b3d64f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 06 05:46:35 crc kubenswrapper[4733]: I1206 05:46:35.576464 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6f289b10-26a8-4f0e-bb39-ef7f47b3d64f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"6f289b10-26a8-4f0e-bb39-ef7f47b3d64f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 06 05:46:35 crc kubenswrapper[4733]: I1206 05:46:35.678329 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6f289b10-26a8-4f0e-bb39-ef7f47b3d64f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"6f289b10-26a8-4f0e-bb39-ef7f47b3d64f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 06 05:46:35 crc kubenswrapper[4733]: I1206 05:46:35.678398 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/6f289b10-26a8-4f0e-bb39-ef7f47b3d64f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"6f289b10-26a8-4f0e-bb39-ef7f47b3d64f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 06 05:46:35 crc kubenswrapper[4733]: I1206 05:46:35.678541 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6f289b10-26a8-4f0e-bb39-ef7f47b3d64f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"6f289b10-26a8-4f0e-bb39-ef7f47b3d64f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 06 05:46:35 crc kubenswrapper[4733]: I1206 05:46:35.695360 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6f289b10-26a8-4f0e-bb39-ef7f47b3d64f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"6f289b10-26a8-4f0e-bb39-ef7f47b3d64f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 06 05:46:35 crc kubenswrapper[4733]: I1206 05:46:35.880936 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 06 05:46:36 crc kubenswrapper[4733]: I1206 05:46:36.265557 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 06 05:46:36 crc kubenswrapper[4733]: I1206 05:46:36.339227 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-77dc2"] Dec 06 05:46:36 crc kubenswrapper[4733]: I1206 05:46:36.339556 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-77dc2" podUID="9fc6d80b-251a-4d93-91f1-59adbba493f7" containerName="registry-server" containerID="cri-o://23b3e0ba2bde54a244d18f8fb82ae54e5cdb1f89c039d36089d348e811581e5b" gracePeriod=2 Dec 06 05:46:36 crc kubenswrapper[4733]: I1206 05:46:36.495964 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="755fa11c-904b-49e4-928c-da2935842c80" path="/var/lib/kubelet/pods/755fa11c-904b-49e4-928c-da2935842c80/volumes" Dec 06 05:46:36 crc kubenswrapper[4733]: I1206 05:46:36.619080 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-77dc2" Dec 06 05:46:36 crc kubenswrapper[4733]: I1206 05:46:36.790578 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fc6d80b-251a-4d93-91f1-59adbba493f7-catalog-content\") pod \"9fc6d80b-251a-4d93-91f1-59adbba493f7\" (UID: \"9fc6d80b-251a-4d93-91f1-59adbba493f7\") " Dec 06 05:46:36 crc kubenswrapper[4733]: I1206 05:46:36.790635 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfgfq\" (UniqueName: \"kubernetes.io/projected/9fc6d80b-251a-4d93-91f1-59adbba493f7-kube-api-access-rfgfq\") pod \"9fc6d80b-251a-4d93-91f1-59adbba493f7\" (UID: \"9fc6d80b-251a-4d93-91f1-59adbba493f7\") " Dec 06 05:46:36 crc kubenswrapper[4733]: I1206 05:46:36.790665 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fc6d80b-251a-4d93-91f1-59adbba493f7-utilities\") pod \"9fc6d80b-251a-4d93-91f1-59adbba493f7\" (UID: \"9fc6d80b-251a-4d93-91f1-59adbba493f7\") " Dec 06 05:46:36 crc kubenswrapper[4733]: I1206 05:46:36.791398 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fc6d80b-251a-4d93-91f1-59adbba493f7-utilities" (OuterVolumeSpecName: "utilities") pod "9fc6d80b-251a-4d93-91f1-59adbba493f7" (UID: "9fc6d80b-251a-4d93-91f1-59adbba493f7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 05:46:36 crc kubenswrapper[4733]: I1206 05:46:36.795656 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fc6d80b-251a-4d93-91f1-59adbba493f7-kube-api-access-rfgfq" (OuterVolumeSpecName: "kube-api-access-rfgfq") pod "9fc6d80b-251a-4d93-91f1-59adbba493f7" (UID: "9fc6d80b-251a-4d93-91f1-59adbba493f7"). InnerVolumeSpecName "kube-api-access-rfgfq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:46:36 crc kubenswrapper[4733]: I1206 05:46:36.866775 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fc6d80b-251a-4d93-91f1-59adbba493f7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9fc6d80b-251a-4d93-91f1-59adbba493f7" (UID: "9fc6d80b-251a-4d93-91f1-59adbba493f7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 05:46:36 crc kubenswrapper[4733]: I1206 05:46:36.892425 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfgfq\" (UniqueName: \"kubernetes.io/projected/9fc6d80b-251a-4d93-91f1-59adbba493f7-kube-api-access-rfgfq\") on node \"crc\" DevicePath \"\"" Dec 06 05:46:36 crc kubenswrapper[4733]: I1206 05:46:36.892451 4733 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fc6d80b-251a-4d93-91f1-59adbba493f7-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 05:46:36 crc kubenswrapper[4733]: I1206 05:46:36.892464 4733 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fc6d80b-251a-4d93-91f1-59adbba493f7-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 05:46:37 crc kubenswrapper[4733]: I1206 05:46:37.116755 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"6f289b10-26a8-4f0e-bb39-ef7f47b3d64f","Type":"ContainerStarted","Data":"c7ac6bbffc3fdb91547dc904db2a60abeb858dc66596ac57924c5988b47e55b3"} Dec 06 05:46:37 crc kubenswrapper[4733]: I1206 05:46:37.116808 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"6f289b10-26a8-4f0e-bb39-ef7f47b3d64f","Type":"ContainerStarted","Data":"914eee604118b6b744ddbf0b4898ba91a80c3933f5dc7992ec999b2ead710786"} Dec 06 05:46:37 crc kubenswrapper[4733]: 
I1206 05:46:37.120815 4733 generic.go:334] "Generic (PLEG): container finished" podID="9fc6d80b-251a-4d93-91f1-59adbba493f7" containerID="23b3e0ba2bde54a244d18f8fb82ae54e5cdb1f89c039d36089d348e811581e5b" exitCode=0 Dec 06 05:46:37 crc kubenswrapper[4733]: I1206 05:46:37.120873 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-77dc2" event={"ID":"9fc6d80b-251a-4d93-91f1-59adbba493f7","Type":"ContainerDied","Data":"23b3e0ba2bde54a244d18f8fb82ae54e5cdb1f89c039d36089d348e811581e5b"} Dec 06 05:46:37 crc kubenswrapper[4733]: I1206 05:46:37.120915 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-77dc2" event={"ID":"9fc6d80b-251a-4d93-91f1-59adbba493f7","Type":"ContainerDied","Data":"4a0df095e072d3d2629f8e26e2259b367b7ab70db27557cf213ef921f5d642b0"} Dec 06 05:46:37 crc kubenswrapper[4733]: I1206 05:46:37.120944 4733 scope.go:117] "RemoveContainer" containerID="23b3e0ba2bde54a244d18f8fb82ae54e5cdb1f89c039d36089d348e811581e5b" Dec 06 05:46:37 crc kubenswrapper[4733]: I1206 05:46:37.121145 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-77dc2" Dec 06 05:46:37 crc kubenswrapper[4733]: I1206 05:46:37.140753 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=2.140719776 podStartE2EDuration="2.140719776s" podCreationTimestamp="2025-12-06 05:46:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:46:37.134637713 +0000 UTC m=+180.999848825" watchObservedRunningTime="2025-12-06 05:46:37.140719776 +0000 UTC m=+181.005930886" Dec 06 05:46:37 crc kubenswrapper[4733]: I1206 05:46:37.144549 4733 scope.go:117] "RemoveContainer" containerID="a7d0b05920a1d4d3187590867c23d4600e1b4488b30ea8fcee739df66490de1a" Dec 06 05:46:37 crc kubenswrapper[4733]: I1206 05:46:37.161967 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-77dc2"] Dec 06 05:46:37 crc kubenswrapper[4733]: I1206 05:46:37.165592 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-77dc2"] Dec 06 05:46:37 crc kubenswrapper[4733]: I1206 05:46:37.169847 4733 scope.go:117] "RemoveContainer" containerID="3f244a3a76647660283fb0e418edfb39d8dcb7a2b7a08babde56b865efe4d88d" Dec 06 05:46:37 crc kubenswrapper[4733]: I1206 05:46:37.184270 4733 scope.go:117] "RemoveContainer" containerID="23b3e0ba2bde54a244d18f8fb82ae54e5cdb1f89c039d36089d348e811581e5b" Dec 06 05:46:37 crc kubenswrapper[4733]: E1206 05:46:37.184721 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23b3e0ba2bde54a244d18f8fb82ae54e5cdb1f89c039d36089d348e811581e5b\": container with ID starting with 23b3e0ba2bde54a244d18f8fb82ae54e5cdb1f89c039d36089d348e811581e5b not found: ID does not exist" containerID="23b3e0ba2bde54a244d18f8fb82ae54e5cdb1f89c039d36089d348e811581e5b" Dec 
06 05:46:37 crc kubenswrapper[4733]: I1206 05:46:37.184765 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23b3e0ba2bde54a244d18f8fb82ae54e5cdb1f89c039d36089d348e811581e5b"} err="failed to get container status \"23b3e0ba2bde54a244d18f8fb82ae54e5cdb1f89c039d36089d348e811581e5b\": rpc error: code = NotFound desc = could not find container \"23b3e0ba2bde54a244d18f8fb82ae54e5cdb1f89c039d36089d348e811581e5b\": container with ID starting with 23b3e0ba2bde54a244d18f8fb82ae54e5cdb1f89c039d36089d348e811581e5b not found: ID does not exist" Dec 06 05:46:37 crc kubenswrapper[4733]: I1206 05:46:37.184791 4733 scope.go:117] "RemoveContainer" containerID="a7d0b05920a1d4d3187590867c23d4600e1b4488b30ea8fcee739df66490de1a" Dec 06 05:46:37 crc kubenswrapper[4733]: E1206 05:46:37.185139 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7d0b05920a1d4d3187590867c23d4600e1b4488b30ea8fcee739df66490de1a\": container with ID starting with a7d0b05920a1d4d3187590867c23d4600e1b4488b30ea8fcee739df66490de1a not found: ID does not exist" containerID="a7d0b05920a1d4d3187590867c23d4600e1b4488b30ea8fcee739df66490de1a" Dec 06 05:46:37 crc kubenswrapper[4733]: I1206 05:46:37.185172 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7d0b05920a1d4d3187590867c23d4600e1b4488b30ea8fcee739df66490de1a"} err="failed to get container status \"a7d0b05920a1d4d3187590867c23d4600e1b4488b30ea8fcee739df66490de1a\": rpc error: code = NotFound desc = could not find container \"a7d0b05920a1d4d3187590867c23d4600e1b4488b30ea8fcee739df66490de1a\": container with ID starting with a7d0b05920a1d4d3187590867c23d4600e1b4488b30ea8fcee739df66490de1a not found: ID does not exist" Dec 06 05:46:37 crc kubenswrapper[4733]: I1206 05:46:37.185196 4733 scope.go:117] "RemoveContainer" 
containerID="3f244a3a76647660283fb0e418edfb39d8dcb7a2b7a08babde56b865efe4d88d" Dec 06 05:46:37 crc kubenswrapper[4733]: E1206 05:46:37.185682 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f244a3a76647660283fb0e418edfb39d8dcb7a2b7a08babde56b865efe4d88d\": container with ID starting with 3f244a3a76647660283fb0e418edfb39d8dcb7a2b7a08babde56b865efe4d88d not found: ID does not exist" containerID="3f244a3a76647660283fb0e418edfb39d8dcb7a2b7a08babde56b865efe4d88d" Dec 06 05:46:37 crc kubenswrapper[4733]: I1206 05:46:37.185704 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f244a3a76647660283fb0e418edfb39d8dcb7a2b7a08babde56b865efe4d88d"} err="failed to get container status \"3f244a3a76647660283fb0e418edfb39d8dcb7a2b7a08babde56b865efe4d88d\": rpc error: code = NotFound desc = could not find container \"3f244a3a76647660283fb0e418edfb39d8dcb7a2b7a08babde56b865efe4d88d\": container with ID starting with 3f244a3a76647660283fb0e418edfb39d8dcb7a2b7a08babde56b865efe4d88d not found: ID does not exist" Dec 06 05:46:38 crc kubenswrapper[4733]: I1206 05:46:38.129526 4733 generic.go:334] "Generic (PLEG): container finished" podID="6f289b10-26a8-4f0e-bb39-ef7f47b3d64f" containerID="c7ac6bbffc3fdb91547dc904db2a60abeb858dc66596ac57924c5988b47e55b3" exitCode=0 Dec 06 05:46:38 crc kubenswrapper[4733]: I1206 05:46:38.129619 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"6f289b10-26a8-4f0e-bb39-ef7f47b3d64f","Type":"ContainerDied","Data":"c7ac6bbffc3fdb91547dc904db2a60abeb858dc66596ac57924c5988b47e55b3"} Dec 06 05:46:38 crc kubenswrapper[4733]: I1206 05:46:38.490436 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fc6d80b-251a-4d93-91f1-59adbba493f7" path="/var/lib/kubelet/pods/9fc6d80b-251a-4d93-91f1-59adbba493f7/volumes" Dec 06 05:46:39 crc 
kubenswrapper[4733]: I1206 05:46:39.343892 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 06 05:46:39 crc kubenswrapper[4733]: I1206 05:46:39.522264 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6f289b10-26a8-4f0e-bb39-ef7f47b3d64f-kube-api-access\") pod \"6f289b10-26a8-4f0e-bb39-ef7f47b3d64f\" (UID: \"6f289b10-26a8-4f0e-bb39-ef7f47b3d64f\") " Dec 06 05:46:39 crc kubenswrapper[4733]: I1206 05:46:39.522324 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6f289b10-26a8-4f0e-bb39-ef7f47b3d64f-kubelet-dir\") pod \"6f289b10-26a8-4f0e-bb39-ef7f47b3d64f\" (UID: \"6f289b10-26a8-4f0e-bb39-ef7f47b3d64f\") " Dec 06 05:46:39 crc kubenswrapper[4733]: I1206 05:46:39.522522 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6f289b10-26a8-4f0e-bb39-ef7f47b3d64f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "6f289b10-26a8-4f0e-bb39-ef7f47b3d64f" (UID: "6f289b10-26a8-4f0e-bb39-ef7f47b3d64f"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 05:46:39 crc kubenswrapper[4733]: I1206 05:46:39.527771 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f289b10-26a8-4f0e-bb39-ef7f47b3d64f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "6f289b10-26a8-4f0e-bb39-ef7f47b3d64f" (UID: "6f289b10-26a8-4f0e-bb39-ef7f47b3d64f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:46:39 crc kubenswrapper[4733]: I1206 05:46:39.624250 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6f289b10-26a8-4f0e-bb39-ef7f47b3d64f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 06 05:46:39 crc kubenswrapper[4733]: I1206 05:46:39.624299 4733 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6f289b10-26a8-4f0e-bb39-ef7f47b3d64f-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 06 05:46:40 crc kubenswrapper[4733]: I1206 05:46:40.146992 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"6f289b10-26a8-4f0e-bb39-ef7f47b3d64f","Type":"ContainerDied","Data":"914eee604118b6b744ddbf0b4898ba91a80c3933f5dc7992ec999b2ead710786"} Dec 06 05:46:40 crc kubenswrapper[4733]: I1206 05:46:40.147051 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 06 05:46:40 crc kubenswrapper[4733]: I1206 05:46:40.147069 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="914eee604118b6b744ddbf0b4898ba91a80c3933f5dc7992ec999b2ead710786" Dec 06 05:46:41 crc kubenswrapper[4733]: I1206 05:46:41.493199 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-l8cj4"] Dec 06 05:46:42 crc kubenswrapper[4733]: I1206 05:46:42.556424 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 06 05:46:42 crc kubenswrapper[4733]: E1206 05:46:42.557479 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f289b10-26a8-4f0e-bb39-ef7f47b3d64f" containerName="pruner" Dec 06 05:46:42 crc kubenswrapper[4733]: I1206 05:46:42.557565 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f289b10-26a8-4f0e-bb39-ef7f47b3d64f" containerName="pruner" Dec 06 05:46:42 crc kubenswrapper[4733]: E1206 05:46:42.557636 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fc6d80b-251a-4d93-91f1-59adbba493f7" containerName="extract-content" Dec 06 05:46:42 crc kubenswrapper[4733]: I1206 05:46:42.557684 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fc6d80b-251a-4d93-91f1-59adbba493f7" containerName="extract-content" Dec 06 05:46:42 crc kubenswrapper[4733]: E1206 05:46:42.557731 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fc6d80b-251a-4d93-91f1-59adbba493f7" containerName="extract-utilities" Dec 06 05:46:42 crc kubenswrapper[4733]: I1206 05:46:42.557793 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fc6d80b-251a-4d93-91f1-59adbba493f7" containerName="extract-utilities" Dec 06 05:46:42 crc kubenswrapper[4733]: E1206 05:46:42.557857 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fc6d80b-251a-4d93-91f1-59adbba493f7" 
containerName="registry-server" Dec 06 05:46:42 crc kubenswrapper[4733]: I1206 05:46:42.557913 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fc6d80b-251a-4d93-91f1-59adbba493f7" containerName="registry-server" Dec 06 05:46:42 crc kubenswrapper[4733]: I1206 05:46:42.558070 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fc6d80b-251a-4d93-91f1-59adbba493f7" containerName="registry-server" Dec 06 05:46:42 crc kubenswrapper[4733]: I1206 05:46:42.558134 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f289b10-26a8-4f0e-bb39-ef7f47b3d64f" containerName="pruner" Dec 06 05:46:42 crc kubenswrapper[4733]: I1206 05:46:42.558580 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 06 05:46:42 crc kubenswrapper[4733]: I1206 05:46:42.561132 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 06 05:46:42 crc kubenswrapper[4733]: I1206 05:46:42.564006 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 06 05:46:42 crc kubenswrapper[4733]: I1206 05:46:42.564041 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 06 05:46:42 crc kubenswrapper[4733]: I1206 05:46:42.606024 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 05:46:42 crc kubenswrapper[4733]: I1206 05:46:42.662556 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b2eccc5b-1f33-4372-83e4-23e30f607d68-var-lock\") pod \"installer-9-crc\" (UID: \"b2eccc5b-1f33-4372-83e4-23e30f607d68\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 06 05:46:42 crc kubenswrapper[4733]: I1206 05:46:42.662619 4733 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b2eccc5b-1f33-4372-83e4-23e30f607d68-kube-api-access\") pod \"installer-9-crc\" (UID: \"b2eccc5b-1f33-4372-83e4-23e30f607d68\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 06 05:46:42 crc kubenswrapper[4733]: I1206 05:46:42.662681 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b2eccc5b-1f33-4372-83e4-23e30f607d68-kubelet-dir\") pod \"installer-9-crc\" (UID: \"b2eccc5b-1f33-4372-83e4-23e30f607d68\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 06 05:46:42 crc kubenswrapper[4733]: I1206 05:46:42.763916 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b2eccc5b-1f33-4372-83e4-23e30f607d68-kube-api-access\") pod \"installer-9-crc\" (UID: \"b2eccc5b-1f33-4372-83e4-23e30f607d68\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 06 05:46:42 crc kubenswrapper[4733]: I1206 05:46:42.763985 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b2eccc5b-1f33-4372-83e4-23e30f607d68-kubelet-dir\") pod \"installer-9-crc\" (UID: \"b2eccc5b-1f33-4372-83e4-23e30f607d68\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 06 05:46:42 crc kubenswrapper[4733]: I1206 05:46:42.764098 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b2eccc5b-1f33-4372-83e4-23e30f607d68-var-lock\") pod \"installer-9-crc\" (UID: \"b2eccc5b-1f33-4372-83e4-23e30f607d68\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 06 05:46:42 crc kubenswrapper[4733]: I1206 05:46:42.764185 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" 
(UniqueName: \"kubernetes.io/host-path/b2eccc5b-1f33-4372-83e4-23e30f607d68-var-lock\") pod \"installer-9-crc\" (UID: \"b2eccc5b-1f33-4372-83e4-23e30f607d68\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 06 05:46:42 crc kubenswrapper[4733]: I1206 05:46:42.764555 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b2eccc5b-1f33-4372-83e4-23e30f607d68-kubelet-dir\") pod \"installer-9-crc\" (UID: \"b2eccc5b-1f33-4372-83e4-23e30f607d68\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 06 05:46:42 crc kubenswrapper[4733]: I1206 05:46:42.782595 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b2eccc5b-1f33-4372-83e4-23e30f607d68-kube-api-access\") pod \"installer-9-crc\" (UID: \"b2eccc5b-1f33-4372-83e4-23e30f607d68\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 06 05:46:42 crc kubenswrapper[4733]: I1206 05:46:42.874636 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 06 05:46:42 crc kubenswrapper[4733]: I1206 05:46:42.989285 4733 patch_prober.go:28] interesting pod/machine-config-daemon-g7qjx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 05:46:42 crc kubenswrapper[4733]: I1206 05:46:42.989553 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 05:46:43 crc kubenswrapper[4733]: I1206 05:46:43.241845 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 06 05:46:43 crc kubenswrapper[4733]: W1206 05:46:43.246402 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podb2eccc5b_1f33_4372_83e4_23e30f607d68.slice/crio-73b5dd3333e3d1170bcadaadb826142467620b7e4d546b5309150125a8521d15 WatchSource:0}: Error finding container 73b5dd3333e3d1170bcadaadb826142467620b7e4d546b5309150125a8521d15: Status 404 returned error can't find the container with id 73b5dd3333e3d1170bcadaadb826142467620b7e4d546b5309150125a8521d15 Dec 06 05:46:44 crc kubenswrapper[4733]: I1206 05:46:44.178114 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"b2eccc5b-1f33-4372-83e4-23e30f607d68","Type":"ContainerStarted","Data":"75359317360fe7229565c60d725c05642aa054c80378eb3f3f3576c05b344b45"} Dec 06 05:46:44 crc kubenswrapper[4733]: I1206 05:46:44.178510 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" 
event={"ID":"b2eccc5b-1f33-4372-83e4-23e30f607d68","Type":"ContainerStarted","Data":"73b5dd3333e3d1170bcadaadb826142467620b7e4d546b5309150125a8521d15"} Dec 06 05:46:44 crc kubenswrapper[4733]: I1206 05:46:44.191611 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.191596977 podStartE2EDuration="2.191596977s" podCreationTimestamp="2025-12-06 05:46:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:46:44.189668873 +0000 UTC m=+188.054879984" watchObservedRunningTime="2025-12-06 05:46:44.191596977 +0000 UTC m=+188.056808089" Dec 06 05:47:06 crc kubenswrapper[4733]: I1206 05:47:06.514854 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-l8cj4" podUID="3cf2106f-7c73-4086-bc49-fa1b11f2e56f" containerName="oauth-openshift" containerID="cri-o://0db7972d7003eb14ebc2f889b7a66ee9c34c11513b62b24a6667c6621d1f43b4" gracePeriod=15 Dec 06 05:47:06 crc kubenswrapper[4733]: I1206 05:47:06.806926 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-l8cj4" Dec 06 05:47:06 crc kubenswrapper[4733]: I1206 05:47:06.833716 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-8476cd6899-q2pdb"] Dec 06 05:47:06 crc kubenswrapper[4733]: E1206 05:47:06.833959 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cf2106f-7c73-4086-bc49-fa1b11f2e56f" containerName="oauth-openshift" Dec 06 05:47:06 crc kubenswrapper[4733]: I1206 05:47:06.833978 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cf2106f-7c73-4086-bc49-fa1b11f2e56f" containerName="oauth-openshift" Dec 06 05:47:06 crc kubenswrapper[4733]: I1206 05:47:06.834074 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cf2106f-7c73-4086-bc49-fa1b11f2e56f" containerName="oauth-openshift" Dec 06 05:47:06 crc kubenswrapper[4733]: I1206 05:47:06.834505 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-8476cd6899-q2pdb" Dec 06 05:47:06 crc kubenswrapper[4733]: I1206 05:47:06.843459 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-8476cd6899-q2pdb"] Dec 06 05:47:06 crc kubenswrapper[4733]: I1206 05:47:06.931386 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-v4-0-config-user-template-error\") pod \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\" (UID: \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\") " Dec 06 05:47:06 crc kubenswrapper[4733]: I1206 05:47:06.932080 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-v4-0-config-user-template-login\") pod \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\" (UID: 
\"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\") " Dec 06 05:47:06 crc kubenswrapper[4733]: I1206 05:47:06.932144 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-v4-0-config-system-service-ca\") pod \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\" (UID: \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\") " Dec 06 05:47:06 crc kubenswrapper[4733]: I1206 05:47:06.932165 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-v4-0-config-system-serving-cert\") pod \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\" (UID: \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\") " Dec 06 05:47:06 crc kubenswrapper[4733]: I1206 05:47:06.932224 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-v4-0-config-user-idp-0-file-data\") pod \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\" (UID: \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\") " Dec 06 05:47:06 crc kubenswrapper[4733]: I1206 05:47:06.932283 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-audit-policies\") pod \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\" (UID: \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\") " Dec 06 05:47:06 crc kubenswrapper[4733]: I1206 05:47:06.932339 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lw2wq\" (UniqueName: \"kubernetes.io/projected/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-kube-api-access-lw2wq\") pod \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\" (UID: \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\") " Dec 06 05:47:06 crc kubenswrapper[4733]: I1206 05:47:06.932391 4733 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-v4-0-config-system-session\") pod \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\" (UID: \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\") " Dec 06 05:47:06 crc kubenswrapper[4733]: I1206 05:47:06.932421 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-audit-dir\") pod \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\" (UID: \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\") " Dec 06 05:47:06 crc kubenswrapper[4733]: I1206 05:47:06.932444 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-v4-0-config-user-template-provider-selection\") pod \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\" (UID: \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\") " Dec 06 05:47:06 crc kubenswrapper[4733]: I1206 05:47:06.932464 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-v4-0-config-system-router-certs\") pod \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\" (UID: \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\") " Dec 06 05:47:06 crc kubenswrapper[4733]: I1206 05:47:06.932523 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-v4-0-config-system-cliconfig\") pod \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\" (UID: \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\") " Dec 06 05:47:06 crc kubenswrapper[4733]: I1206 05:47:06.932551 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-v4-0-config-system-trusted-ca-bundle\") pod \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\" (UID: \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\") " Dec 06 05:47:06 crc kubenswrapper[4733]: I1206 05:47:06.932587 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-v4-0-config-system-ocp-branding-template\") pod \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\" (UID: \"3cf2106f-7c73-4086-bc49-fa1b11f2e56f\") " Dec 06 05:47:06 crc kubenswrapper[4733]: I1206 05:47:06.932772 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "3cf2106f-7c73-4086-bc49-fa1b11f2e56f" (UID: "3cf2106f-7c73-4086-bc49-fa1b11f2e56f"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 05:47:06 crc kubenswrapper[4733]: I1206 05:47:06.932804 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b92fe062-ed39-46a5-b9fe-2e8443e565ec-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-8476cd6899-q2pdb\" (UID: \"b92fe062-ed39-46a5-b9fe-2e8443e565ec\") " pod="openshift-authentication/oauth-openshift-8476cd6899-q2pdb" Dec 06 05:47:06 crc kubenswrapper[4733]: I1206 05:47:06.932839 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b92fe062-ed39-46a5-b9fe-2e8443e565ec-audit-dir\") pod \"oauth-openshift-8476cd6899-q2pdb\" (UID: \"b92fe062-ed39-46a5-b9fe-2e8443e565ec\") " pod="openshift-authentication/oauth-openshift-8476cd6899-q2pdb" Dec 06 05:47:06 crc kubenswrapper[4733]: I1206 05:47:06.932869 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98gth\" (UniqueName: \"kubernetes.io/projected/b92fe062-ed39-46a5-b9fe-2e8443e565ec-kube-api-access-98gth\") pod \"oauth-openshift-8476cd6899-q2pdb\" (UID: \"b92fe062-ed39-46a5-b9fe-2e8443e565ec\") " pod="openshift-authentication/oauth-openshift-8476cd6899-q2pdb" Dec 06 05:47:06 crc kubenswrapper[4733]: I1206 05:47:06.932891 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b92fe062-ed39-46a5-b9fe-2e8443e565ec-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-8476cd6899-q2pdb\" (UID: \"b92fe062-ed39-46a5-b9fe-2e8443e565ec\") " pod="openshift-authentication/oauth-openshift-8476cd6899-q2pdb" Dec 06 05:47:06 crc kubenswrapper[4733]: I1206 05:47:06.932919 4733 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b92fe062-ed39-46a5-b9fe-2e8443e565ec-v4-0-config-system-serving-cert\") pod \"oauth-openshift-8476cd6899-q2pdb\" (UID: \"b92fe062-ed39-46a5-b9fe-2e8443e565ec\") " pod="openshift-authentication/oauth-openshift-8476cd6899-q2pdb" Dec 06 05:47:06 crc kubenswrapper[4733]: I1206 05:47:06.932953 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b92fe062-ed39-46a5-b9fe-2e8443e565ec-v4-0-config-system-service-ca\") pod \"oauth-openshift-8476cd6899-q2pdb\" (UID: \"b92fe062-ed39-46a5-b9fe-2e8443e565ec\") " pod="openshift-authentication/oauth-openshift-8476cd6899-q2pdb" Dec 06 05:47:06 crc kubenswrapper[4733]: I1206 05:47:06.932929 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "3cf2106f-7c73-4086-bc49-fa1b11f2e56f" (UID: "3cf2106f-7c73-4086-bc49-fa1b11f2e56f"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:47:06 crc kubenswrapper[4733]: I1206 05:47:06.932978 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b92fe062-ed39-46a5-b9fe-2e8443e565ec-v4-0-config-user-template-login\") pod \"oauth-openshift-8476cd6899-q2pdb\" (UID: \"b92fe062-ed39-46a5-b9fe-2e8443e565ec\") " pod="openshift-authentication/oauth-openshift-8476cd6899-q2pdb" Dec 06 05:47:06 crc kubenswrapper[4733]: I1206 05:47:06.932996 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b92fe062-ed39-46a5-b9fe-2e8443e565ec-v4-0-config-user-template-error\") pod \"oauth-openshift-8476cd6899-q2pdb\" (UID: \"b92fe062-ed39-46a5-b9fe-2e8443e565ec\") " pod="openshift-authentication/oauth-openshift-8476cd6899-q2pdb" Dec 06 05:47:06 crc kubenswrapper[4733]: I1206 05:47:06.933085 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b92fe062-ed39-46a5-b9fe-2e8443e565ec-v4-0-config-system-cliconfig\") pod \"oauth-openshift-8476cd6899-q2pdb\" (UID: \"b92fe062-ed39-46a5-b9fe-2e8443e565ec\") " pod="openshift-authentication/oauth-openshift-8476cd6899-q2pdb" Dec 06 05:47:06 crc kubenswrapper[4733]: I1206 05:47:06.933139 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b92fe062-ed39-46a5-b9fe-2e8443e565ec-audit-policies\") pod \"oauth-openshift-8476cd6899-q2pdb\" (UID: \"b92fe062-ed39-46a5-b9fe-2e8443e565ec\") " pod="openshift-authentication/oauth-openshift-8476cd6899-q2pdb" Dec 06 05:47:06 crc kubenswrapper[4733]: I1206 05:47:06.933159 4733 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b92fe062-ed39-46a5-b9fe-2e8443e565ec-v4-0-config-system-session\") pod \"oauth-openshift-8476cd6899-q2pdb\" (UID: \"b92fe062-ed39-46a5-b9fe-2e8443e565ec\") " pod="openshift-authentication/oauth-openshift-8476cd6899-q2pdb" Dec 06 05:47:06 crc kubenswrapper[4733]: I1206 05:47:06.933170 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "3cf2106f-7c73-4086-bc49-fa1b11f2e56f" (UID: "3cf2106f-7c73-4086-bc49-fa1b11f2e56f"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:47:06 crc kubenswrapper[4733]: I1206 05:47:06.933192 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b92fe062-ed39-46a5-b9fe-2e8443e565ec-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-8476cd6899-q2pdb\" (UID: \"b92fe062-ed39-46a5-b9fe-2e8443e565ec\") " pod="openshift-authentication/oauth-openshift-8476cd6899-q2pdb" Dec 06 05:47:06 crc kubenswrapper[4733]: I1206 05:47:06.933216 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b92fe062-ed39-46a5-b9fe-2e8443e565ec-v4-0-config-system-router-certs\") pod \"oauth-openshift-8476cd6899-q2pdb\" (UID: \"b92fe062-ed39-46a5-b9fe-2e8443e565ec\") " pod="openshift-authentication/oauth-openshift-8476cd6899-q2pdb" Dec 06 05:47:06 crc kubenswrapper[4733]: I1206 05:47:06.933239 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/b92fe062-ed39-46a5-b9fe-2e8443e565ec-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-8476cd6899-q2pdb\" (UID: \"b92fe062-ed39-46a5-b9fe-2e8443e565ec\") " pod="openshift-authentication/oauth-openshift-8476cd6899-q2pdb" Dec 06 05:47:06 crc kubenswrapper[4733]: I1206 05:47:06.933281 4733 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 06 05:47:06 crc kubenswrapper[4733]: I1206 05:47:06.933294 4733 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 06 05:47:06 crc kubenswrapper[4733]: I1206 05:47:06.933321 4733 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 06 05:47:06 crc kubenswrapper[4733]: I1206 05:47:06.933402 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "3cf2106f-7c73-4086-bc49-fa1b11f2e56f" (UID: "3cf2106f-7c73-4086-bc49-fa1b11f2e56f"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:47:06 crc kubenswrapper[4733]: I1206 05:47:06.933736 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "3cf2106f-7c73-4086-bc49-fa1b11f2e56f" (UID: "3cf2106f-7c73-4086-bc49-fa1b11f2e56f"). 
InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:47:06 crc kubenswrapper[4733]: I1206 05:47:06.939444 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-kube-api-access-lw2wq" (OuterVolumeSpecName: "kube-api-access-lw2wq") pod "3cf2106f-7c73-4086-bc49-fa1b11f2e56f" (UID: "3cf2106f-7c73-4086-bc49-fa1b11f2e56f"). InnerVolumeSpecName "kube-api-access-lw2wq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:47:06 crc kubenswrapper[4733]: I1206 05:47:06.939921 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "3cf2106f-7c73-4086-bc49-fa1b11f2e56f" (UID: "3cf2106f-7c73-4086-bc49-fa1b11f2e56f"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:47:06 crc kubenswrapper[4733]: I1206 05:47:06.940059 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "3cf2106f-7c73-4086-bc49-fa1b11f2e56f" (UID: "3cf2106f-7c73-4086-bc49-fa1b11f2e56f"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:47:06 crc kubenswrapper[4733]: I1206 05:47:06.940646 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "3cf2106f-7c73-4086-bc49-fa1b11f2e56f" (UID: "3cf2106f-7c73-4086-bc49-fa1b11f2e56f"). 
InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:47:06 crc kubenswrapper[4733]: I1206 05:47:06.940847 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "3cf2106f-7c73-4086-bc49-fa1b11f2e56f" (UID: "3cf2106f-7c73-4086-bc49-fa1b11f2e56f"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:47:06 crc kubenswrapper[4733]: I1206 05:47:06.942445 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "3cf2106f-7c73-4086-bc49-fa1b11f2e56f" (UID: "3cf2106f-7c73-4086-bc49-fa1b11f2e56f"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:47:06 crc kubenswrapper[4733]: I1206 05:47:06.943197 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "3cf2106f-7c73-4086-bc49-fa1b11f2e56f" (UID: "3cf2106f-7c73-4086-bc49-fa1b11f2e56f"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:47:06 crc kubenswrapper[4733]: I1206 05:47:06.943787 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "3cf2106f-7c73-4086-bc49-fa1b11f2e56f" (UID: "3cf2106f-7c73-4086-bc49-fa1b11f2e56f"). 
InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:47:06 crc kubenswrapper[4733]: I1206 05:47:06.944347 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "3cf2106f-7c73-4086-bc49-fa1b11f2e56f" (UID: "3cf2106f-7c73-4086-bc49-fa1b11f2e56f"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:47:07 crc kubenswrapper[4733]: I1206 05:47:07.034342 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b92fe062-ed39-46a5-b9fe-2e8443e565ec-v4-0-config-system-service-ca\") pod \"oauth-openshift-8476cd6899-q2pdb\" (UID: \"b92fe062-ed39-46a5-b9fe-2e8443e565ec\") " pod="openshift-authentication/oauth-openshift-8476cd6899-q2pdb" Dec 06 05:47:07 crc kubenswrapper[4733]: I1206 05:47:07.034396 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b92fe062-ed39-46a5-b9fe-2e8443e565ec-v4-0-config-user-template-login\") pod \"oauth-openshift-8476cd6899-q2pdb\" (UID: \"b92fe062-ed39-46a5-b9fe-2e8443e565ec\") " pod="openshift-authentication/oauth-openshift-8476cd6899-q2pdb" Dec 06 05:47:07 crc kubenswrapper[4733]: I1206 05:47:07.034427 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b92fe062-ed39-46a5-b9fe-2e8443e565ec-v4-0-config-user-template-error\") pod \"oauth-openshift-8476cd6899-q2pdb\" (UID: \"b92fe062-ed39-46a5-b9fe-2e8443e565ec\") " pod="openshift-authentication/oauth-openshift-8476cd6899-q2pdb" Dec 06 05:47:07 crc kubenswrapper[4733]: I1206 05:47:07.034460 4733 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b92fe062-ed39-46a5-b9fe-2e8443e565ec-v4-0-config-system-cliconfig\") pod \"oauth-openshift-8476cd6899-q2pdb\" (UID: \"b92fe062-ed39-46a5-b9fe-2e8443e565ec\") " pod="openshift-authentication/oauth-openshift-8476cd6899-q2pdb" Dec 06 05:47:07 crc kubenswrapper[4733]: I1206 05:47:07.034486 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b92fe062-ed39-46a5-b9fe-2e8443e565ec-audit-policies\") pod \"oauth-openshift-8476cd6899-q2pdb\" (UID: \"b92fe062-ed39-46a5-b9fe-2e8443e565ec\") " pod="openshift-authentication/oauth-openshift-8476cd6899-q2pdb" Dec 06 05:47:07 crc kubenswrapper[4733]: I1206 05:47:07.034505 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b92fe062-ed39-46a5-b9fe-2e8443e565ec-v4-0-config-system-session\") pod \"oauth-openshift-8476cd6899-q2pdb\" (UID: \"b92fe062-ed39-46a5-b9fe-2e8443e565ec\") " pod="openshift-authentication/oauth-openshift-8476cd6899-q2pdb" Dec 06 05:47:07 crc kubenswrapper[4733]: I1206 05:47:07.034535 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b92fe062-ed39-46a5-b9fe-2e8443e565ec-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-8476cd6899-q2pdb\" (UID: \"b92fe062-ed39-46a5-b9fe-2e8443e565ec\") " pod="openshift-authentication/oauth-openshift-8476cd6899-q2pdb" Dec 06 05:47:07 crc kubenswrapper[4733]: I1206 05:47:07.034556 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b92fe062-ed39-46a5-b9fe-2e8443e565ec-v4-0-config-system-router-certs\") pod 
\"oauth-openshift-8476cd6899-q2pdb\" (UID: \"b92fe062-ed39-46a5-b9fe-2e8443e565ec\") " pod="openshift-authentication/oauth-openshift-8476cd6899-q2pdb" Dec 06 05:47:07 crc kubenswrapper[4733]: I1206 05:47:07.034583 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b92fe062-ed39-46a5-b9fe-2e8443e565ec-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-8476cd6899-q2pdb\" (UID: \"b92fe062-ed39-46a5-b9fe-2e8443e565ec\") " pod="openshift-authentication/oauth-openshift-8476cd6899-q2pdb" Dec 06 05:47:07 crc kubenswrapper[4733]: I1206 05:47:07.034622 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b92fe062-ed39-46a5-b9fe-2e8443e565ec-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-8476cd6899-q2pdb\" (UID: \"b92fe062-ed39-46a5-b9fe-2e8443e565ec\") " pod="openshift-authentication/oauth-openshift-8476cd6899-q2pdb" Dec 06 05:47:07 crc kubenswrapper[4733]: I1206 05:47:07.034645 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b92fe062-ed39-46a5-b9fe-2e8443e565ec-audit-dir\") pod \"oauth-openshift-8476cd6899-q2pdb\" (UID: \"b92fe062-ed39-46a5-b9fe-2e8443e565ec\") " pod="openshift-authentication/oauth-openshift-8476cd6899-q2pdb" Dec 06 05:47:07 crc kubenswrapper[4733]: I1206 05:47:07.034674 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98gth\" (UniqueName: \"kubernetes.io/projected/b92fe062-ed39-46a5-b9fe-2e8443e565ec-kube-api-access-98gth\") pod \"oauth-openshift-8476cd6899-q2pdb\" (UID: \"b92fe062-ed39-46a5-b9fe-2e8443e565ec\") " pod="openshift-authentication/oauth-openshift-8476cd6899-q2pdb" Dec 06 05:47:07 crc kubenswrapper[4733]: I1206 05:47:07.034703 4733 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b92fe062-ed39-46a5-b9fe-2e8443e565ec-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-8476cd6899-q2pdb\" (UID: \"b92fe062-ed39-46a5-b9fe-2e8443e565ec\") " pod="openshift-authentication/oauth-openshift-8476cd6899-q2pdb" Dec 06 05:47:07 crc kubenswrapper[4733]: I1206 05:47:07.034734 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b92fe062-ed39-46a5-b9fe-2e8443e565ec-v4-0-config-system-serving-cert\") pod \"oauth-openshift-8476cd6899-q2pdb\" (UID: \"b92fe062-ed39-46a5-b9fe-2e8443e565ec\") " pod="openshift-authentication/oauth-openshift-8476cd6899-q2pdb" Dec 06 05:47:07 crc kubenswrapper[4733]: I1206 05:47:07.034797 4733 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 06 05:47:07 crc kubenswrapper[4733]: I1206 05:47:07.034812 4733 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 06 05:47:07 crc kubenswrapper[4733]: I1206 05:47:07.034824 4733 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 06 05:47:07 crc kubenswrapper[4733]: I1206 05:47:07.034836 4733 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 05:47:07 crc kubenswrapper[4733]: I1206 05:47:07.034847 4733 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 06 05:47:07 crc kubenswrapper[4733]: I1206 05:47:07.034862 4733 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 06 05:47:07 crc kubenswrapper[4733]: I1206 05:47:07.034872 4733 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 06 05:47:07 crc kubenswrapper[4733]: I1206 05:47:07.034883 4733 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 05:47:07 crc kubenswrapper[4733]: I1206 05:47:07.034897 4733 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 06 05:47:07 crc kubenswrapper[4733]: I1206 05:47:07.034910 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lw2wq\" (UniqueName: \"kubernetes.io/projected/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-kube-api-access-lw2wq\") on node \"crc\" DevicePath \"\"" Dec 06 05:47:07 crc kubenswrapper[4733]: I1206 05:47:07.034924 
4733 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3cf2106f-7c73-4086-bc49-fa1b11f2e56f-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 06 05:47:07 crc kubenswrapper[4733]: I1206 05:47:07.035171 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b92fe062-ed39-46a5-b9fe-2e8443e565ec-v4-0-config-system-service-ca\") pod \"oauth-openshift-8476cd6899-q2pdb\" (UID: \"b92fe062-ed39-46a5-b9fe-2e8443e565ec\") " pod="openshift-authentication/oauth-openshift-8476cd6899-q2pdb" Dec 06 05:47:07 crc kubenswrapper[4733]: I1206 05:47:07.035559 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b92fe062-ed39-46a5-b9fe-2e8443e565ec-audit-dir\") pod \"oauth-openshift-8476cd6899-q2pdb\" (UID: \"b92fe062-ed39-46a5-b9fe-2e8443e565ec\") " pod="openshift-authentication/oauth-openshift-8476cd6899-q2pdb" Dec 06 05:47:07 crc kubenswrapper[4733]: I1206 05:47:07.035726 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b92fe062-ed39-46a5-b9fe-2e8443e565ec-audit-policies\") pod \"oauth-openshift-8476cd6899-q2pdb\" (UID: \"b92fe062-ed39-46a5-b9fe-2e8443e565ec\") " pod="openshift-authentication/oauth-openshift-8476cd6899-q2pdb" Dec 06 05:47:07 crc kubenswrapper[4733]: I1206 05:47:07.036221 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b92fe062-ed39-46a5-b9fe-2e8443e565ec-v4-0-config-system-cliconfig\") pod \"oauth-openshift-8476cd6899-q2pdb\" (UID: \"b92fe062-ed39-46a5-b9fe-2e8443e565ec\") " pod="openshift-authentication/oauth-openshift-8476cd6899-q2pdb" Dec 06 05:47:07 crc kubenswrapper[4733]: I1206 05:47:07.038111 4733 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b92fe062-ed39-46a5-b9fe-2e8443e565ec-v4-0-config-user-template-error\") pod \"oauth-openshift-8476cd6899-q2pdb\" (UID: \"b92fe062-ed39-46a5-b9fe-2e8443e565ec\") " pod="openshift-authentication/oauth-openshift-8476cd6899-q2pdb" Dec 06 05:47:07 crc kubenswrapper[4733]: I1206 05:47:07.038229 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b92fe062-ed39-46a5-b9fe-2e8443e565ec-v4-0-config-system-session\") pod \"oauth-openshift-8476cd6899-q2pdb\" (UID: \"b92fe062-ed39-46a5-b9fe-2e8443e565ec\") " pod="openshift-authentication/oauth-openshift-8476cd6899-q2pdb" Dec 06 05:47:07 crc kubenswrapper[4733]: I1206 05:47:07.038276 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b92fe062-ed39-46a5-b9fe-2e8443e565ec-v4-0-config-system-serving-cert\") pod \"oauth-openshift-8476cd6899-q2pdb\" (UID: \"b92fe062-ed39-46a5-b9fe-2e8443e565ec\") " pod="openshift-authentication/oauth-openshift-8476cd6899-q2pdb" Dec 06 05:47:07 crc kubenswrapper[4733]: I1206 05:47:07.038277 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b92fe062-ed39-46a5-b9fe-2e8443e565ec-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-8476cd6899-q2pdb\" (UID: \"b92fe062-ed39-46a5-b9fe-2e8443e565ec\") " pod="openshift-authentication/oauth-openshift-8476cd6899-q2pdb" Dec 06 05:47:07 crc kubenswrapper[4733]: I1206 05:47:07.038441 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b92fe062-ed39-46a5-b9fe-2e8443e565ec-v4-0-config-user-template-login\") pod \"oauth-openshift-8476cd6899-q2pdb\" (UID: 
\"b92fe062-ed39-46a5-b9fe-2e8443e565ec\") " pod="openshift-authentication/oauth-openshift-8476cd6899-q2pdb" Dec 06 05:47:07 crc kubenswrapper[4733]: I1206 05:47:07.038942 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b92fe062-ed39-46a5-b9fe-2e8443e565ec-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-8476cd6899-q2pdb\" (UID: \"b92fe062-ed39-46a5-b9fe-2e8443e565ec\") " pod="openshift-authentication/oauth-openshift-8476cd6899-q2pdb" Dec 06 05:47:07 crc kubenswrapper[4733]: I1206 05:47:07.039383 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b92fe062-ed39-46a5-b9fe-2e8443e565ec-v4-0-config-system-router-certs\") pod \"oauth-openshift-8476cd6899-q2pdb\" (UID: \"b92fe062-ed39-46a5-b9fe-2e8443e565ec\") " pod="openshift-authentication/oauth-openshift-8476cd6899-q2pdb" Dec 06 05:47:07 crc kubenswrapper[4733]: I1206 05:47:07.039627 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b92fe062-ed39-46a5-b9fe-2e8443e565ec-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-8476cd6899-q2pdb\" (UID: \"b92fe062-ed39-46a5-b9fe-2e8443e565ec\") " pod="openshift-authentication/oauth-openshift-8476cd6899-q2pdb" Dec 06 05:47:07 crc kubenswrapper[4733]: I1206 05:47:07.040589 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b92fe062-ed39-46a5-b9fe-2e8443e565ec-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-8476cd6899-q2pdb\" (UID: \"b92fe062-ed39-46a5-b9fe-2e8443e565ec\") " pod="openshift-authentication/oauth-openshift-8476cd6899-q2pdb" Dec 06 05:47:07 crc kubenswrapper[4733]: I1206 05:47:07.049833 4733 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-98gth\" (UniqueName: \"kubernetes.io/projected/b92fe062-ed39-46a5-b9fe-2e8443e565ec-kube-api-access-98gth\") pod \"oauth-openshift-8476cd6899-q2pdb\" (UID: \"b92fe062-ed39-46a5-b9fe-2e8443e565ec\") " pod="openshift-authentication/oauth-openshift-8476cd6899-q2pdb" Dec 06 05:47:07 crc kubenswrapper[4733]: I1206 05:47:07.147405 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-8476cd6899-q2pdb" Dec 06 05:47:07 crc kubenswrapper[4733]: I1206 05:47:07.297238 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-8476cd6899-q2pdb"] Dec 06 05:47:07 crc kubenswrapper[4733]: I1206 05:47:07.314386 4733 generic.go:334] "Generic (PLEG): container finished" podID="3cf2106f-7c73-4086-bc49-fa1b11f2e56f" containerID="0db7972d7003eb14ebc2f889b7a66ee9c34c11513b62b24a6667c6621d1f43b4" exitCode=0 Dec 06 05:47:07 crc kubenswrapper[4733]: I1206 05:47:07.314436 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-l8cj4" Dec 06 05:47:07 crc kubenswrapper[4733]: I1206 05:47:07.314502 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-l8cj4" event={"ID":"3cf2106f-7c73-4086-bc49-fa1b11f2e56f","Type":"ContainerDied","Data":"0db7972d7003eb14ebc2f889b7a66ee9c34c11513b62b24a6667c6621d1f43b4"} Dec 06 05:47:07 crc kubenswrapper[4733]: I1206 05:47:07.314558 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-l8cj4" event={"ID":"3cf2106f-7c73-4086-bc49-fa1b11f2e56f","Type":"ContainerDied","Data":"05fde8fc82a8f83cfd8e76f124e6b7536ceac2288dd2ccfa0e4b58b6d32f2b95"} Dec 06 05:47:07 crc kubenswrapper[4733]: I1206 05:47:07.314586 4733 scope.go:117] "RemoveContainer" containerID="0db7972d7003eb14ebc2f889b7a66ee9c34c11513b62b24a6667c6621d1f43b4" Dec 06 05:47:07 crc kubenswrapper[4733]: I1206 05:47:07.315550 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-8476cd6899-q2pdb" event={"ID":"b92fe062-ed39-46a5-b9fe-2e8443e565ec","Type":"ContainerStarted","Data":"a7072a79109647dea250f5bda645e0b795f50d29c418454834dd9aaf6cbfc74d"} Dec 06 05:47:07 crc kubenswrapper[4733]: I1206 05:47:07.330259 4733 scope.go:117] "RemoveContainer" containerID="0db7972d7003eb14ebc2f889b7a66ee9c34c11513b62b24a6667c6621d1f43b4" Dec 06 05:47:07 crc kubenswrapper[4733]: E1206 05:47:07.330864 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0db7972d7003eb14ebc2f889b7a66ee9c34c11513b62b24a6667c6621d1f43b4\": container with ID starting with 0db7972d7003eb14ebc2f889b7a66ee9c34c11513b62b24a6667c6621d1f43b4 not found: ID does not exist" containerID="0db7972d7003eb14ebc2f889b7a66ee9c34c11513b62b24a6667c6621d1f43b4" Dec 06 05:47:07 crc kubenswrapper[4733]: I1206 05:47:07.330911 4733 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0db7972d7003eb14ebc2f889b7a66ee9c34c11513b62b24a6667c6621d1f43b4"} err="failed to get container status \"0db7972d7003eb14ebc2f889b7a66ee9c34c11513b62b24a6667c6621d1f43b4\": rpc error: code = NotFound desc = could not find container \"0db7972d7003eb14ebc2f889b7a66ee9c34c11513b62b24a6667c6621d1f43b4\": container with ID starting with 0db7972d7003eb14ebc2f889b7a66ee9c34c11513b62b24a6667c6621d1f43b4 not found: ID does not exist" Dec 06 05:47:07 crc kubenswrapper[4733]: I1206 05:47:07.340048 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-l8cj4"] Dec 06 05:47:07 crc kubenswrapper[4733]: I1206 05:47:07.343948 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-l8cj4"] Dec 06 05:47:08 crc kubenswrapper[4733]: I1206 05:47:08.323108 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-8476cd6899-q2pdb" event={"ID":"b92fe062-ed39-46a5-b9fe-2e8443e565ec","Type":"ContainerStarted","Data":"bb03be8ed988e19332aec1b5f53de0fbbff8a0957aef4f1cecac0e3e1b0edbc8"} Dec 06 05:47:08 crc kubenswrapper[4733]: I1206 05:47:08.323414 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-8476cd6899-q2pdb" Dec 06 05:47:08 crc kubenswrapper[4733]: I1206 05:47:08.328502 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-8476cd6899-q2pdb" Dec 06 05:47:08 crc kubenswrapper[4733]: I1206 05:47:08.342270 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-8476cd6899-q2pdb" podStartSLOduration=27.342256189 podStartE2EDuration="27.342256189s" podCreationTimestamp="2025-12-06 05:46:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:47:08.338853562 +0000 UTC m=+212.204064673" watchObservedRunningTime="2025-12-06 05:47:08.342256189 +0000 UTC m=+212.207467300" Dec 06 05:47:08 crc kubenswrapper[4733]: I1206 05:47:08.491137 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cf2106f-7c73-4086-bc49-fa1b11f2e56f" path="/var/lib/kubelet/pods/3cf2106f-7c73-4086-bc49-fa1b11f2e56f/volumes" Dec 06 05:47:12 crc kubenswrapper[4733]: I1206 05:47:12.988986 4733 patch_prober.go:28] interesting pod/machine-config-daemon-g7qjx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 05:47:12 crc kubenswrapper[4733]: I1206 05:47:12.989336 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 05:47:12 crc kubenswrapper[4733]: I1206 05:47:12.989386 4733 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" Dec 06 05:47:12 crc kubenswrapper[4733]: I1206 05:47:12.989935 4733 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"61a23652af66be599ba9357cb31709e7b4a3f0e4767c758617e6cc5cd9b43941"} pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 05:47:12 crc kubenswrapper[4733]: I1206 05:47:12.989985 4733 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerName="machine-config-daemon" containerID="cri-o://61a23652af66be599ba9357cb31709e7b4a3f0e4767c758617e6cc5cd9b43941" gracePeriod=600 Dec 06 05:47:13 crc kubenswrapper[4733]: I1206 05:47:13.352494 4733 generic.go:334] "Generic (PLEG): container finished" podID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerID="61a23652af66be599ba9357cb31709e7b4a3f0e4767c758617e6cc5cd9b43941" exitCode=0 Dec 06 05:47:13 crc kubenswrapper[4733]: I1206 05:47:13.352578 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" event={"ID":"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448","Type":"ContainerDied","Data":"61a23652af66be599ba9357cb31709e7b4a3f0e4767c758617e6cc5cd9b43941"} Dec 06 05:47:13 crc kubenswrapper[4733]: I1206 05:47:13.352828 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" event={"ID":"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448","Type":"ContainerStarted","Data":"50470b50bca695b7d51dc24f892cb10e96f186fcba10fdad5ebd2bd169d01d77"} Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.226822 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lblzc"] Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.227534 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-lblzc" podUID="b4bfb477-5389-4827-91d8-cfd61ad2d5f8" containerName="registry-server" containerID="cri-o://072a74830979f13631869cb072cf4c3477922a1d292421e95083e90aa5ce3552" gracePeriod=30 Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.236226 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jfh8w"] Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.236449 4733 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-marketplace/community-operators-jfh8w" podUID="71e51a04-4769-45b0-87b8-7292977ec73b" containerName="registry-server" containerID="cri-o://7a89af00601f2e8eaa39ceb69fc5035df1f45a4bcd8a7c93323a6bb802deeb9e" gracePeriod=30 Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.240994 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-fbnvh"] Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.241163 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-fbnvh" podUID="76185470-be08-49f9-ab30-59314702bc08" containerName="marketplace-operator" containerID="cri-o://ce8efb373b2a9d6229dc40f7cfec2885a85447019407c767a164b7240d6d62dc" gracePeriod=30 Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.254797 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xq44c"] Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.259541 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tzdr7"] Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.259695 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tzdr7" podUID="139eeaf9-21f6-4032-9b24-73534a803ca5" containerName="registry-server" containerID="cri-o://b1b08ade88ae154f120a0004457affe6f9f7c2afd960019a49a9127c647ba8ba" gracePeriod=30 Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.259795 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xq44c" Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.263406 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xq44c"] Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.265491 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bvjmk"] Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.265628 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bvjmk" podUID="938df03e-d0d5-4b93-9a31-061262420f18" containerName="registry-server" containerID="cri-o://0be243c1fa38fcb2b039712c90d832a0632685afac32c9281d832d627166860f" gracePeriod=30 Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.285765 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/477154e1-6166-41c9-beb3-1248e1583324-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xq44c\" (UID: \"477154e1-6166-41c9-beb3-1248e1583324\") " pod="openshift-marketplace/marketplace-operator-79b997595-xq44c" Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.285811 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtf5v\" (UniqueName: \"kubernetes.io/projected/477154e1-6166-41c9-beb3-1248e1583324-kube-api-access-vtf5v\") pod \"marketplace-operator-79b997595-xq44c\" (UID: \"477154e1-6166-41c9-beb3-1248e1583324\") " pod="openshift-marketplace/marketplace-operator-79b997595-xq44c" Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.285833 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/477154e1-6166-41c9-beb3-1248e1583324-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xq44c\" (UID: \"477154e1-6166-41c9-beb3-1248e1583324\") " pod="openshift-marketplace/marketplace-operator-79b997595-xq44c" Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.390777 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/477154e1-6166-41c9-beb3-1248e1583324-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xq44c\" (UID: \"477154e1-6166-41c9-beb3-1248e1583324\") " pod="openshift-marketplace/marketplace-operator-79b997595-xq44c" Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.390818 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtf5v\" (UniqueName: \"kubernetes.io/projected/477154e1-6166-41c9-beb3-1248e1583324-kube-api-access-vtf5v\") pod \"marketplace-operator-79b997595-xq44c\" (UID: \"477154e1-6166-41c9-beb3-1248e1583324\") " pod="openshift-marketplace/marketplace-operator-79b997595-xq44c" Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.390839 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/477154e1-6166-41c9-beb3-1248e1583324-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xq44c\" (UID: \"477154e1-6166-41c9-beb3-1248e1583324\") " pod="openshift-marketplace/marketplace-operator-79b997595-xq44c" Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.392434 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/477154e1-6166-41c9-beb3-1248e1583324-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xq44c\" (UID: \"477154e1-6166-41c9-beb3-1248e1583324\") " pod="openshift-marketplace/marketplace-operator-79b997595-xq44c" Dec 06 05:47:20 
crc kubenswrapper[4733]: I1206 05:47:20.398513 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/477154e1-6166-41c9-beb3-1248e1583324-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xq44c\" (UID: \"477154e1-6166-41c9-beb3-1248e1583324\") " pod="openshift-marketplace/marketplace-operator-79b997595-xq44c" Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.408643 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtf5v\" (UniqueName: \"kubernetes.io/projected/477154e1-6166-41c9-beb3-1248e1583324-kube-api-access-vtf5v\") pod \"marketplace-operator-79b997595-xq44c\" (UID: \"477154e1-6166-41c9-beb3-1248e1583324\") " pod="openshift-marketplace/marketplace-operator-79b997595-xq44c" Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.410750 4733 generic.go:334] "Generic (PLEG): container finished" podID="b4bfb477-5389-4827-91d8-cfd61ad2d5f8" containerID="072a74830979f13631869cb072cf4c3477922a1d292421e95083e90aa5ce3552" exitCode=0 Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.410813 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lblzc" event={"ID":"b4bfb477-5389-4827-91d8-cfd61ad2d5f8","Type":"ContainerDied","Data":"072a74830979f13631869cb072cf4c3477922a1d292421e95083e90aa5ce3552"} Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.420394 4733 generic.go:334] "Generic (PLEG): container finished" podID="71e51a04-4769-45b0-87b8-7292977ec73b" containerID="7a89af00601f2e8eaa39ceb69fc5035df1f45a4bcd8a7c93323a6bb802deeb9e" exitCode=0 Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.420450 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jfh8w" event={"ID":"71e51a04-4769-45b0-87b8-7292977ec73b","Type":"ContainerDied","Data":"7a89af00601f2e8eaa39ceb69fc5035df1f45a4bcd8a7c93323a6bb802deeb9e"} Dec 
06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.421824 4733 generic.go:334] "Generic (PLEG): container finished" podID="139eeaf9-21f6-4032-9b24-73534a803ca5" containerID="b1b08ade88ae154f120a0004457affe6f9f7c2afd960019a49a9127c647ba8ba" exitCode=0 Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.421863 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tzdr7" event={"ID":"139eeaf9-21f6-4032-9b24-73534a803ca5","Type":"ContainerDied","Data":"b1b08ade88ae154f120a0004457affe6f9f7c2afd960019a49a9127c647ba8ba"} Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.428737 4733 generic.go:334] "Generic (PLEG): container finished" podID="938df03e-d0d5-4b93-9a31-061262420f18" containerID="0be243c1fa38fcb2b039712c90d832a0632685afac32c9281d832d627166860f" exitCode=0 Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.428797 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bvjmk" event={"ID":"938df03e-d0d5-4b93-9a31-061262420f18","Type":"ContainerDied","Data":"0be243c1fa38fcb2b039712c90d832a0632685afac32c9281d832d627166860f"} Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.429702 4733 generic.go:334] "Generic (PLEG): container finished" podID="76185470-be08-49f9-ab30-59314702bc08" containerID="ce8efb373b2a9d6229dc40f7cfec2885a85447019407c767a164b7240d6d62dc" exitCode=0 Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.429721 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-fbnvh" event={"ID":"76185470-be08-49f9-ab30-59314702bc08","Type":"ContainerDied","Data":"ce8efb373b2a9d6229dc40f7cfec2885a85447019407c767a164b7240d6d62dc"} Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.579887 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xq44c" Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.659414 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lblzc" Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.663334 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tzdr7" Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.666761 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bvjmk" Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.671128 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-fbnvh" Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.698572 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4bfb477-5389-4827-91d8-cfd61ad2d5f8-catalog-content\") pod \"b4bfb477-5389-4827-91d8-cfd61ad2d5f8\" (UID: \"b4bfb477-5389-4827-91d8-cfd61ad2d5f8\") " Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.700023 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/76185470-be08-49f9-ab30-59314702bc08-marketplace-trusted-ca\") pod \"76185470-be08-49f9-ab30-59314702bc08\" (UID: \"76185470-be08-49f9-ab30-59314702bc08\") " Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.700070 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/76185470-be08-49f9-ab30-59314702bc08-marketplace-operator-metrics\") pod \"76185470-be08-49f9-ab30-59314702bc08\" (UID: \"76185470-be08-49f9-ab30-59314702bc08\") " 
Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.700090 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/938df03e-d0d5-4b93-9a31-061262420f18-utilities\") pod \"938df03e-d0d5-4b93-9a31-061262420f18\" (UID: \"938df03e-d0d5-4b93-9a31-061262420f18\") " Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.700119 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9279\" (UniqueName: \"kubernetes.io/projected/b4bfb477-5389-4827-91d8-cfd61ad2d5f8-kube-api-access-f9279\") pod \"b4bfb477-5389-4827-91d8-cfd61ad2d5f8\" (UID: \"b4bfb477-5389-4827-91d8-cfd61ad2d5f8\") " Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.700138 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/938df03e-d0d5-4b93-9a31-061262420f18-catalog-content\") pod \"938df03e-d0d5-4b93-9a31-061262420f18\" (UID: \"938df03e-d0d5-4b93-9a31-061262420f18\") " Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.700160 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7z7ft\" (UniqueName: \"kubernetes.io/projected/139eeaf9-21f6-4032-9b24-73534a803ca5-kube-api-access-7z7ft\") pod \"139eeaf9-21f6-4032-9b24-73534a803ca5\" (UID: \"139eeaf9-21f6-4032-9b24-73534a803ca5\") " Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.700209 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfpq6\" (UniqueName: \"kubernetes.io/projected/76185470-be08-49f9-ab30-59314702bc08-kube-api-access-gfpq6\") pod \"76185470-be08-49f9-ab30-59314702bc08\" (UID: \"76185470-be08-49f9-ab30-59314702bc08\") " Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.700236 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/139eeaf9-21f6-4032-9b24-73534a803ca5-catalog-content\") pod \"139eeaf9-21f6-4032-9b24-73534a803ca5\" (UID: \"139eeaf9-21f6-4032-9b24-73534a803ca5\") " Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.701825 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76185470-be08-49f9-ab30-59314702bc08-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "76185470-be08-49f9-ab30-59314702bc08" (UID: "76185470-be08-49f9-ab30-59314702bc08"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.705900 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/938df03e-d0d5-4b93-9a31-061262420f18-utilities" (OuterVolumeSpecName: "utilities") pod "938df03e-d0d5-4b93-9a31-061262420f18" (UID: "938df03e-d0d5-4b93-9a31-061262420f18"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.707686 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jfh8w" Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.708675 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76185470-be08-49f9-ab30-59314702bc08-kube-api-access-gfpq6" (OuterVolumeSpecName: "kube-api-access-gfpq6") pod "76185470-be08-49f9-ab30-59314702bc08" (UID: "76185470-be08-49f9-ab30-59314702bc08"). InnerVolumeSpecName "kube-api-access-gfpq6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.724355 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/139eeaf9-21f6-4032-9b24-73534a803ca5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "139eeaf9-21f6-4032-9b24-73534a803ca5" (UID: "139eeaf9-21f6-4032-9b24-73534a803ca5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.737986 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/139eeaf9-21f6-4032-9b24-73534a803ca5-kube-api-access-7z7ft" (OuterVolumeSpecName: "kube-api-access-7z7ft") pod "139eeaf9-21f6-4032-9b24-73534a803ca5" (UID: "139eeaf9-21f6-4032-9b24-73534a803ca5"). InnerVolumeSpecName "kube-api-access-7z7ft". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.743813 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4bfb477-5389-4827-91d8-cfd61ad2d5f8-kube-api-access-f9279" (OuterVolumeSpecName: "kube-api-access-f9279") pod "b4bfb477-5389-4827-91d8-cfd61ad2d5f8" (UID: "b4bfb477-5389-4827-91d8-cfd61ad2d5f8"). InnerVolumeSpecName "kube-api-access-f9279". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.744253 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76185470-be08-49f9-ab30-59314702bc08-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "76185470-be08-49f9-ab30-59314702bc08" (UID: "76185470-be08-49f9-ab30-59314702bc08"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.759272 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4bfb477-5389-4827-91d8-cfd61ad2d5f8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b4bfb477-5389-4827-91d8-cfd61ad2d5f8" (UID: "b4bfb477-5389-4827-91d8-cfd61ad2d5f8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.801635 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7tmx\" (UniqueName: \"kubernetes.io/projected/938df03e-d0d5-4b93-9a31-061262420f18-kube-api-access-j7tmx\") pod \"938df03e-d0d5-4b93-9a31-061262420f18\" (UID: \"938df03e-d0d5-4b93-9a31-061262420f18\") " Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.801672 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71e51a04-4769-45b0-87b8-7292977ec73b-catalog-content\") pod \"71e51a04-4769-45b0-87b8-7292977ec73b\" (UID: \"71e51a04-4769-45b0-87b8-7292977ec73b\") " Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.802133 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4bfb477-5389-4827-91d8-cfd61ad2d5f8-utilities\") pod \"b4bfb477-5389-4827-91d8-cfd61ad2d5f8\" (UID: \"b4bfb477-5389-4827-91d8-cfd61ad2d5f8\") " Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.802159 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/139eeaf9-21f6-4032-9b24-73534a803ca5-utilities\") pod \"139eeaf9-21f6-4032-9b24-73534a803ca5\" (UID: \"139eeaf9-21f6-4032-9b24-73534a803ca5\") " Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.802181 4733 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71e51a04-4769-45b0-87b8-7292977ec73b-utilities\") pod \"71e51a04-4769-45b0-87b8-7292977ec73b\" (UID: \"71e51a04-4769-45b0-87b8-7292977ec73b\") " Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.802206 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2r5k\" (UniqueName: \"kubernetes.io/projected/71e51a04-4769-45b0-87b8-7292977ec73b-kube-api-access-f2r5k\") pod \"71e51a04-4769-45b0-87b8-7292977ec73b\" (UID: \"71e51a04-4769-45b0-87b8-7292977ec73b\") " Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.802709 4733 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4bfb477-5389-4827-91d8-cfd61ad2d5f8-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.802725 4733 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/76185470-be08-49f9-ab30-59314702bc08-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.802735 4733 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/76185470-be08-49f9-ab30-59314702bc08-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.802744 4733 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/938df03e-d0d5-4b93-9a31-061262420f18-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.802752 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9279\" (UniqueName: \"kubernetes.io/projected/b4bfb477-5389-4827-91d8-cfd61ad2d5f8-kube-api-access-f9279\") 
on node \"crc\" DevicePath \"\"" Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.802761 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7z7ft\" (UniqueName: \"kubernetes.io/projected/139eeaf9-21f6-4032-9b24-73534a803ca5-kube-api-access-7z7ft\") on node \"crc\" DevicePath \"\"" Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.802770 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfpq6\" (UniqueName: \"kubernetes.io/projected/76185470-be08-49f9-ab30-59314702bc08-kube-api-access-gfpq6\") on node \"crc\" DevicePath \"\"" Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.802778 4733 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/139eeaf9-21f6-4032-9b24-73534a803ca5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.803264 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71e51a04-4769-45b0-87b8-7292977ec73b-utilities" (OuterVolumeSpecName: "utilities") pod "71e51a04-4769-45b0-87b8-7292977ec73b" (UID: "71e51a04-4769-45b0-87b8-7292977ec73b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.803257 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4bfb477-5389-4827-91d8-cfd61ad2d5f8-utilities" (OuterVolumeSpecName: "utilities") pod "b4bfb477-5389-4827-91d8-cfd61ad2d5f8" (UID: "b4bfb477-5389-4827-91d8-cfd61ad2d5f8"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.803398 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/139eeaf9-21f6-4032-9b24-73534a803ca5-utilities" (OuterVolumeSpecName: "utilities") pod "139eeaf9-21f6-4032-9b24-73534a803ca5" (UID: "139eeaf9-21f6-4032-9b24-73534a803ca5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.804953 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71e51a04-4769-45b0-87b8-7292977ec73b-kube-api-access-f2r5k" (OuterVolumeSpecName: "kube-api-access-f2r5k") pod "71e51a04-4769-45b0-87b8-7292977ec73b" (UID: "71e51a04-4769-45b0-87b8-7292977ec73b"). InnerVolumeSpecName "kube-api-access-f2r5k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.810184 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/938df03e-d0d5-4b93-9a31-061262420f18-kube-api-access-j7tmx" (OuterVolumeSpecName: "kube-api-access-j7tmx") pod "938df03e-d0d5-4b93-9a31-061262420f18" (UID: "938df03e-d0d5-4b93-9a31-061262420f18"). InnerVolumeSpecName "kube-api-access-j7tmx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.829613 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/938df03e-d0d5-4b93-9a31-061262420f18-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "938df03e-d0d5-4b93-9a31-061262420f18" (UID: "938df03e-d0d5-4b93-9a31-061262420f18"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.851403 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71e51a04-4769-45b0-87b8-7292977ec73b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "71e51a04-4769-45b0-87b8-7292977ec73b" (UID: "71e51a04-4769-45b0-87b8-7292977ec73b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.903818 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7tmx\" (UniqueName: \"kubernetes.io/projected/938df03e-d0d5-4b93-9a31-061262420f18-kube-api-access-j7tmx\") on node \"crc\" DevicePath \"\"" Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.903853 4733 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71e51a04-4769-45b0-87b8-7292977ec73b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.903865 4733 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4bfb477-5389-4827-91d8-cfd61ad2d5f8-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.903877 4733 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/139eeaf9-21f6-4032-9b24-73534a803ca5-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.903888 4733 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71e51a04-4769-45b0-87b8-7292977ec73b-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.903898 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2r5k\" (UniqueName: 
\"kubernetes.io/projected/71e51a04-4769-45b0-87b8-7292977ec73b-kube-api-access-f2r5k\") on node \"crc\" DevicePath \"\"" Dec 06 05:47:20 crc kubenswrapper[4733]: I1206 05:47:20.903908 4733 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/938df03e-d0d5-4b93-9a31-061262420f18-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.031733 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xq44c"] Dec 06 05:47:21 crc kubenswrapper[4733]: W1206 05:47:21.037218 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod477154e1_6166_41c9_beb3_1248e1583324.slice/crio-c2c4c68c98b783448c8d6c4ee17aa6f7f37ec76bcd77ec0fbc9c625ca2d8c175 WatchSource:0}: Error finding container c2c4c68c98b783448c8d6c4ee17aa6f7f37ec76bcd77ec0fbc9c625ca2d8c175: Status 404 returned error can't find the container with id c2c4c68c98b783448c8d6c4ee17aa6f7f37ec76bcd77ec0fbc9c625ca2d8c175 Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.230646 4733 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 06 05:47:21 crc kubenswrapper[4733]: E1206 05:47:21.233983 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="938df03e-d0d5-4b93-9a31-061262420f18" containerName="extract-content" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.234051 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="938df03e-d0d5-4b93-9a31-061262420f18" containerName="extract-content" Dec 06 05:47:21 crc kubenswrapper[4733]: E1206 05:47:21.234083 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="139eeaf9-21f6-4032-9b24-73534a803ca5" containerName="extract-utilities" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.234185 4733 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="139eeaf9-21f6-4032-9b24-73534a803ca5" containerName="extract-utilities" Dec 06 05:47:21 crc kubenswrapper[4733]: E1206 05:47:21.234200 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4bfb477-5389-4827-91d8-cfd61ad2d5f8" containerName="extract-utilities" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.234206 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4bfb477-5389-4827-91d8-cfd61ad2d5f8" containerName="extract-utilities" Dec 06 05:47:21 crc kubenswrapper[4733]: E1206 05:47:21.234223 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71e51a04-4769-45b0-87b8-7292977ec73b" containerName="extract-utilities" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.234229 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="71e51a04-4769-45b0-87b8-7292977ec73b" containerName="extract-utilities" Dec 06 05:47:21 crc kubenswrapper[4733]: E1206 05:47:21.234239 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="139eeaf9-21f6-4032-9b24-73534a803ca5" containerName="extract-content" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.234246 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="139eeaf9-21f6-4032-9b24-73534a803ca5" containerName="extract-content" Dec 06 05:47:21 crc kubenswrapper[4733]: E1206 05:47:21.234263 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76185470-be08-49f9-ab30-59314702bc08" containerName="marketplace-operator" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.234270 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="76185470-be08-49f9-ab30-59314702bc08" containerName="marketplace-operator" Dec 06 05:47:21 crc kubenswrapper[4733]: E1206 05:47:21.234279 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4bfb477-5389-4827-91d8-cfd61ad2d5f8" containerName="registry-server" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.234285 4733 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="b4bfb477-5389-4827-91d8-cfd61ad2d5f8" containerName="registry-server" Dec 06 05:47:21 crc kubenswrapper[4733]: E1206 05:47:21.234316 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71e51a04-4769-45b0-87b8-7292977ec73b" containerName="extract-content" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.234323 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="71e51a04-4769-45b0-87b8-7292977ec73b" containerName="extract-content" Dec 06 05:47:21 crc kubenswrapper[4733]: E1206 05:47:21.234330 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71e51a04-4769-45b0-87b8-7292977ec73b" containerName="registry-server" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.234337 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="71e51a04-4769-45b0-87b8-7292977ec73b" containerName="registry-server" Dec 06 05:47:21 crc kubenswrapper[4733]: E1206 05:47:21.234352 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="139eeaf9-21f6-4032-9b24-73534a803ca5" containerName="registry-server" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.234358 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="139eeaf9-21f6-4032-9b24-73534a803ca5" containerName="registry-server" Dec 06 05:47:21 crc kubenswrapper[4733]: E1206 05:47:21.234369 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="938df03e-d0d5-4b93-9a31-061262420f18" containerName="registry-server" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.234374 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="938df03e-d0d5-4b93-9a31-061262420f18" containerName="registry-server" Dec 06 05:47:21 crc kubenswrapper[4733]: E1206 05:47:21.234384 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4bfb477-5389-4827-91d8-cfd61ad2d5f8" containerName="extract-content" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.234514 4733 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="b4bfb477-5389-4827-91d8-cfd61ad2d5f8" containerName="extract-content" Dec 06 05:47:21 crc kubenswrapper[4733]: E1206 05:47:21.234528 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="938df03e-d0d5-4b93-9a31-061262420f18" containerName="extract-utilities" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.234534 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="938df03e-d0d5-4b93-9a31-061262420f18" containerName="extract-utilities" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.235830 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="76185470-be08-49f9-ab30-59314702bc08" containerName="marketplace-operator" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.235908 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="139eeaf9-21f6-4032-9b24-73534a803ca5" containerName="registry-server" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.235923 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4bfb477-5389-4827-91d8-cfd61ad2d5f8" containerName="registry-server" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.235936 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="938df03e-d0d5-4b93-9a31-061262420f18" containerName="registry-server" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.235946 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="71e51a04-4769-45b0-87b8-7292977ec73b" containerName="registry-server" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.236740 4733 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.237460 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.237470 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://32c4d87738481c8df3d76e820a98f3dacfbc11edc26fab1dfe51b56d207168d2" gracePeriod=15 Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.237867 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://801ea1b9ed221d20f0d729436b8f5f1946df6e66f06aa86db5764f18da3f0b1f" gracePeriod=15 Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.237972 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://57cbb938bc4ae9b8a71a1e2369a50a243964fc8c683d2d1840f1f3e199f1b923" gracePeriod=15 Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.238168 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://7eeebbb46cf11d2306ad457106c3b2179039986bfdd412c4bb64791d86edb4e0" gracePeriod=15 Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.238585 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://e8edc1fd8220a58b6a3f6d08d6d003c6d350fa69588866d84de63f95ecd4367f" gracePeriod=15 Dec 06 05:47:21 crc 
kubenswrapper[4733]: I1206 05:47:21.241934 4733 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 06 05:47:21 crc kubenswrapper[4733]: E1206 05:47:21.243541 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.243560 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 06 05:47:21 crc kubenswrapper[4733]: E1206 05:47:21.243579 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.243585 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 06 05:47:21 crc kubenswrapper[4733]: E1206 05:47:21.243816 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.243824 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 06 05:47:21 crc kubenswrapper[4733]: E1206 05:47:21.243832 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.243840 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 06 05:47:21 crc kubenswrapper[4733]: E1206 05:47:21.243882 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 
05:47:21.243888 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 06 05:47:21 crc kubenswrapper[4733]: E1206 05:47:21.243897 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.243905 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.244717 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.245420 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.246205 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.246256 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.246283 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.246330 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 06 05:47:21 crc kubenswrapper[4733]: E1206 05:47:21.247123 4733 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.247144 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 06 05:47:21 crc kubenswrapper[4733]: E1206 05:47:21.280125 4733 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.25.211:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.310223 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.310384 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.310503 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.310591 4733 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.310674 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.310770 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.310844 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.311069 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.412571 4733 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.412648 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.412691 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.412720 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.412758 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.412791 4733 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.412814 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.412843 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.412912 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.412962 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.412991 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.413027 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.413049 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.413074 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.413101 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.413121 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.438696 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-xq44c_477154e1-6166-41c9-beb3-1248e1583324/marketplace-operator/0.log" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.438744 4733 generic.go:334] "Generic (PLEG): container finished" podID="477154e1-6166-41c9-beb3-1248e1583324" containerID="df1cd4c9b9812c3e1e46a2c47fdfcc06252ad771ac3d2d1e61f345fa549c6942" exitCode=1 Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.438822 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xq44c" event={"ID":"477154e1-6166-41c9-beb3-1248e1583324","Type":"ContainerDied","Data":"df1cd4c9b9812c3e1e46a2c47fdfcc06252ad771ac3d2d1e61f345fa549c6942"} Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.438867 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xq44c" event={"ID":"477154e1-6166-41c9-beb3-1248e1583324","Type":"ContainerStarted","Data":"c2c4c68c98b783448c8d6c4ee17aa6f7f37ec76bcd77ec0fbc9c625ca2d8c175"} Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.439278 4733 scope.go:117] "RemoveContainer" containerID="df1cd4c9b9812c3e1e46a2c47fdfcc06252ad771ac3d2d1e61f345fa549c6942" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.440503 4733 status_manager.go:851] "Failed to get status for pod" podUID="477154e1-6166-41c9-beb3-1248e1583324" pod="openshift-marketplace/marketplace-operator-79b997595-xq44c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-xq44c\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.440702 4733 status_manager.go:851] "Failed to get status for 
pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:21 crc kubenswrapper[4733]: E1206 05:47:21.441221 4733 event.go:368] "Unable to write event (may retry after sleeping)" err="Patch \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events/marketplace-operator-79b997595-xq44c.187e8a2f9ac43332\": dial tcp 192.168.25.211:6443: connect: connection refused" event="&Event{ObjectMeta:{marketplace-operator-79b997595-xq44c.187e8a2f9ac43332 openshift-marketplace 29352 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:marketplace-operator-79b997595-xq44c,UID:477154e1-6166-41c9-beb3-1248e1583324,APIVersion:v1,ResourceVersion:29321,FieldPath:spec.containers{marketplace-operator},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-06 05:47:21 +0000 UTC,LastTimestamp:2025-12-06 05:47:21.440580351 +0000 UTC m=+225.305791463,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.443841 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tzdr7" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.445040 4733 status_manager.go:851] "Failed to get status for pod" podUID="477154e1-6166-41c9-beb3-1248e1583324" pod="openshift-marketplace/marketplace-operator-79b997595-xq44c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-xq44c\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.444679 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tzdr7" event={"ID":"139eeaf9-21f6-4032-9b24-73534a803ca5","Type":"ContainerDied","Data":"ea0550ea68b5e6dcde33b35239b23e680a7bcb93d9274d0e8eb04d106478a0d8"} Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.445380 4733 scope.go:117] "RemoveContainer" containerID="b1b08ade88ae154f120a0004457affe6f9f7c2afd960019a49a9127c647ba8ba" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.445390 4733 status_manager.go:851] "Failed to get status for pod" podUID="139eeaf9-21f6-4032-9b24-73534a803ca5" pod="openshift-marketplace/redhat-marketplace-tzdr7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-tzdr7\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.445569 4733 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.448528 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bvjmk" 
event={"ID":"938df03e-d0d5-4b93-9a31-061262420f18","Type":"ContainerDied","Data":"07a1c791b6d356abda526945329581d4524e86d4900eea0983558c6e64756095"} Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.448681 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bvjmk" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.449619 4733 status_manager.go:851] "Failed to get status for pod" podUID="477154e1-6166-41c9-beb3-1248e1583324" pod="openshift-marketplace/marketplace-operator-79b997595-xq44c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-xq44c\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.449828 4733 status_manager.go:851] "Failed to get status for pod" podUID="139eeaf9-21f6-4032-9b24-73534a803ca5" pod="openshift-marketplace/redhat-marketplace-tzdr7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-tzdr7\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.450093 4733 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.450515 4733 status_manager.go:851] "Failed to get status for pod" podUID="938df03e-d0d5-4b93-9a31-061262420f18" pod="openshift-marketplace/redhat-operators-bvjmk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-bvjmk\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 
05:47:21.451256 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-fbnvh" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.451280 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-fbnvh" event={"ID":"76185470-be08-49f9-ab30-59314702bc08","Type":"ContainerDied","Data":"61230c958dda2077fbe1385bc7e57e4f286bde59ae1dac7d2023649977e9a545"} Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.451811 4733 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.452546 4733 status_manager.go:851] "Failed to get status for pod" podUID="938df03e-d0d5-4b93-9a31-061262420f18" pod="openshift-marketplace/redhat-operators-bvjmk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-bvjmk\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.452933 4733 status_manager.go:851] "Failed to get status for pod" podUID="477154e1-6166-41c9-beb3-1248e1583324" pod="openshift-marketplace/marketplace-operator-79b997595-xq44c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-xq44c\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.453246 4733 status_manager.go:851] "Failed to get status for pod" podUID="139eeaf9-21f6-4032-9b24-73534a803ca5" pod="openshift-marketplace/redhat-marketplace-tzdr7" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-tzdr7\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.453728 4733 status_manager.go:851] "Failed to get status for pod" podUID="76185470-be08-49f9-ab30-59314702bc08" pod="openshift-marketplace/marketplace-operator-79b997595-fbnvh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-fbnvh\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.454431 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.455677 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.456559 4733 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="801ea1b9ed221d20f0d729436b8f5f1946df6e66f06aa86db5764f18da3f0b1f" exitCode=0 Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.456587 4733 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e8edc1fd8220a58b6a3f6d08d6d003c6d350fa69588866d84de63f95ecd4367f" exitCode=0 Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.456597 4733 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="57cbb938bc4ae9b8a71a1e2369a50a243964fc8c683d2d1840f1f3e199f1b923" exitCode=0 Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.456607 4733 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="7eeebbb46cf11d2306ad457106c3b2179039986bfdd412c4bb64791d86edb4e0" exitCode=2 Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.459546 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lblzc" event={"ID":"b4bfb477-5389-4827-91d8-cfd61ad2d5f8","Type":"ContainerDied","Data":"d5079c758c9c0a1b0000c3217e8d6ad1459847f8480bd77d83682d52fc15683d"} Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.459827 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lblzc" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.464850 4733 status_manager.go:851] "Failed to get status for pod" podUID="76185470-be08-49f9-ab30-59314702bc08" pod="openshift-marketplace/marketplace-operator-79b997595-fbnvh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-fbnvh\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.465073 4733 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.465344 4733 status_manager.go:851] "Failed to get status for pod" podUID="938df03e-d0d5-4b93-9a31-061262420f18" pod="openshift-marketplace/redhat-operators-bvjmk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-bvjmk\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.465570 4733 status_manager.go:851] "Failed to get status for pod" podUID="b4bfb477-5389-4827-91d8-cfd61ad2d5f8" 
pod="openshift-marketplace/certified-operators-lblzc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-lblzc\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.465752 4733 status_manager.go:851] "Failed to get status for pod" podUID="477154e1-6166-41c9-beb3-1248e1583324" pod="openshift-marketplace/marketplace-operator-79b997595-xq44c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-xq44c\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.465914 4733 status_manager.go:851] "Failed to get status for pod" podUID="139eeaf9-21f6-4032-9b24-73534a803ca5" pod="openshift-marketplace/redhat-marketplace-tzdr7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-tzdr7\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.468676 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jfh8w" event={"ID":"71e51a04-4769-45b0-87b8-7292977ec73b","Type":"ContainerDied","Data":"709fc8cde417e90f16bfe3614835fd59368fb525a520a6bb2d0b1b1c1ba69892"} Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.468860 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jfh8w" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.469493 4733 status_manager.go:851] "Failed to get status for pod" podUID="71e51a04-4769-45b0-87b8-7292977ec73b" pod="openshift-marketplace/community-operators-jfh8w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jfh8w\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.469789 4733 status_manager.go:851] "Failed to get status for pod" podUID="b4bfb477-5389-4827-91d8-cfd61ad2d5f8" pod="openshift-marketplace/certified-operators-lblzc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-lblzc\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.470542 4733 status_manager.go:851] "Failed to get status for pod" podUID="477154e1-6166-41c9-beb3-1248e1583324" pod="openshift-marketplace/marketplace-operator-79b997595-xq44c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-xq44c\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.471190 4733 status_manager.go:851] "Failed to get status for pod" podUID="139eeaf9-21f6-4032-9b24-73534a803ca5" pod="openshift-marketplace/redhat-marketplace-tzdr7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-tzdr7\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.471556 4733 status_manager.go:851] "Failed to get status for pod" podUID="76185470-be08-49f9-ab30-59314702bc08" pod="openshift-marketplace/marketplace-operator-79b997595-fbnvh" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-fbnvh\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.471838 4733 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.472124 4733 status_manager.go:851] "Failed to get status for pod" podUID="938df03e-d0d5-4b93-9a31-061262420f18" pod="openshift-marketplace/redhat-operators-bvjmk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-bvjmk\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.511295 4733 scope.go:117] "RemoveContainer" containerID="99b96d833959bc8fc53eb6cb831d9ca359e10ca3d0e954298ff3cddabe32af9d" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.550508 4733 status_manager.go:851] "Failed to get status for pod" podUID="71e51a04-4769-45b0-87b8-7292977ec73b" pod="openshift-marketplace/community-operators-jfh8w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jfh8w\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.550803 4733 status_manager.go:851] "Failed to get status for pod" podUID="b4bfb477-5389-4827-91d8-cfd61ad2d5f8" pod="openshift-marketplace/certified-operators-lblzc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-lblzc\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.551228 
4733 status_manager.go:851] "Failed to get status for pod" podUID="477154e1-6166-41c9-beb3-1248e1583324" pod="openshift-marketplace/marketplace-operator-79b997595-xq44c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-xq44c\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.551510 4733 status_manager.go:851] "Failed to get status for pod" podUID="139eeaf9-21f6-4032-9b24-73534a803ca5" pod="openshift-marketplace/redhat-marketplace-tzdr7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-tzdr7\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.551773 4733 status_manager.go:851] "Failed to get status for pod" podUID="76185470-be08-49f9-ab30-59314702bc08" pod="openshift-marketplace/marketplace-operator-79b997595-fbnvh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-fbnvh\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.552130 4733 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.552386 4733 status_manager.go:851] "Failed to get status for pod" podUID="938df03e-d0d5-4b93-9a31-061262420f18" pod="openshift-marketplace/redhat-operators-bvjmk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-bvjmk\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:21 crc kubenswrapper[4733]: 
I1206 05:47:21.552682 4733 status_manager.go:851] "Failed to get status for pod" podUID="938df03e-d0d5-4b93-9a31-061262420f18" pod="openshift-marketplace/redhat-operators-bvjmk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-bvjmk\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.552860 4733 status_manager.go:851] "Failed to get status for pod" podUID="71e51a04-4769-45b0-87b8-7292977ec73b" pod="openshift-marketplace/community-operators-jfh8w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jfh8w\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.553038 4733 status_manager.go:851] "Failed to get status for pod" podUID="b4bfb477-5389-4827-91d8-cfd61ad2d5f8" pod="openshift-marketplace/certified-operators-lblzc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-lblzc\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.553361 4733 status_manager.go:851] "Failed to get status for pod" podUID="477154e1-6166-41c9-beb3-1248e1583324" pod="openshift-marketplace/marketplace-operator-79b997595-xq44c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-xq44c\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.553598 4733 status_manager.go:851] "Failed to get status for pod" podUID="139eeaf9-21f6-4032-9b24-73534a803ca5" pod="openshift-marketplace/redhat-marketplace-tzdr7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-tzdr7\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:21 crc 
kubenswrapper[4733]: I1206 05:47:21.553863 4733 status_manager.go:851] "Failed to get status for pod" podUID="76185470-be08-49f9-ab30-59314702bc08" pod="openshift-marketplace/marketplace-operator-79b997595-fbnvh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-fbnvh\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.554103 4733 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.554173 4733 scope.go:117] "RemoveContainer" containerID="6f6dc328fdf6d56cdc5dfdada512ef0e2e95387ddd035d396b6ae5bc23f8af80" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.575881 4733 status_manager.go:851] "Failed to get status for pod" podUID="477154e1-6166-41c9-beb3-1248e1583324" pod="openshift-marketplace/marketplace-operator-79b997595-xq44c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-xq44c\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.576169 4733 status_manager.go:851] "Failed to get status for pod" podUID="139eeaf9-21f6-4032-9b24-73534a803ca5" pod="openshift-marketplace/redhat-marketplace-tzdr7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-tzdr7\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.576404 4733 status_manager.go:851] "Failed to get status for pod" podUID="76185470-be08-49f9-ab30-59314702bc08" 
pod="openshift-marketplace/marketplace-operator-79b997595-fbnvh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-fbnvh\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.576581 4733 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.576804 4733 status_manager.go:851] "Failed to get status for pod" podUID="938df03e-d0d5-4b93-9a31-061262420f18" pod="openshift-marketplace/redhat-operators-bvjmk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-bvjmk\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.577183 4733 status_manager.go:851] "Failed to get status for pod" podUID="71e51a04-4769-45b0-87b8-7292977ec73b" pod="openshift-marketplace/community-operators-jfh8w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jfh8w\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.577583 4733 status_manager.go:851] "Failed to get status for pod" podUID="b4bfb477-5389-4827-91d8-cfd61ad2d5f8" pod="openshift-marketplace/certified-operators-lblzc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-lblzc\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.577948 4733 status_manager.go:851] "Failed to get status for pod" podUID="938df03e-d0d5-4b93-9a31-061262420f18" 
pod="openshift-marketplace/redhat-operators-bvjmk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-bvjmk\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.581469 4733 status_manager.go:851] "Failed to get status for pod" podUID="71e51a04-4769-45b0-87b8-7292977ec73b" pod="openshift-marketplace/community-operators-jfh8w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jfh8w\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.581684 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.581694 4733 status_manager.go:851] "Failed to get status for pod" podUID="b4bfb477-5389-4827-91d8-cfd61ad2d5f8" pod="openshift-marketplace/certified-operators-lblzc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-lblzc\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.581877 4733 status_manager.go:851] "Failed to get status for pod" podUID="477154e1-6166-41c9-beb3-1248e1583324" pod="openshift-marketplace/marketplace-operator-79b997595-xq44c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-xq44c\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.582085 4733 status_manager.go:851] "Failed to get status for pod" podUID="139eeaf9-21f6-4032-9b24-73534a803ca5" pod="openshift-marketplace/redhat-marketplace-tzdr7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-tzdr7\": dial tcp 
192.168.25.211:6443: connect: connection refused" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.582359 4733 status_manager.go:851] "Failed to get status for pod" podUID="76185470-be08-49f9-ab30-59314702bc08" pod="openshift-marketplace/marketplace-operator-79b997595-fbnvh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-fbnvh\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.582704 4733 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.591670 4733 scope.go:117] "RemoveContainer" containerID="0be243c1fa38fcb2b039712c90d832a0632685afac32c9281d832d627166860f" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.604613 4733 scope.go:117] "RemoveContainer" containerID="41ce4cd04f112951baf91e6df3377f73f481e6681d4e6b232af29bce708fad52" Dec 06 05:47:21 crc kubenswrapper[4733]: W1206 05:47:21.614103 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-52fdaebe267dc340a2e1ee5f3bb7b98bcaeb6d8f940ca8a7f2e6f89ef43cc2de WatchSource:0}: Error finding container 52fdaebe267dc340a2e1ee5f3bb7b98bcaeb6d8f940ca8a7f2e6f89ef43cc2de: Status 404 returned error can't find the container with id 52fdaebe267dc340a2e1ee5f3bb7b98bcaeb6d8f940ca8a7f2e6f89ef43cc2de Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.623039 4733 scope.go:117] "RemoveContainer" containerID="a1aefe6069022502fe7dce9396c1bfa0706d1383349f3d35288077a28446b16c" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.637209 4733 
scope.go:117] "RemoveContainer" containerID="ce8efb373b2a9d6229dc40f7cfec2885a85447019407c767a164b7240d6d62dc" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.663986 4733 scope.go:117] "RemoveContainer" containerID="fe65f4b55b8e8ed93d424276f1fc06f31770302538e5122a5b09da36734d86dc" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.689711 4733 scope.go:117] "RemoveContainer" containerID="072a74830979f13631869cb072cf4c3477922a1d292421e95083e90aa5ce3552" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.702408 4733 scope.go:117] "RemoveContainer" containerID="24ef956cf799d8ecb458474e0015925506bdab2f4ca6b7088f4a851418c3bd40" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.719382 4733 scope.go:117] "RemoveContainer" containerID="29fc6177faf3e1fdea9aa9761b1ec80081f7806be6fc060615acca600a2dd1dc" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.734294 4733 scope.go:117] "RemoveContainer" containerID="7a89af00601f2e8eaa39ceb69fc5035df1f45a4bcd8a7c93323a6bb802deeb9e" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.745357 4733 scope.go:117] "RemoveContainer" containerID="089490b3f20e36751dcc011def022641839f2dffca2ab773267a8c05afbf5344" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.762257 4733 scope.go:117] "RemoveContainer" containerID="64eea5f7e3630d6db27474607b2444bdf7ec0a67300a87ca213dd53f06a3a160" Dec 06 05:47:21 crc kubenswrapper[4733]: E1206 05:47:21.890004 4733 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:21 crc kubenswrapper[4733]: E1206 05:47:21.890828 4733 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:21 crc kubenswrapper[4733]: 
E1206 05:47:21.891215 4733 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:21 crc kubenswrapper[4733]: E1206 05:47:21.891684 4733 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:21 crc kubenswrapper[4733]: E1206 05:47:21.892031 4733 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:21 crc kubenswrapper[4733]: I1206 05:47:21.892068 4733 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 06 05:47:21 crc kubenswrapper[4733]: E1206 05:47:21.892364 4733 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.211:6443: connect: connection refused" interval="200ms" Dec 06 05:47:22 crc kubenswrapper[4733]: E1206 05:47:22.093233 4733 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.211:6443: connect: connection refused" interval="400ms" Dec 06 05:47:22 crc kubenswrapper[4733]: I1206 05:47:22.477181 4733 generic.go:334] "Generic (PLEG): container finished" podID="b2eccc5b-1f33-4372-83e4-23e30f607d68" containerID="75359317360fe7229565c60d725c05642aa054c80378eb3f3f3576c05b344b45" exitCode=0 Dec 06 05:47:22 crc 
kubenswrapper[4733]: I1206 05:47:22.477265 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"b2eccc5b-1f33-4372-83e4-23e30f607d68","Type":"ContainerDied","Data":"75359317360fe7229565c60d725c05642aa054c80378eb3f3f3576c05b344b45"} Dec 06 05:47:22 crc kubenswrapper[4733]: I1206 05:47:22.477984 4733 status_manager.go:851] "Failed to get status for pod" podUID="477154e1-6166-41c9-beb3-1248e1583324" pod="openshift-marketplace/marketplace-operator-79b997595-xq44c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-xq44c\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:22 crc kubenswrapper[4733]: I1206 05:47:22.478390 4733 status_manager.go:851] "Failed to get status for pod" podUID="139eeaf9-21f6-4032-9b24-73534a803ca5" pod="openshift-marketplace/redhat-marketplace-tzdr7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-tzdr7\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:22 crc kubenswrapper[4733]: I1206 05:47:22.478692 4733 status_manager.go:851] "Failed to get status for pod" podUID="76185470-be08-49f9-ab30-59314702bc08" pod="openshift-marketplace/marketplace-operator-79b997595-fbnvh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-fbnvh\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:22 crc kubenswrapper[4733]: I1206 05:47:22.478930 4733 status_manager.go:851] "Failed to get status for pod" podUID="b2eccc5b-1f33-4372-83e4-23e30f607d68" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:22 crc kubenswrapper[4733]: I1206 05:47:22.479219 4733 
status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:22 crc kubenswrapper[4733]: I1206 05:47:22.479543 4733 status_manager.go:851] "Failed to get status for pod" podUID="938df03e-d0d5-4b93-9a31-061262420f18" pod="openshift-marketplace/redhat-operators-bvjmk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-bvjmk\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:22 crc kubenswrapper[4733]: I1206 05:47:22.479739 4733 status_manager.go:851] "Failed to get status for pod" podUID="71e51a04-4769-45b0-87b8-7292977ec73b" pod="openshift-marketplace/community-operators-jfh8w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jfh8w\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:22 crc kubenswrapper[4733]: I1206 05:47:22.479970 4733 status_manager.go:851] "Failed to get status for pod" podUID="b4bfb477-5389-4827-91d8-cfd61ad2d5f8" pod="openshift-marketplace/certified-operators-lblzc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-lblzc\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:22 crc kubenswrapper[4733]: I1206 05:47:22.480781 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 06 05:47:22 crc kubenswrapper[4733]: I1206 05:47:22.482546 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"7c302e38a634570988acc4b3ff2cece0ef3b37b38b5365f441d438f1590856b4"} Dec 06 05:47:22 crc kubenswrapper[4733]: I1206 05:47:22.482606 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"52fdaebe267dc340a2e1ee5f3bb7b98bcaeb6d8f940ca8a7f2e6f89ef43cc2de"} Dec 06 05:47:22 crc kubenswrapper[4733]: I1206 05:47:22.483109 4733 status_manager.go:851] "Failed to get status for pod" podUID="938df03e-d0d5-4b93-9a31-061262420f18" pod="openshift-marketplace/redhat-operators-bvjmk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-bvjmk\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:22 crc kubenswrapper[4733]: E1206 05:47:22.483112 4733 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.25.211:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 05:47:22 crc kubenswrapper[4733]: I1206 05:47:22.483352 4733 status_manager.go:851] "Failed to get status for pod" podUID="71e51a04-4769-45b0-87b8-7292977ec73b" pod="openshift-marketplace/community-operators-jfh8w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jfh8w\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:22 crc kubenswrapper[4733]: I1206 05:47:22.483574 4733 status_manager.go:851] "Failed to get status for pod" podUID="b4bfb477-5389-4827-91d8-cfd61ad2d5f8" pod="openshift-marketplace/certified-operators-lblzc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-lblzc\": dial tcp 192.168.25.211:6443: connect: 
connection refused" Dec 06 05:47:22 crc kubenswrapper[4733]: I1206 05:47:22.483855 4733 status_manager.go:851] "Failed to get status for pod" podUID="477154e1-6166-41c9-beb3-1248e1583324" pod="openshift-marketplace/marketplace-operator-79b997595-xq44c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-xq44c\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:22 crc kubenswrapper[4733]: I1206 05:47:22.484249 4733 status_manager.go:851] "Failed to get status for pod" podUID="139eeaf9-21f6-4032-9b24-73534a803ca5" pod="openshift-marketplace/redhat-marketplace-tzdr7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-tzdr7\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:22 crc kubenswrapper[4733]: I1206 05:47:22.484781 4733 status_manager.go:851] "Failed to get status for pod" podUID="76185470-be08-49f9-ab30-59314702bc08" pod="openshift-marketplace/marketplace-operator-79b997595-fbnvh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-fbnvh\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:22 crc kubenswrapper[4733]: I1206 05:47:22.485048 4733 status_manager.go:851] "Failed to get status for pod" podUID="b2eccc5b-1f33-4372-83e4-23e30f607d68" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:22 crc kubenswrapper[4733]: I1206 05:47:22.485345 4733 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 
192.168.25.211:6443: connect: connection refused" Dec 06 05:47:22 crc kubenswrapper[4733]: I1206 05:47:22.485501 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-xq44c_477154e1-6166-41c9-beb3-1248e1583324/marketplace-operator/1.log" Dec 06 05:47:22 crc kubenswrapper[4733]: I1206 05:47:22.486129 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-xq44c_477154e1-6166-41c9-beb3-1248e1583324/marketplace-operator/0.log" Dec 06 05:47:22 crc kubenswrapper[4733]: I1206 05:47:22.486188 4733 generic.go:334] "Generic (PLEG): container finished" podID="477154e1-6166-41c9-beb3-1248e1583324" containerID="1bf253de2c04cf844be0e80a846d0dc8382e5c655d8888cdb560bdeaa9fe4b75" exitCode=1 Dec 06 05:47:22 crc kubenswrapper[4733]: I1206 05:47:22.486567 4733 status_manager.go:851] "Failed to get status for pod" podUID="477154e1-6166-41c9-beb3-1248e1583324" pod="openshift-marketplace/marketplace-operator-79b997595-xq44c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-xq44c\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:22 crc kubenswrapper[4733]: I1206 05:47:22.486700 4733 scope.go:117] "RemoveContainer" containerID="1bf253de2c04cf844be0e80a846d0dc8382e5c655d8888cdb560bdeaa9fe4b75" Dec 06 05:47:22 crc kubenswrapper[4733]: E1206 05:47:22.486905 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=marketplace-operator pod=marketplace-operator-79b997595-xq44c_openshift-marketplace(477154e1-6166-41c9-beb3-1248e1583324)\"" pod="openshift-marketplace/marketplace-operator-79b997595-xq44c" podUID="477154e1-6166-41c9-beb3-1248e1583324" Dec 06 05:47:22 crc kubenswrapper[4733]: I1206 05:47:22.486911 4733 status_manager.go:851] "Failed to get 
status for pod" podUID="139eeaf9-21f6-4032-9b24-73534a803ca5" pod="openshift-marketplace/redhat-marketplace-tzdr7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-tzdr7\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:22 crc kubenswrapper[4733]: I1206 05:47:22.487293 4733 status_manager.go:851] "Failed to get status for pod" podUID="76185470-be08-49f9-ab30-59314702bc08" pod="openshift-marketplace/marketplace-operator-79b997595-fbnvh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-fbnvh\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:22 crc kubenswrapper[4733]: I1206 05:47:22.487540 4733 status_manager.go:851] "Failed to get status for pod" podUID="b2eccc5b-1f33-4372-83e4-23e30f607d68" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:22 crc kubenswrapper[4733]: I1206 05:47:22.487809 4733 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:22 crc kubenswrapper[4733]: I1206 05:47:22.488129 4733 status_manager.go:851] "Failed to get status for pod" podUID="938df03e-d0d5-4b93-9a31-061262420f18" pod="openshift-marketplace/redhat-operators-bvjmk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-bvjmk\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:22 crc kubenswrapper[4733]: I1206 05:47:22.488440 4733 status_manager.go:851] "Failed to get status for pod" 
podUID="71e51a04-4769-45b0-87b8-7292977ec73b" pod="openshift-marketplace/community-operators-jfh8w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jfh8w\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:22 crc kubenswrapper[4733]: I1206 05:47:22.488819 4733 status_manager.go:851] "Failed to get status for pod" podUID="b4bfb477-5389-4827-91d8-cfd61ad2d5f8" pod="openshift-marketplace/certified-operators-lblzc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-lblzc\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:22 crc kubenswrapper[4733]: I1206 05:47:22.491107 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xq44c" event={"ID":"477154e1-6166-41c9-beb3-1248e1583324","Type":"ContainerDied","Data":"1bf253de2c04cf844be0e80a846d0dc8382e5c655d8888cdb560bdeaa9fe4b75"} Dec 06 05:47:22 crc kubenswrapper[4733]: I1206 05:47:22.491183 4733 scope.go:117] "RemoveContainer" containerID="df1cd4c9b9812c3e1e46a2c47fdfcc06252ad771ac3d2d1e61f345fa549c6942" Dec 06 05:47:22 crc kubenswrapper[4733]: E1206 05:47:22.493890 4733 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.211:6443: connect: connection refused" interval="800ms" Dec 06 05:47:23 crc kubenswrapper[4733]: E1206 05:47:23.294748 4733 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.211:6443: connect: connection refused" interval="1.6s" Dec 06 05:47:23 crc kubenswrapper[4733]: I1206 05:47:23.496767 4733 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-xq44c_477154e1-6166-41c9-beb3-1248e1583324/marketplace-operator/1.log" Dec 06 05:47:23 crc kubenswrapper[4733]: I1206 05:47:23.497184 4733 scope.go:117] "RemoveContainer" containerID="1bf253de2c04cf844be0e80a846d0dc8382e5c655d8888cdb560bdeaa9fe4b75" Dec 06 05:47:23 crc kubenswrapper[4733]: E1206 05:47:23.497365 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=marketplace-operator pod=marketplace-operator-79b997595-xq44c_openshift-marketplace(477154e1-6166-41c9-beb3-1248e1583324)\"" pod="openshift-marketplace/marketplace-operator-79b997595-xq44c" podUID="477154e1-6166-41c9-beb3-1248e1583324" Dec 06 05:47:23 crc kubenswrapper[4733]: I1206 05:47:23.497851 4733 status_manager.go:851] "Failed to get status for pod" podUID="71e51a04-4769-45b0-87b8-7292977ec73b" pod="openshift-marketplace/community-operators-jfh8w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jfh8w\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:23 crc kubenswrapper[4733]: I1206 05:47:23.498337 4733 status_manager.go:851] "Failed to get status for pod" podUID="b4bfb477-5389-4827-91d8-cfd61ad2d5f8" pod="openshift-marketplace/certified-operators-lblzc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-lblzc\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:23 crc kubenswrapper[4733]: I1206 05:47:23.498563 4733 status_manager.go:851] "Failed to get status for pod" podUID="477154e1-6166-41c9-beb3-1248e1583324" pod="openshift-marketplace/marketplace-operator-79b997595-xq44c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-xq44c\": dial tcp 192.168.25.211:6443: 
connect: connection refused" Dec 06 05:47:23 crc kubenswrapper[4733]: I1206 05:47:23.498819 4733 status_manager.go:851] "Failed to get status for pod" podUID="139eeaf9-21f6-4032-9b24-73534a803ca5" pod="openshift-marketplace/redhat-marketplace-tzdr7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-tzdr7\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:23 crc kubenswrapper[4733]: I1206 05:47:23.499076 4733 status_manager.go:851] "Failed to get status for pod" podUID="76185470-be08-49f9-ab30-59314702bc08" pod="openshift-marketplace/marketplace-operator-79b997595-fbnvh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-fbnvh\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:23 crc kubenswrapper[4733]: I1206 05:47:23.499274 4733 status_manager.go:851] "Failed to get status for pod" podUID="b2eccc5b-1f33-4372-83e4-23e30f607d68" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:23 crc kubenswrapper[4733]: I1206 05:47:23.499542 4733 status_manager.go:851] "Failed to get status for pod" podUID="938df03e-d0d5-4b93-9a31-061262420f18" pod="openshift-marketplace/redhat-operators-bvjmk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-bvjmk\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:23 crc kubenswrapper[4733]: I1206 05:47:23.500546 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 06 05:47:23 crc kubenswrapper[4733]: I1206 05:47:23.501204 4733 generic.go:334] "Generic (PLEG): container finished" 
podID="f4b27818a5e8e43d0dc095d08835c792" containerID="32c4d87738481c8df3d76e820a98f3dacfbc11edc26fab1dfe51b56d207168d2" exitCode=0 Dec 06 05:47:23 crc kubenswrapper[4733]: I1206 05:47:23.501282 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df0364d45e97f68a4fad130e53ad9f658552a5fc212d58c09704ee13d0ca2726" Dec 06 05:47:23 crc kubenswrapper[4733]: I1206 05:47:23.542683 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 06 05:47:23 crc kubenswrapper[4733]: I1206 05:47:23.543649 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 05:47:23 crc kubenswrapper[4733]: I1206 05:47:23.544165 4733 status_manager.go:851] "Failed to get status for pod" podUID="139eeaf9-21f6-4032-9b24-73534a803ca5" pod="openshift-marketplace/redhat-marketplace-tzdr7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-tzdr7\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:23 crc kubenswrapper[4733]: I1206 05:47:23.544495 4733 status_manager.go:851] "Failed to get status for pod" podUID="76185470-be08-49f9-ab30-59314702bc08" pod="openshift-marketplace/marketplace-operator-79b997595-fbnvh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-fbnvh\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:23 crc kubenswrapper[4733]: I1206 05:47:23.544846 4733 status_manager.go:851] "Failed to get status for pod" podUID="b2eccc5b-1f33-4372-83e4-23e30f607d68" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 
05:47:23 crc kubenswrapper[4733]: I1206 05:47:23.545131 4733 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:23 crc kubenswrapper[4733]: I1206 05:47:23.545366 4733 status_manager.go:851] "Failed to get status for pod" podUID="938df03e-d0d5-4b93-9a31-061262420f18" pod="openshift-marketplace/redhat-operators-bvjmk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-bvjmk\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:23 crc kubenswrapper[4733]: I1206 05:47:23.545627 4733 status_manager.go:851] "Failed to get status for pod" podUID="71e51a04-4769-45b0-87b8-7292977ec73b" pod="openshift-marketplace/community-operators-jfh8w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jfh8w\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:23 crc kubenswrapper[4733]: I1206 05:47:23.545915 4733 status_manager.go:851] "Failed to get status for pod" podUID="b4bfb477-5389-4827-91d8-cfd61ad2d5f8" pod="openshift-marketplace/certified-operators-lblzc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-lblzc\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:23 crc kubenswrapper[4733]: I1206 05:47:23.546231 4733 status_manager.go:851] "Failed to get status for pod" podUID="477154e1-6166-41c9-beb3-1248e1583324" pod="openshift-marketplace/marketplace-operator-79b997595-xq44c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-xq44c\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 
05:47:23 crc kubenswrapper[4733]: I1206 05:47:23.656798 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 06 05:47:23 crc kubenswrapper[4733]: I1206 05:47:23.657333 4733 status_manager.go:851] "Failed to get status for pod" podUID="938df03e-d0d5-4b93-9a31-061262420f18" pod="openshift-marketplace/redhat-operators-bvjmk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-bvjmk\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:23 crc kubenswrapper[4733]: I1206 05:47:23.657806 4733 status_manager.go:851] "Failed to get status for pod" podUID="71e51a04-4769-45b0-87b8-7292977ec73b" pod="openshift-marketplace/community-operators-jfh8w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jfh8w\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:23 crc kubenswrapper[4733]: I1206 05:47:23.658231 4733 status_manager.go:851] "Failed to get status for pod" podUID="b4bfb477-5389-4827-91d8-cfd61ad2d5f8" pod="openshift-marketplace/certified-operators-lblzc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-lblzc\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:23 crc kubenswrapper[4733]: I1206 05:47:23.658510 4733 status_manager.go:851] "Failed to get status for pod" podUID="477154e1-6166-41c9-beb3-1248e1583324" pod="openshift-marketplace/marketplace-operator-79b997595-xq44c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-xq44c\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:23 crc kubenswrapper[4733]: I1206 05:47:23.658785 4733 status_manager.go:851] "Failed to get status for pod" podUID="139eeaf9-21f6-4032-9b24-73534a803ca5" 
pod="openshift-marketplace/redhat-marketplace-tzdr7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-tzdr7\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:23 crc kubenswrapper[4733]: I1206 05:47:23.659126 4733 status_manager.go:851] "Failed to get status for pod" podUID="76185470-be08-49f9-ab30-59314702bc08" pod="openshift-marketplace/marketplace-operator-79b997595-fbnvh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-fbnvh\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:23 crc kubenswrapper[4733]: I1206 05:47:23.659464 4733 status_manager.go:851] "Failed to get status for pod" podUID="b2eccc5b-1f33-4372-83e4-23e30f607d68" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:23 crc kubenswrapper[4733]: I1206 05:47:23.659862 4733 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:23 crc kubenswrapper[4733]: I1206 05:47:23.744990 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 06 05:47:23 crc kubenswrapper[4733]: I1206 05:47:23.745059 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 06 05:47:23 crc kubenswrapper[4733]: I1206 05:47:23.745153 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 06 05:47:23 crc kubenswrapper[4733]: I1206 05:47:23.745216 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 05:47:23 crc kubenswrapper[4733]: I1206 05:47:23.745255 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 05:47:23 crc kubenswrapper[4733]: I1206 05:47:23.745283 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 05:47:23 crc kubenswrapper[4733]: I1206 05:47:23.745437 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b2eccc5b-1f33-4372-83e4-23e30f607d68-var-lock\") pod \"b2eccc5b-1f33-4372-83e4-23e30f607d68\" (UID: \"b2eccc5b-1f33-4372-83e4-23e30f607d68\") " Dec 06 05:47:23 crc kubenswrapper[4733]: I1206 05:47:23.745548 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b2eccc5b-1f33-4372-83e4-23e30f607d68-kube-api-access\") pod \"b2eccc5b-1f33-4372-83e4-23e30f607d68\" (UID: \"b2eccc5b-1f33-4372-83e4-23e30f607d68\") " Dec 06 05:47:23 crc kubenswrapper[4733]: I1206 05:47:23.746859 4733 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 06 05:47:23 crc kubenswrapper[4733]: I1206 05:47:23.746879 4733 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 06 05:47:23 crc kubenswrapper[4733]: I1206 05:47:23.746915 4733 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Dec 06 05:47:23 crc kubenswrapper[4733]: I1206 05:47:23.752907 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2eccc5b-1f33-4372-83e4-23e30f607d68-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "b2eccc5b-1f33-4372-83e4-23e30f607d68" (UID: "b2eccc5b-1f33-4372-83e4-23e30f607d68"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:47:23 crc kubenswrapper[4733]: I1206 05:47:23.752971 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b2eccc5b-1f33-4372-83e4-23e30f607d68-var-lock" (OuterVolumeSpecName: "var-lock") pod "b2eccc5b-1f33-4372-83e4-23e30f607d68" (UID: "b2eccc5b-1f33-4372-83e4-23e30f607d68"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 05:47:23 crc kubenswrapper[4733]: I1206 05:47:23.848146 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b2eccc5b-1f33-4372-83e4-23e30f607d68-kubelet-dir\") pod \"b2eccc5b-1f33-4372-83e4-23e30f607d68\" (UID: \"b2eccc5b-1f33-4372-83e4-23e30f607d68\") " Dec 06 05:47:23 crc kubenswrapper[4733]: I1206 05:47:23.848284 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b2eccc5b-1f33-4372-83e4-23e30f607d68-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "b2eccc5b-1f33-4372-83e4-23e30f607d68" (UID: "b2eccc5b-1f33-4372-83e4-23e30f607d68"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 05:47:23 crc kubenswrapper[4733]: I1206 05:47:23.848528 4733 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b2eccc5b-1f33-4372-83e4-23e30f607d68-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 06 05:47:23 crc kubenswrapper[4733]: I1206 05:47:23.848547 4733 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b2eccc5b-1f33-4372-83e4-23e30f607d68-var-lock\") on node \"crc\" DevicePath \"\"" Dec 06 05:47:23 crc kubenswrapper[4733]: I1206 05:47:23.848556 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b2eccc5b-1f33-4372-83e4-23e30f607d68-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 06 05:47:24 crc kubenswrapper[4733]: I1206 05:47:24.493934 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Dec 06 05:47:24 crc kubenswrapper[4733]: I1206 05:47:24.507829 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 06 05:47:24 crc kubenswrapper[4733]: I1206 05:47:24.507808 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"b2eccc5b-1f33-4372-83e4-23e30f607d68","Type":"ContainerDied","Data":"73b5dd3333e3d1170bcadaadb826142467620b7e4d546b5309150125a8521d15"} Dec 06 05:47:24 crc kubenswrapper[4733]: I1206 05:47:24.507861 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 05:47:24 crc kubenswrapper[4733]: I1206 05:47:24.507884 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73b5dd3333e3d1170bcadaadb826142467620b7e4d546b5309150125a8521d15" Dec 06 05:47:24 crc kubenswrapper[4733]: I1206 05:47:24.508622 4733 status_manager.go:851] "Failed to get status for pod" podUID="477154e1-6166-41c9-beb3-1248e1583324" pod="openshift-marketplace/marketplace-operator-79b997595-xq44c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-xq44c\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:24 crc kubenswrapper[4733]: I1206 05:47:24.509144 4733 status_manager.go:851] "Failed to get status for pod" podUID="139eeaf9-21f6-4032-9b24-73534a803ca5" pod="openshift-marketplace/redhat-marketplace-tzdr7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-tzdr7\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:24 crc kubenswrapper[4733]: I1206 05:47:24.509380 4733 status_manager.go:851] "Failed to get status for pod" podUID="76185470-be08-49f9-ab30-59314702bc08" pod="openshift-marketplace/marketplace-operator-79b997595-fbnvh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-fbnvh\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:24 crc kubenswrapper[4733]: I1206 05:47:24.509974 4733 status_manager.go:851] "Failed to get status for pod" podUID="b2eccc5b-1f33-4372-83e4-23e30f607d68" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:24 crc kubenswrapper[4733]: I1206 05:47:24.511120 4733 
status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:24 crc kubenswrapper[4733]: I1206 05:47:24.511410 4733 status_manager.go:851] "Failed to get status for pod" podUID="938df03e-d0d5-4b93-9a31-061262420f18" pod="openshift-marketplace/redhat-operators-bvjmk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-bvjmk\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:24 crc kubenswrapper[4733]: I1206 05:47:24.511664 4733 status_manager.go:851] "Failed to get status for pod" podUID="71e51a04-4769-45b0-87b8-7292977ec73b" pod="openshift-marketplace/community-operators-jfh8w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jfh8w\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:24 crc kubenswrapper[4733]: I1206 05:47:24.511969 4733 status_manager.go:851] "Failed to get status for pod" podUID="b4bfb477-5389-4827-91d8-cfd61ad2d5f8" pod="openshift-marketplace/certified-operators-lblzc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-lblzc\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:24 crc kubenswrapper[4733]: I1206 05:47:24.521154 4733 status_manager.go:851] "Failed to get status for pod" podUID="477154e1-6166-41c9-beb3-1248e1583324" pod="openshift-marketplace/marketplace-operator-79b997595-xq44c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-xq44c\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:24 crc kubenswrapper[4733]: I1206 05:47:24.521522 4733 
status_manager.go:851] "Failed to get status for pod" podUID="139eeaf9-21f6-4032-9b24-73534a803ca5" pod="openshift-marketplace/redhat-marketplace-tzdr7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-tzdr7\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:24 crc kubenswrapper[4733]: I1206 05:47:24.522138 4733 status_manager.go:851] "Failed to get status for pod" podUID="76185470-be08-49f9-ab30-59314702bc08" pod="openshift-marketplace/marketplace-operator-79b997595-fbnvh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-fbnvh\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:24 crc kubenswrapper[4733]: I1206 05:47:24.522445 4733 status_manager.go:851] "Failed to get status for pod" podUID="b2eccc5b-1f33-4372-83e4-23e30f607d68" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:24 crc kubenswrapper[4733]: I1206 05:47:24.522711 4733 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:24 crc kubenswrapper[4733]: I1206 05:47:24.523030 4733 status_manager.go:851] "Failed to get status for pod" podUID="938df03e-d0d5-4b93-9a31-061262420f18" pod="openshift-marketplace/redhat-operators-bvjmk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-bvjmk\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:24 crc kubenswrapper[4733]: I1206 05:47:24.523406 4733 
status_manager.go:851] "Failed to get status for pod" podUID="71e51a04-4769-45b0-87b8-7292977ec73b" pod="openshift-marketplace/community-operators-jfh8w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jfh8w\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:24 crc kubenswrapper[4733]: I1206 05:47:24.523660 4733 status_manager.go:851] "Failed to get status for pod" podUID="b4bfb477-5389-4827-91d8-cfd61ad2d5f8" pod="openshift-marketplace/certified-operators-lblzc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-lblzc\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:24 crc kubenswrapper[4733]: I1206 05:47:24.524052 4733 status_manager.go:851] "Failed to get status for pod" podUID="76185470-be08-49f9-ab30-59314702bc08" pod="openshift-marketplace/marketplace-operator-79b997595-fbnvh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-fbnvh\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:24 crc kubenswrapper[4733]: I1206 05:47:24.524489 4733 status_manager.go:851] "Failed to get status for pod" podUID="b2eccc5b-1f33-4372-83e4-23e30f607d68" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:24 crc kubenswrapper[4733]: I1206 05:47:24.524747 4733 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:24 crc kubenswrapper[4733]: I1206 05:47:24.524925 4733 
status_manager.go:851] "Failed to get status for pod" podUID="938df03e-d0d5-4b93-9a31-061262420f18" pod="openshift-marketplace/redhat-operators-bvjmk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-bvjmk\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:24 crc kubenswrapper[4733]: I1206 05:47:24.525101 4733 status_manager.go:851] "Failed to get status for pod" podUID="71e51a04-4769-45b0-87b8-7292977ec73b" pod="openshift-marketplace/community-operators-jfh8w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jfh8w\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:24 crc kubenswrapper[4733]: I1206 05:47:24.525357 4733 status_manager.go:851] "Failed to get status for pod" podUID="b4bfb477-5389-4827-91d8-cfd61ad2d5f8" pod="openshift-marketplace/certified-operators-lblzc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-lblzc\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:24 crc kubenswrapper[4733]: I1206 05:47:24.525715 4733 status_manager.go:851] "Failed to get status for pod" podUID="477154e1-6166-41c9-beb3-1248e1583324" pod="openshift-marketplace/marketplace-operator-79b997595-xq44c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-xq44c\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:24 crc kubenswrapper[4733]: I1206 05:47:24.526015 4733 status_manager.go:851] "Failed to get status for pod" podUID="139eeaf9-21f6-4032-9b24-73534a803ca5" pod="openshift-marketplace/redhat-marketplace-tzdr7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-tzdr7\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:24 crc kubenswrapper[4733]: E1206 
05:47:24.895552 4733 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.211:6443: connect: connection refused" interval="3.2s" Dec 06 05:47:26 crc kubenswrapper[4733]: I1206 05:47:26.486419 4733 status_manager.go:851] "Failed to get status for pod" podUID="477154e1-6166-41c9-beb3-1248e1583324" pod="openshift-marketplace/marketplace-operator-79b997595-xq44c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-xq44c\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:26 crc kubenswrapper[4733]: I1206 05:47:26.486719 4733 status_manager.go:851] "Failed to get status for pod" podUID="139eeaf9-21f6-4032-9b24-73534a803ca5" pod="openshift-marketplace/redhat-marketplace-tzdr7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-tzdr7\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:26 crc kubenswrapper[4733]: I1206 05:47:26.486994 4733 status_manager.go:851] "Failed to get status for pod" podUID="76185470-be08-49f9-ab30-59314702bc08" pod="openshift-marketplace/marketplace-operator-79b997595-fbnvh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-fbnvh\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:26 crc kubenswrapper[4733]: I1206 05:47:26.487265 4733 status_manager.go:851] "Failed to get status for pod" podUID="b2eccc5b-1f33-4372-83e4-23e30f607d68" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:26 crc kubenswrapper[4733]: I1206 05:47:26.487522 4733 status_manager.go:851] 
"Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:26 crc kubenswrapper[4733]: I1206 05:47:26.487769 4733 status_manager.go:851] "Failed to get status for pod" podUID="938df03e-d0d5-4b93-9a31-061262420f18" pod="openshift-marketplace/redhat-operators-bvjmk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-bvjmk\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:26 crc kubenswrapper[4733]: I1206 05:47:26.487961 4733 status_manager.go:851] "Failed to get status for pod" podUID="71e51a04-4769-45b0-87b8-7292977ec73b" pod="openshift-marketplace/community-operators-jfh8w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jfh8w\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:26 crc kubenswrapper[4733]: I1206 05:47:26.488187 4733 status_manager.go:851] "Failed to get status for pod" podUID="b4bfb477-5389-4827-91d8-cfd61ad2d5f8" pod="openshift-marketplace/certified-operators-lblzc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-lblzc\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:26 crc kubenswrapper[4733]: E1206 05:47:26.883338 4733 event.go:368] "Unable to write event (may retry after sleeping)" err="Patch \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events/marketplace-operator-79b997595-xq44c.187e8a2f9ac43332\": dial tcp 192.168.25.211:6443: connect: connection refused" event="&Event{ObjectMeta:{marketplace-operator-79b997595-xq44c.187e8a2f9ac43332 openshift-marketplace 29352 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:marketplace-operator-79b997595-xq44c,UID:477154e1-6166-41c9-beb3-1248e1583324,APIVersion:v1,ResourceVersion:29321,FieldPath:spec.containers{marketplace-operator},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-06 05:47:21 +0000 UTC,LastTimestamp:2025-12-06 05:47:21.440580351 +0000 UTC m=+225.305791463,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 06 05:47:28 crc kubenswrapper[4733]: E1206 05:47:28.096933 4733 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.211:6443: connect: connection refused" interval="6.4s" Dec 06 05:47:30 crc kubenswrapper[4733]: I1206 05:47:30.580243 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-xq44c" Dec 06 05:47:30 crc kubenswrapper[4733]: I1206 05:47:30.580624 4733 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-marketplace/marketplace-operator-79b997595-xq44c" Dec 06 05:47:30 crc kubenswrapper[4733]: I1206 05:47:30.581273 4733 scope.go:117] "RemoveContainer" containerID="1bf253de2c04cf844be0e80a846d0dc8382e5c655d8888cdb560bdeaa9fe4b75" Dec 06 05:47:30 crc kubenswrapper[4733]: E1206 05:47:30.581597 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=marketplace-operator 
pod=marketplace-operator-79b997595-xq44c_openshift-marketplace(477154e1-6166-41c9-beb3-1248e1583324)\"" pod="openshift-marketplace/marketplace-operator-79b997595-xq44c" podUID="477154e1-6166-41c9-beb3-1248e1583324" Dec 06 05:47:33 crc kubenswrapper[4733]: I1206 05:47:33.556858 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 06 05:47:33 crc kubenswrapper[4733]: I1206 05:47:33.557197 4733 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="6cd2bcad3ce23a8998a578ecc373a4e8028eefab1e056cf1081eb2406ff9398f" exitCode=1 Dec 06 05:47:33 crc kubenswrapper[4733]: I1206 05:47:33.557256 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"6cd2bcad3ce23a8998a578ecc373a4e8028eefab1e056cf1081eb2406ff9398f"} Dec 06 05:47:33 crc kubenswrapper[4733]: I1206 05:47:33.557768 4733 scope.go:117] "RemoveContainer" containerID="6cd2bcad3ce23a8998a578ecc373a4e8028eefab1e056cf1081eb2406ff9398f" Dec 06 05:47:33 crc kubenswrapper[4733]: I1206 05:47:33.557954 4733 status_manager.go:851] "Failed to get status for pod" podUID="938df03e-d0d5-4b93-9a31-061262420f18" pod="openshift-marketplace/redhat-operators-bvjmk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-bvjmk\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:33 crc kubenswrapper[4733]: I1206 05:47:33.559465 4733 status_manager.go:851] "Failed to get status for pod" podUID="71e51a04-4769-45b0-87b8-7292977ec73b" pod="openshift-marketplace/community-operators-jfh8w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jfh8w\": dial tcp 192.168.25.211:6443: 
connect: connection refused" Dec 06 05:47:33 crc kubenswrapper[4733]: I1206 05:47:33.559819 4733 status_manager.go:851] "Failed to get status for pod" podUID="b4bfb477-5389-4827-91d8-cfd61ad2d5f8" pod="openshift-marketplace/certified-operators-lblzc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-lblzc\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:33 crc kubenswrapper[4733]: I1206 05:47:33.560204 4733 status_manager.go:851] "Failed to get status for pod" podUID="477154e1-6166-41c9-beb3-1248e1583324" pod="openshift-marketplace/marketplace-operator-79b997595-xq44c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-xq44c\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:33 crc kubenswrapper[4733]: I1206 05:47:33.560491 4733 status_manager.go:851] "Failed to get status for pod" podUID="139eeaf9-21f6-4032-9b24-73534a803ca5" pod="openshift-marketplace/redhat-marketplace-tzdr7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-tzdr7\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:33 crc kubenswrapper[4733]: I1206 05:47:33.560871 4733 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:33 crc kubenswrapper[4733]: I1206 05:47:33.561221 4733 status_manager.go:851] "Failed to get status for pod" podUID="76185470-be08-49f9-ab30-59314702bc08" pod="openshift-marketplace/marketplace-operator-79b997595-fbnvh" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-fbnvh\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:33 crc kubenswrapper[4733]: I1206 05:47:33.561525 4733 status_manager.go:851] "Failed to get status for pod" podUID="b2eccc5b-1f33-4372-83e4-23e30f607d68" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:34 crc kubenswrapper[4733]: I1206 05:47:34.056168 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 05:47:34 crc kubenswrapper[4733]: E1206 05:47:34.497886 4733 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.211:6443: connect: connection refused" interval="7s" Dec 06 05:47:34 crc kubenswrapper[4733]: I1206 05:47:34.566491 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 06 05:47:34 crc kubenswrapper[4733]: I1206 05:47:34.566563 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5a7e51840fecb507bf3ed95096faf846ea4b5de8b8b450ed7a41801a846d477e"} Dec 06 05:47:34 crc kubenswrapper[4733]: I1206 05:47:34.567263 4733 status_manager.go:851] "Failed to get status for pod" podUID="938df03e-d0d5-4b93-9a31-061262420f18" pod="openshift-marketplace/redhat-operators-bvjmk" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-bvjmk\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:34 crc kubenswrapper[4733]: I1206 05:47:34.567726 4733 status_manager.go:851] "Failed to get status for pod" podUID="71e51a04-4769-45b0-87b8-7292977ec73b" pod="openshift-marketplace/community-operators-jfh8w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jfh8w\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:34 crc kubenswrapper[4733]: I1206 05:47:34.568091 4733 status_manager.go:851] "Failed to get status for pod" podUID="b4bfb477-5389-4827-91d8-cfd61ad2d5f8" pod="openshift-marketplace/certified-operators-lblzc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-lblzc\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:34 crc kubenswrapper[4733]: I1206 05:47:34.568402 4733 status_manager.go:851] "Failed to get status for pod" podUID="477154e1-6166-41c9-beb3-1248e1583324" pod="openshift-marketplace/marketplace-operator-79b997595-xq44c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-xq44c\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:34 crc kubenswrapper[4733]: I1206 05:47:34.568673 4733 status_manager.go:851] "Failed to get status for pod" podUID="139eeaf9-21f6-4032-9b24-73534a803ca5" pod="openshift-marketplace/redhat-marketplace-tzdr7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-tzdr7\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:34 crc kubenswrapper[4733]: I1206 05:47:34.568970 4733 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:34 crc kubenswrapper[4733]: I1206 05:47:34.569263 4733 status_manager.go:851] "Failed to get status for pod" podUID="76185470-be08-49f9-ab30-59314702bc08" pod="openshift-marketplace/marketplace-operator-79b997595-fbnvh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-fbnvh\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:34 crc kubenswrapper[4733]: I1206 05:47:34.569512 4733 status_manager.go:851] "Failed to get status for pod" podUID="b2eccc5b-1f33-4372-83e4-23e30f607d68" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:36 crc kubenswrapper[4733]: I1206 05:47:36.484467 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 05:47:36 crc kubenswrapper[4733]: I1206 05:47:36.486894 4733 status_manager.go:851] "Failed to get status for pod" podUID="139eeaf9-21f6-4032-9b24-73534a803ca5" pod="openshift-marketplace/redhat-marketplace-tzdr7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-tzdr7\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:36 crc kubenswrapper[4733]: I1206 05:47:36.487487 4733 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:36 crc kubenswrapper[4733]: I1206 05:47:36.487870 4733 status_manager.go:851] "Failed to get status for pod" podUID="76185470-be08-49f9-ab30-59314702bc08" pod="openshift-marketplace/marketplace-operator-79b997595-fbnvh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-fbnvh\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:36 crc kubenswrapper[4733]: I1206 05:47:36.488274 4733 status_manager.go:851] "Failed to get status for pod" podUID="b2eccc5b-1f33-4372-83e4-23e30f607d68" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:36 crc kubenswrapper[4733]: I1206 05:47:36.488815 4733 status_manager.go:851] "Failed to get status for pod" podUID="938df03e-d0d5-4b93-9a31-061262420f18" pod="openshift-marketplace/redhat-operators-bvjmk" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-bvjmk\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:36 crc kubenswrapper[4733]: I1206 05:47:36.489253 4733 status_manager.go:851] "Failed to get status for pod" podUID="71e51a04-4769-45b0-87b8-7292977ec73b" pod="openshift-marketplace/community-operators-jfh8w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jfh8w\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:36 crc kubenswrapper[4733]: I1206 05:47:36.489621 4733 status_manager.go:851] "Failed to get status for pod" podUID="b4bfb477-5389-4827-91d8-cfd61ad2d5f8" pod="openshift-marketplace/certified-operators-lblzc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-lblzc\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:36 crc kubenswrapper[4733]: I1206 05:47:36.489944 4733 status_manager.go:851] "Failed to get status for pod" podUID="477154e1-6166-41c9-beb3-1248e1583324" pod="openshift-marketplace/marketplace-operator-79b997595-xq44c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-xq44c\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:36 crc kubenswrapper[4733]: I1206 05:47:36.490347 4733 status_manager.go:851] "Failed to get status for pod" podUID="139eeaf9-21f6-4032-9b24-73534a803ca5" pod="openshift-marketplace/redhat-marketplace-tzdr7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-tzdr7\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:36 crc kubenswrapper[4733]: I1206 05:47:36.490631 4733 status_manager.go:851] "Failed to get status for pod" podUID="76185470-be08-49f9-ab30-59314702bc08" 
pod="openshift-marketplace/marketplace-operator-79b997595-fbnvh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-fbnvh\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:36 crc kubenswrapper[4733]: I1206 05:47:36.490931 4733 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:36 crc kubenswrapper[4733]: I1206 05:47:36.491201 4733 status_manager.go:851] "Failed to get status for pod" podUID="b2eccc5b-1f33-4372-83e4-23e30f607d68" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:36 crc kubenswrapper[4733]: I1206 05:47:36.492811 4733 status_manager.go:851] "Failed to get status for pod" podUID="938df03e-d0d5-4b93-9a31-061262420f18" pod="openshift-marketplace/redhat-operators-bvjmk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-bvjmk\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:36 crc kubenswrapper[4733]: I1206 05:47:36.493083 4733 status_manager.go:851] "Failed to get status for pod" podUID="71e51a04-4769-45b0-87b8-7292977ec73b" pod="openshift-marketplace/community-operators-jfh8w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jfh8w\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:36 crc kubenswrapper[4733]: I1206 05:47:36.493352 4733 status_manager.go:851] "Failed to get status for pod" 
podUID="b4bfb477-5389-4827-91d8-cfd61ad2d5f8" pod="openshift-marketplace/certified-operators-lblzc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-lblzc\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:36 crc kubenswrapper[4733]: I1206 05:47:36.493614 4733 status_manager.go:851] "Failed to get status for pod" podUID="477154e1-6166-41c9-beb3-1248e1583324" pod="openshift-marketplace/marketplace-operator-79b997595-xq44c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-xq44c\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:36 crc kubenswrapper[4733]: I1206 05:47:36.500537 4733 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="0700e329-54b6-4cfe-b2de-5cee58cf1aa5" Dec 06 05:47:36 crc kubenswrapper[4733]: I1206 05:47:36.500559 4733 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="0700e329-54b6-4cfe-b2de-5cee58cf1aa5" Dec 06 05:47:36 crc kubenswrapper[4733]: E1206 05:47:36.500869 4733 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.211:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 05:47:36 crc kubenswrapper[4733]: I1206 05:47:36.501248 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 05:47:36 crc kubenswrapper[4733]: I1206 05:47:36.579700 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"848694b3bb498782b41f63c76d53fdbd969190678f8f798c4cd04fe69a0490ef"} Dec 06 05:47:36 crc kubenswrapper[4733]: E1206 05:47:36.884257 4733 event.go:368] "Unable to write event (may retry after sleeping)" err="Patch \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events/marketplace-operator-79b997595-xq44c.187e8a2f9ac43332\": dial tcp 192.168.25.211:6443: connect: connection refused" event="&Event{ObjectMeta:{marketplace-operator-79b997595-xq44c.187e8a2f9ac43332 openshift-marketplace 29352 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:marketplace-operator-79b997595-xq44c,UID:477154e1-6166-41c9-beb3-1248e1583324,APIVersion:v1,ResourceVersion:29321,FieldPath:spec.containers{marketplace-operator},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-06 05:47:21 +0000 UTC,LastTimestamp:2025-12-06 05:47:21.440580351 +0000 UTC m=+225.305791463,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 06 05:47:37 crc kubenswrapper[4733]: I1206 05:47:37.586213 4733 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="f8b733a3e5f0053ec06dfc498d80d29d522ac38ad670c7e5aa5128543aa3c318" exitCode=0 Dec 06 05:47:37 crc kubenswrapper[4733]: I1206 05:47:37.586261 4733 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"f8b733a3e5f0053ec06dfc498d80d29d522ac38ad670c7e5aa5128543aa3c318"} Dec 06 05:47:37 crc kubenswrapper[4733]: I1206 05:47:37.586453 4733 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="0700e329-54b6-4cfe-b2de-5cee58cf1aa5" Dec 06 05:47:37 crc kubenswrapper[4733]: I1206 05:47:37.586468 4733 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="0700e329-54b6-4cfe-b2de-5cee58cf1aa5" Dec 06 05:47:37 crc kubenswrapper[4733]: I1206 05:47:37.586757 4733 status_manager.go:851] "Failed to get status for pod" podUID="71e51a04-4769-45b0-87b8-7292977ec73b" pod="openshift-marketplace/community-operators-jfh8w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jfh8w\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:37 crc kubenswrapper[4733]: E1206 05:47:37.586775 4733 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.211:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 05:47:37 crc kubenswrapper[4733]: I1206 05:47:37.586968 4733 status_manager.go:851] "Failed to get status for pod" podUID="b4bfb477-5389-4827-91d8-cfd61ad2d5f8" pod="openshift-marketplace/certified-operators-lblzc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-lblzc\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:37 crc kubenswrapper[4733]: I1206 05:47:37.587196 4733 status_manager.go:851] "Failed to get status for pod" podUID="477154e1-6166-41c9-beb3-1248e1583324" pod="openshift-marketplace/marketplace-operator-79b997595-xq44c" 
err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-xq44c\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:37 crc kubenswrapper[4733]: I1206 05:47:37.587437 4733 status_manager.go:851] "Failed to get status for pod" podUID="139eeaf9-21f6-4032-9b24-73534a803ca5" pod="openshift-marketplace/redhat-marketplace-tzdr7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-tzdr7\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:37 crc kubenswrapper[4733]: I1206 05:47:37.587715 4733 status_manager.go:851] "Failed to get status for pod" podUID="76185470-be08-49f9-ab30-59314702bc08" pod="openshift-marketplace/marketplace-operator-79b997595-fbnvh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-fbnvh\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:37 crc kubenswrapper[4733]: I1206 05:47:37.588365 4733 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:37 crc kubenswrapper[4733]: I1206 05:47:37.588609 4733 status_manager.go:851] "Failed to get status for pod" podUID="b2eccc5b-1f33-4372-83e4-23e30f607d68" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:37 crc kubenswrapper[4733]: I1206 05:47:37.588791 4733 status_manager.go:851] "Failed to get status for pod" podUID="938df03e-d0d5-4b93-9a31-061262420f18" 
pod="openshift-marketplace/redhat-operators-bvjmk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-bvjmk\": dial tcp 192.168.25.211:6443: connect: connection refused" Dec 06 05:47:38 crc kubenswrapper[4733]: I1206 05:47:38.167010 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 05:47:38 crc kubenswrapper[4733]: I1206 05:47:38.594988 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"b148e9dc55445285b4a6aa8761795043ece48483b322ba62279ecb81a46e6ded"} Dec 06 05:47:38 crc kubenswrapper[4733]: I1206 05:47:38.595048 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"0a98a84b2845c8865a68cba16206a106d330558008c1f526ca40c41e1b0650b6"} Dec 06 05:47:38 crc kubenswrapper[4733]: I1206 05:47:38.595060 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f423cf69fd8b0b03434817db5fbf9ed2c773d80ba011e23f1096f61d163b9fcb"} Dec 06 05:47:38 crc kubenswrapper[4733]: I1206 05:47:38.595074 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"5ac08efd25cf056ec9b92500c0e3d0c0c3c75cd5ca7c3e40c4f1677447baa616"} Dec 06 05:47:38 crc kubenswrapper[4733]: I1206 05:47:38.595084 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"8f1bd3b4faeb0e10d89995c17f4e8ef71d03882b1af459056bc323f20de63c17"} Dec 06 05:47:38 crc kubenswrapper[4733]: I1206 05:47:38.595186 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 05:47:38 crc kubenswrapper[4733]: I1206 05:47:38.595331 4733 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="0700e329-54b6-4cfe-b2de-5cee58cf1aa5" Dec 06 05:47:38 crc kubenswrapper[4733]: I1206 05:47:38.595362 4733 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="0700e329-54b6-4cfe-b2de-5cee58cf1aa5" Dec 06 05:47:41 crc kubenswrapper[4733]: I1206 05:47:41.502056 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 05:47:41 crc kubenswrapper[4733]: I1206 05:47:41.502442 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 05:47:41 crc kubenswrapper[4733]: I1206 05:47:41.507090 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 05:47:43 crc kubenswrapper[4733]: I1206 05:47:43.868188 4733 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 05:47:44 crc kubenswrapper[4733]: I1206 05:47:44.056746 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 05:47:44 crc kubenswrapper[4733]: I1206 05:47:44.061263 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 05:47:44 crc kubenswrapper[4733]: I1206 05:47:44.646450 4733 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 05:47:44 crc kubenswrapper[4733]: I1206 05:47:44.646557 4733 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="0700e329-54b6-4cfe-b2de-5cee58cf1aa5" Dec 06 05:47:44 crc kubenswrapper[4733]: I1206 05:47:44.646600 4733 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="0700e329-54b6-4cfe-b2de-5cee58cf1aa5" Dec 06 05:47:44 crc kubenswrapper[4733]: I1206 05:47:44.651693 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 05:47:45 crc kubenswrapper[4733]: I1206 05:47:45.640769 4733 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="0700e329-54b6-4cfe-b2de-5cee58cf1aa5" Dec 06 05:47:45 crc kubenswrapper[4733]: I1206 05:47:45.640804 4733 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="0700e329-54b6-4cfe-b2de-5cee58cf1aa5" Dec 06 05:47:46 crc kubenswrapper[4733]: I1206 05:47:46.491516 4733 scope.go:117] "RemoveContainer" containerID="1bf253de2c04cf844be0e80a846d0dc8382e5c655d8888cdb560bdeaa9fe4b75" Dec 06 05:47:46 crc kubenswrapper[4733]: I1206 05:47:46.502791 4733 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="54a25232-a9b1-4fd9-a3b7-e96a317e055b" Dec 06 05:47:46 crc kubenswrapper[4733]: I1206 05:47:46.652216 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-xq44c_477154e1-6166-41c9-beb3-1248e1583324/marketplace-operator/1.log" Dec 06 05:47:46 crc kubenswrapper[4733]: I1206 05:47:46.652276 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/marketplace-operator-79b997595-xq44c" event={"ID":"477154e1-6166-41c9-beb3-1248e1583324","Type":"ContainerStarted","Data":"7604b6a686002ec721651f3eb90e24f6e6483a54e3fa53117b92c3a8fb1d3a47"} Dec 06 05:47:46 crc kubenswrapper[4733]: I1206 05:47:46.652646 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-xq44c" Dec 06 05:47:46 crc kubenswrapper[4733]: I1206 05:47:46.654389 4733 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-xq44c container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.57:8080/healthz\": dial tcp 10.217.0.57:8080: connect: connection refused" start-of-body= Dec 06 05:47:46 crc kubenswrapper[4733]: I1206 05:47:46.654441 4733 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-xq44c" podUID="477154e1-6166-41c9-beb3-1248e1583324" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.57:8080/healthz\": dial tcp 10.217.0.57:8080: connect: connection refused" Dec 06 05:47:47 crc kubenswrapper[4733]: I1206 05:47:47.659059 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-xq44c_477154e1-6166-41c9-beb3-1248e1583324/marketplace-operator/2.log" Dec 06 05:47:47 crc kubenswrapper[4733]: I1206 05:47:47.660325 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-xq44c_477154e1-6166-41c9-beb3-1248e1583324/marketplace-operator/1.log" Dec 06 05:47:47 crc kubenswrapper[4733]: I1206 05:47:47.660428 4733 generic.go:334] "Generic (PLEG): container finished" podID="477154e1-6166-41c9-beb3-1248e1583324" containerID="7604b6a686002ec721651f3eb90e24f6e6483a54e3fa53117b92c3a8fb1d3a47" exitCode=1 Dec 06 05:47:47 crc kubenswrapper[4733]: I1206 05:47:47.660500 4733 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xq44c" event={"ID":"477154e1-6166-41c9-beb3-1248e1583324","Type":"ContainerDied","Data":"7604b6a686002ec721651f3eb90e24f6e6483a54e3fa53117b92c3a8fb1d3a47"} Dec 06 05:47:47 crc kubenswrapper[4733]: I1206 05:47:47.660569 4733 scope.go:117] "RemoveContainer" containerID="1bf253de2c04cf844be0e80a846d0dc8382e5c655d8888cdb560bdeaa9fe4b75" Dec 06 05:47:47 crc kubenswrapper[4733]: I1206 05:47:47.661264 4733 scope.go:117] "RemoveContainer" containerID="7604b6a686002ec721651f3eb90e24f6e6483a54e3fa53117b92c3a8fb1d3a47" Dec 06 05:47:47 crc kubenswrapper[4733]: E1206 05:47:47.661913 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=marketplace-operator pod=marketplace-operator-79b997595-xq44c_openshift-marketplace(477154e1-6166-41c9-beb3-1248e1583324)\"" pod="openshift-marketplace/marketplace-operator-79b997595-xq44c" podUID="477154e1-6166-41c9-beb3-1248e1583324" Dec 06 05:47:48 crc kubenswrapper[4733]: I1206 05:47:48.667838 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-xq44c_477154e1-6166-41c9-beb3-1248e1583324/marketplace-operator/2.log" Dec 06 05:47:48 crc kubenswrapper[4733]: I1206 05:47:48.668358 4733 scope.go:117] "RemoveContainer" containerID="7604b6a686002ec721651f3eb90e24f6e6483a54e3fa53117b92c3a8fb1d3a47" Dec 06 05:47:48 crc kubenswrapper[4733]: E1206 05:47:48.668707 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=marketplace-operator pod=marketplace-operator-79b997595-xq44c_openshift-marketplace(477154e1-6166-41c9-beb3-1248e1583324)\"" pod="openshift-marketplace/marketplace-operator-79b997595-xq44c" 
podUID="477154e1-6166-41c9-beb3-1248e1583324" Dec 06 05:47:50 crc kubenswrapper[4733]: I1206 05:47:50.580071 4733 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-marketplace/marketplace-operator-79b997595-xq44c" Dec 06 05:47:50 crc kubenswrapper[4733]: I1206 05:47:50.580870 4733 scope.go:117] "RemoveContainer" containerID="7604b6a686002ec721651f3eb90e24f6e6483a54e3fa53117b92c3a8fb1d3a47" Dec 06 05:47:50 crc kubenswrapper[4733]: E1206 05:47:50.582028 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=marketplace-operator pod=marketplace-operator-79b997595-xq44c_openshift-marketplace(477154e1-6166-41c9-beb3-1248e1583324)\"" pod="openshift-marketplace/marketplace-operator-79b997595-xq44c" podUID="477154e1-6166-41c9-beb3-1248e1583324" Dec 06 05:47:54 crc kubenswrapper[4733]: I1206 05:47:54.048435 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 06 05:47:54 crc kubenswrapper[4733]: I1206 05:47:54.205835 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 06 05:47:54 crc kubenswrapper[4733]: I1206 05:47:54.256060 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 06 05:47:54 crc kubenswrapper[4733]: I1206 05:47:54.430638 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 06 05:47:54 crc kubenswrapper[4733]: I1206 05:47:54.845276 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 06 05:47:54 crc kubenswrapper[4733]: I1206 05:47:54.862387 4733 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 06 05:47:54 crc kubenswrapper[4733]: I1206 05:47:54.940607 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 06 05:47:55 crc kubenswrapper[4733]: I1206 05:47:55.524919 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 06 05:47:55 crc kubenswrapper[4733]: I1206 05:47:55.543921 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 06 05:47:55 crc kubenswrapper[4733]: I1206 05:47:55.602937 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 06 05:47:55 crc kubenswrapper[4733]: I1206 05:47:55.801364 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 06 05:47:56 crc kubenswrapper[4733]: I1206 05:47:56.005190 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 06 05:47:56 crc kubenswrapper[4733]: I1206 05:47:56.054282 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 06 05:47:56 crc kubenswrapper[4733]: I1206 05:47:56.061204 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 06 05:47:56 crc kubenswrapper[4733]: I1206 05:47:56.354518 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 06 05:47:56 crc kubenswrapper[4733]: I1206 05:47:56.458875 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 06 05:47:56 crc kubenswrapper[4733]: I1206 05:47:56.480911 4733 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 06 05:47:56 crc kubenswrapper[4733]: I1206 05:47:56.488028 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 06 05:47:56 crc kubenswrapper[4733]: I1206 05:47:56.531028 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 06 05:47:56 crc kubenswrapper[4733]: I1206 05:47:56.677499 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 06 05:47:56 crc kubenswrapper[4733]: I1206 05:47:56.762523 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 06 05:47:56 crc kubenswrapper[4733]: I1206 05:47:56.896370 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 06 05:47:57 crc kubenswrapper[4733]: I1206 05:47:57.009586 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 06 05:47:57 crc kubenswrapper[4733]: I1206 05:47:57.010279 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 06 05:47:57 crc kubenswrapper[4733]: I1206 05:47:57.095639 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 06 05:47:57 crc kubenswrapper[4733]: I1206 05:47:57.170068 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 06 05:47:57 crc kubenswrapper[4733]: I1206 05:47:57.237282 4733 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 06 05:47:57 crc kubenswrapper[4733]: I1206 05:47:57.264570 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 06 05:47:57 crc kubenswrapper[4733]: I1206 05:47:57.283357 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 06 05:47:57 crc kubenswrapper[4733]: I1206 05:47:57.454624 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 06 05:47:57 crc kubenswrapper[4733]: I1206 05:47:57.575357 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 06 05:47:57 crc kubenswrapper[4733]: I1206 05:47:57.593801 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 06 05:47:57 crc kubenswrapper[4733]: I1206 05:47:57.648537 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 06 05:47:57 crc kubenswrapper[4733]: I1206 05:47:57.715272 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 06 05:47:57 crc kubenswrapper[4733]: I1206 05:47:57.749368 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 06 05:47:57 crc kubenswrapper[4733]: I1206 05:47:57.771948 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 06 05:47:57 crc kubenswrapper[4733]: I1206 05:47:57.772026 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 06 05:47:57 crc kubenswrapper[4733]: I1206 05:47:57.783684 4733 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 06 05:47:57 crc kubenswrapper[4733]: I1206 05:47:57.962938 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 06 05:47:58 crc kubenswrapper[4733]: I1206 05:47:58.024911 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 06 05:47:58 crc kubenswrapper[4733]: I1206 05:47:58.120948 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 06 05:47:58 crc kubenswrapper[4733]: I1206 05:47:58.150492 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 06 05:47:58 crc kubenswrapper[4733]: I1206 05:47:58.164492 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 06 05:47:58 crc kubenswrapper[4733]: I1206 05:47:58.300713 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 06 05:47:58 crc kubenswrapper[4733]: I1206 05:47:58.396718 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 06 05:47:58 crc kubenswrapper[4733]: I1206 05:47:58.401124 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 06 05:47:58 crc kubenswrapper[4733]: I1206 05:47:58.410041 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 06 05:47:58 crc kubenswrapper[4733]: I1206 05:47:58.420292 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 06 05:47:58 crc kubenswrapper[4733]: I1206 05:47:58.420919 4733 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 06 05:47:58 crc kubenswrapper[4733]: I1206 05:47:58.435045 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 06 05:47:58 crc kubenswrapper[4733]: I1206 05:47:58.456425 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 06 05:47:58 crc kubenswrapper[4733]: I1206 05:47:58.502852 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 06 05:47:58 crc kubenswrapper[4733]: I1206 05:47:58.537038 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 06 05:47:58 crc kubenswrapper[4733]: I1206 05:47:58.537053 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 06 05:47:58 crc kubenswrapper[4733]: I1206 05:47:58.538789 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 06 05:47:58 crc kubenswrapper[4733]: I1206 05:47:58.581197 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 06 05:47:58 crc kubenswrapper[4733]: I1206 05:47:58.665536 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 06 05:47:58 crc kubenswrapper[4733]: I1206 05:47:58.694818 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 06 05:47:58 crc kubenswrapper[4733]: I1206 05:47:58.715893 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 
06 05:47:58 crc kubenswrapper[4733]: I1206 05:47:58.726030 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 06 05:47:58 crc kubenswrapper[4733]: I1206 05:47:58.772315 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 06 05:47:58 crc kubenswrapper[4733]: I1206 05:47:58.880617 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 06 05:47:58 crc kubenswrapper[4733]: I1206 05:47:58.903685 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 06 05:47:59 crc kubenswrapper[4733]: I1206 05:47:59.003426 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 06 05:47:59 crc kubenswrapper[4733]: I1206 05:47:59.032470 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 06 05:47:59 crc kubenswrapper[4733]: I1206 05:47:59.040767 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 06 05:47:59 crc kubenswrapper[4733]: I1206 05:47:59.111386 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 06 05:47:59 crc kubenswrapper[4733]: I1206 05:47:59.113624 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 06 05:47:59 crc kubenswrapper[4733]: I1206 05:47:59.155139 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 06 05:47:59 crc kubenswrapper[4733]: I1206 05:47:59.241079 4733 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-etcd-operator"/"etcd-client" Dec 06 05:47:59 crc kubenswrapper[4733]: I1206 05:47:59.248791 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 06 05:47:59 crc kubenswrapper[4733]: I1206 05:47:59.260821 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 06 05:47:59 crc kubenswrapper[4733]: I1206 05:47:59.270939 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 06 05:47:59 crc kubenswrapper[4733]: I1206 05:47:59.335422 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 06 05:47:59 crc kubenswrapper[4733]: I1206 05:47:59.406275 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 06 05:47:59 crc kubenswrapper[4733]: I1206 05:47:59.498671 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 06 05:47:59 crc kubenswrapper[4733]: I1206 05:47:59.537222 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 06 05:47:59 crc kubenswrapper[4733]: I1206 05:47:59.612608 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 06 05:47:59 crc kubenswrapper[4733]: I1206 05:47:59.627060 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 06 05:47:59 crc kubenswrapper[4733]: I1206 05:47:59.628926 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 06 
05:47:59 crc kubenswrapper[4733]: I1206 05:47:59.645531 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 06 05:47:59 crc kubenswrapper[4733]: I1206 05:47:59.650215 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 06 05:47:59 crc kubenswrapper[4733]: I1206 05:47:59.668408 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 06 05:47:59 crc kubenswrapper[4733]: I1206 05:47:59.697839 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 06 05:47:59 crc kubenswrapper[4733]: I1206 05:47:59.717147 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 06 05:47:59 crc kubenswrapper[4733]: I1206 05:47:59.959706 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 06 05:47:59 crc kubenswrapper[4733]: I1206 05:47:59.962672 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 06 05:47:59 crc kubenswrapper[4733]: I1206 05:47:59.970966 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 06 05:48:00 crc kubenswrapper[4733]: I1206 05:48:00.020995 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 06 05:48:00 crc kubenswrapper[4733]: I1206 05:48:00.122018 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 06 05:48:00 crc kubenswrapper[4733]: I1206 05:48:00.156896 4733 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 06 05:48:00 crc kubenswrapper[4733]: I1206 05:48:00.175497 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 06 05:48:00 crc kubenswrapper[4733]: I1206 05:48:00.281951 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 06 05:48:00 crc kubenswrapper[4733]: I1206 05:48:00.294951 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 06 05:48:00 crc kubenswrapper[4733]: I1206 05:48:00.296893 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 06 05:48:00 crc kubenswrapper[4733]: I1206 05:48:00.331136 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 06 05:48:00 crc kubenswrapper[4733]: I1206 05:48:00.332616 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 06 05:48:00 crc kubenswrapper[4733]: I1206 05:48:00.375833 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 06 05:48:00 crc kubenswrapper[4733]: I1206 05:48:00.408970 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 06 05:48:00 crc kubenswrapper[4733]: I1206 05:48:00.446295 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 06 05:48:00 crc kubenswrapper[4733]: I1206 05:48:00.616871 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 06 05:48:00 crc 
kubenswrapper[4733]: I1206 05:48:00.637161 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 06 05:48:00 crc kubenswrapper[4733]: I1206 05:48:00.649124 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 06 05:48:00 crc kubenswrapper[4733]: I1206 05:48:00.657077 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 06 05:48:00 crc kubenswrapper[4733]: I1206 05:48:00.779208 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 06 05:48:01 crc kubenswrapper[4733]: I1206 05:48:01.031588 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 06 05:48:01 crc kubenswrapper[4733]: I1206 05:48:01.085426 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 06 05:48:01 crc kubenswrapper[4733]: I1206 05:48:01.330098 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 06 05:48:01 crc kubenswrapper[4733]: I1206 05:48:01.414218 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 06 05:48:01 crc kubenswrapper[4733]: I1206 05:48:01.450579 4733 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 06 05:48:01 crc kubenswrapper[4733]: I1206 05:48:01.606038 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 06 05:48:01 crc kubenswrapper[4733]: I1206 05:48:01.606545 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 06 
05:48:01 crc kubenswrapper[4733]: I1206 05:48:01.612218 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 06 05:48:01 crc kubenswrapper[4733]: I1206 05:48:01.648484 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 06 05:48:01 crc kubenswrapper[4733]: I1206 05:48:01.656518 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 06 05:48:01 crc kubenswrapper[4733]: I1206 05:48:01.714232 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 06 05:48:01 crc kubenswrapper[4733]: I1206 05:48:01.793738 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 06 05:48:01 crc kubenswrapper[4733]: I1206 05:48:01.912441 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 06 05:48:01 crc kubenswrapper[4733]: I1206 05:48:01.955668 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 06 05:48:01 crc kubenswrapper[4733]: I1206 05:48:01.974464 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 06 05:48:02 crc kubenswrapper[4733]: I1206 05:48:02.047453 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 06 05:48:02 crc kubenswrapper[4733]: I1206 05:48:02.113771 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 06 05:48:02 crc kubenswrapper[4733]: I1206 05:48:02.127629 4733 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 06 05:48:02 crc kubenswrapper[4733]: I1206 05:48:02.150275 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 06 05:48:02 crc kubenswrapper[4733]: I1206 05:48:02.170780 4733 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 06 05:48:02 crc kubenswrapper[4733]: I1206 05:48:02.174677 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jfh8w","openshift-marketplace/certified-operators-lblzc","openshift-marketplace/redhat-marketplace-tzdr7","openshift-marketplace/marketplace-operator-79b997595-fbnvh","openshift-marketplace/redhat-operators-bvjmk","openshift-kube-apiserver/kube-apiserver-crc"] Dec 06 05:48:02 crc kubenswrapper[4733]: I1206 05:48:02.174744 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 06 05:48:02 crc kubenswrapper[4733]: I1206 05:48:02.178358 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 05:48:02 crc kubenswrapper[4733]: I1206 05:48:02.188983 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=19.188974088 podStartE2EDuration="19.188974088s" podCreationTimestamp="2025-12-06 05:47:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:48:02.188136743 +0000 UTC m=+266.053347854" watchObservedRunningTime="2025-12-06 05:48:02.188974088 +0000 UTC m=+266.054185199" Dec 06 05:48:02 crc kubenswrapper[4733]: I1206 05:48:02.214842 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 06 05:48:02 crc 
kubenswrapper[4733]: I1206 05:48:02.300667 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 06 05:48:02 crc kubenswrapper[4733]: I1206 05:48:02.309570 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 06 05:48:02 crc kubenswrapper[4733]: I1206 05:48:02.335946 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 06 05:48:02 crc kubenswrapper[4733]: I1206 05:48:02.342906 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 06 05:48:02 crc kubenswrapper[4733]: I1206 05:48:02.416900 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 06 05:48:02 crc kubenswrapper[4733]: I1206 05:48:02.438510 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 06 05:48:02 crc kubenswrapper[4733]: I1206 05:48:02.459465 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 06 05:48:02 crc kubenswrapper[4733]: I1206 05:48:02.491699 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="139eeaf9-21f6-4032-9b24-73534a803ca5" path="/var/lib/kubelet/pods/139eeaf9-21f6-4032-9b24-73534a803ca5/volumes" Dec 06 05:48:02 crc kubenswrapper[4733]: I1206 05:48:02.492588 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71e51a04-4769-45b0-87b8-7292977ec73b" path="/var/lib/kubelet/pods/71e51a04-4769-45b0-87b8-7292977ec73b/volumes" Dec 06 05:48:02 crc kubenswrapper[4733]: I1206 05:48:02.493286 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="76185470-be08-49f9-ab30-59314702bc08" path="/var/lib/kubelet/pods/76185470-be08-49f9-ab30-59314702bc08/volumes" Dec 06 05:48:02 crc kubenswrapper[4733]: I1206 05:48:02.494273 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="938df03e-d0d5-4b93-9a31-061262420f18" path="/var/lib/kubelet/pods/938df03e-d0d5-4b93-9a31-061262420f18/volumes" Dec 06 05:48:02 crc kubenswrapper[4733]: I1206 05:48:02.494903 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4bfb477-5389-4827-91d8-cfd61ad2d5f8" path="/var/lib/kubelet/pods/b4bfb477-5389-4827-91d8-cfd61ad2d5f8/volumes" Dec 06 05:48:02 crc kubenswrapper[4733]: I1206 05:48:02.598529 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 06 05:48:02 crc kubenswrapper[4733]: I1206 05:48:02.817216 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 06 05:48:02 crc kubenswrapper[4733]: I1206 05:48:02.886842 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 06 05:48:02 crc kubenswrapper[4733]: I1206 05:48:02.975496 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 06 05:48:02 crc kubenswrapper[4733]: I1206 05:48:02.994251 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 06 05:48:02 crc kubenswrapper[4733]: I1206 05:48:02.995104 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 06 05:48:03 crc kubenswrapper[4733]: I1206 05:48:03.118689 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 06 
05:48:03 crc kubenswrapper[4733]: I1206 05:48:03.131147 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 06 05:48:03 crc kubenswrapper[4733]: I1206 05:48:03.165437 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 06 05:48:03 crc kubenswrapper[4733]: I1206 05:48:03.209330 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 06 05:48:03 crc kubenswrapper[4733]: I1206 05:48:03.216493 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 06 05:48:03 crc kubenswrapper[4733]: I1206 05:48:03.235377 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 06 05:48:03 crc kubenswrapper[4733]: I1206 05:48:03.235575 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 06 05:48:03 crc kubenswrapper[4733]: I1206 05:48:03.254910 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 06 05:48:03 crc kubenswrapper[4733]: I1206 05:48:03.287883 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 06 05:48:03 crc kubenswrapper[4733]: I1206 05:48:03.391139 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 06 05:48:03 crc kubenswrapper[4733]: I1206 05:48:03.446808 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 06 05:48:03 crc kubenswrapper[4733]: I1206 05:48:03.478889 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 06 05:48:03 
crc kubenswrapper[4733]: I1206 05:48:03.526071 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 06 05:48:03 crc kubenswrapper[4733]: I1206 05:48:03.536367 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 06 05:48:03 crc kubenswrapper[4733]: I1206 05:48:03.571241 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 06 05:48:03 crc kubenswrapper[4733]: I1206 05:48:03.653639 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 06 05:48:03 crc kubenswrapper[4733]: I1206 05:48:03.741256 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 06 05:48:03 crc kubenswrapper[4733]: I1206 05:48:03.759921 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 06 05:48:03 crc kubenswrapper[4733]: I1206 05:48:03.777108 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 06 05:48:03 crc kubenswrapper[4733]: I1206 05:48:03.872701 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 06 05:48:03 crc kubenswrapper[4733]: I1206 05:48:03.997869 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 06 05:48:04 crc kubenswrapper[4733]: I1206 05:48:04.066182 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 06 05:48:04 crc kubenswrapper[4733]: I1206 05:48:04.100168 4733 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 06 05:48:04 crc kubenswrapper[4733]: I1206 05:48:04.148907 4733 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 06 05:48:04 crc kubenswrapper[4733]: I1206 05:48:04.360941 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 06 05:48:04 crc kubenswrapper[4733]: I1206 05:48:04.484833 4733 scope.go:117] "RemoveContainer" containerID="7604b6a686002ec721651f3eb90e24f6e6483a54e3fa53117b92c3a8fb1d3a47" Dec 06 05:48:04 crc kubenswrapper[4733]: E1206 05:48:04.485117 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=marketplace-operator pod=marketplace-operator-79b997595-xq44c_openshift-marketplace(477154e1-6166-41c9-beb3-1248e1583324)\"" pod="openshift-marketplace/marketplace-operator-79b997595-xq44c" podUID="477154e1-6166-41c9-beb3-1248e1583324" Dec 06 05:48:04 crc kubenswrapper[4733]: I1206 05:48:04.819701 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 06 05:48:04 crc kubenswrapper[4733]: I1206 05:48:04.864589 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 06 05:48:04 crc kubenswrapper[4733]: I1206 05:48:04.887069 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 06 05:48:04 crc kubenswrapper[4733]: I1206 05:48:04.893038 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 06 05:48:04 crc kubenswrapper[4733]: I1206 05:48:04.988280 4733 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 06 05:48:05 crc kubenswrapper[4733]: I1206 05:48:05.086002 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 06 05:48:05 crc kubenswrapper[4733]: I1206 05:48:05.244380 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 06 05:48:05 crc kubenswrapper[4733]: I1206 05:48:05.265108 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 06 05:48:05 crc kubenswrapper[4733]: I1206 05:48:05.281339 4733 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 06 05:48:05 crc kubenswrapper[4733]: I1206 05:48:05.281562 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://7c302e38a634570988acc4b3ff2cece0ef3b37b38b5365f441d438f1590856b4" gracePeriod=5 Dec 06 05:48:05 crc kubenswrapper[4733]: I1206 05:48:05.295925 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 06 05:48:05 crc kubenswrapper[4733]: I1206 05:48:05.328945 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 06 05:48:05 crc kubenswrapper[4733]: I1206 05:48:05.340745 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 06 05:48:05 crc kubenswrapper[4733]: I1206 05:48:05.340979 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 06 05:48:05 crc 
kubenswrapper[4733]: I1206 05:48:05.357726 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 06 05:48:05 crc kubenswrapper[4733]: I1206 05:48:05.439695 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 06 05:48:05 crc kubenswrapper[4733]: I1206 05:48:05.485547 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 06 05:48:05 crc kubenswrapper[4733]: I1206 05:48:05.485930 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 06 05:48:05 crc kubenswrapper[4733]: I1206 05:48:05.565830 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 06 05:48:05 crc kubenswrapper[4733]: I1206 05:48:05.675166 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 06 05:48:05 crc kubenswrapper[4733]: I1206 05:48:05.739757 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 06 05:48:05 crc kubenswrapper[4733]: I1206 05:48:05.767393 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 06 05:48:05 crc kubenswrapper[4733]: I1206 05:48:05.834462 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 06 05:48:05 crc kubenswrapper[4733]: I1206 05:48:05.852842 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 06 05:48:05 crc kubenswrapper[4733]: I1206 05:48:05.988430 4733 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 06 05:48:06 crc kubenswrapper[4733]: I1206 05:48:06.074098 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 06 05:48:06 crc kubenswrapper[4733]: I1206 05:48:06.084928 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 06 05:48:06 crc kubenswrapper[4733]: I1206 05:48:06.086554 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 06 05:48:06 crc kubenswrapper[4733]: I1206 05:48:06.090598 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 06 05:48:06 crc kubenswrapper[4733]: I1206 05:48:06.101875 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 06 05:48:06 crc kubenswrapper[4733]: I1206 05:48:06.119116 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 06 05:48:06 crc kubenswrapper[4733]: I1206 05:48:06.128159 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 06 05:48:06 crc kubenswrapper[4733]: I1206 05:48:06.134638 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 06 05:48:06 crc kubenswrapper[4733]: I1206 05:48:06.188019 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 06 05:48:06 crc kubenswrapper[4733]: I1206 05:48:06.255797 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 06 05:48:06 crc kubenswrapper[4733]: I1206 
05:48:06.260107 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 06 05:48:06 crc kubenswrapper[4733]: I1206 05:48:06.281488 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 06 05:48:06 crc kubenswrapper[4733]: I1206 05:48:06.338715 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 06 05:48:06 crc kubenswrapper[4733]: I1206 05:48:06.375175 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 06 05:48:06 crc kubenswrapper[4733]: I1206 05:48:06.414942 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 06 05:48:06 crc kubenswrapper[4733]: I1206 05:48:06.476344 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 06 05:48:06 crc kubenswrapper[4733]: I1206 05:48:06.512857 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 06 05:48:06 crc kubenswrapper[4733]: I1206 05:48:06.561853 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 06 05:48:06 crc kubenswrapper[4733]: I1206 05:48:06.599069 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 06 05:48:06 crc kubenswrapper[4733]: I1206 05:48:06.698173 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 06 05:48:06 crc kubenswrapper[4733]: I1206 05:48:06.743087 4733 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 06 05:48:06 crc kubenswrapper[4733]: I1206 05:48:06.760082 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 06 05:48:06 crc kubenswrapper[4733]: I1206 05:48:06.789421 4733 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 06 05:48:06 crc kubenswrapper[4733]: I1206 05:48:06.882122 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 06 05:48:06 crc kubenswrapper[4733]: I1206 05:48:06.918709 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 06 05:48:06 crc kubenswrapper[4733]: I1206 05:48:06.929120 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 06 05:48:07 crc kubenswrapper[4733]: I1206 05:48:07.024472 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 06 05:48:07 crc kubenswrapper[4733]: I1206 05:48:07.126331 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 06 05:48:07 crc kubenswrapper[4733]: I1206 05:48:07.239171 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 06 05:48:07 crc kubenswrapper[4733]: I1206 05:48:07.279930 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 06 05:48:07 crc kubenswrapper[4733]: I1206 05:48:07.351679 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 06 05:48:07 crc kubenswrapper[4733]: 
I1206 05:48:07.645985 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 06 05:48:07 crc kubenswrapper[4733]: I1206 05:48:07.679647 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 06 05:48:07 crc kubenswrapper[4733]: I1206 05:48:07.822461 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 06 05:48:07 crc kubenswrapper[4733]: I1206 05:48:07.840630 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 06 05:48:07 crc kubenswrapper[4733]: I1206 05:48:07.987478 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 06 05:48:08 crc kubenswrapper[4733]: I1206 05:48:08.093635 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 06 05:48:08 crc kubenswrapper[4733]: I1206 05:48:08.153798 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 06 05:48:08 crc kubenswrapper[4733]: I1206 05:48:08.274085 4733 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 06 05:48:08 crc kubenswrapper[4733]: I1206 05:48:08.530965 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 06 05:48:08 crc kubenswrapper[4733]: I1206 05:48:08.556296 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 06 05:48:08 crc kubenswrapper[4733]: I1206 05:48:08.587738 4733 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 06 05:48:09 crc kubenswrapper[4733]: I1206 05:48:09.065749 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 06 05:48:09 crc kubenswrapper[4733]: I1206 05:48:09.089738 4733 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 06 05:48:09 crc kubenswrapper[4733]: I1206 05:48:09.176208 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 06 05:48:09 crc kubenswrapper[4733]: I1206 05:48:09.403987 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 06 05:48:09 crc kubenswrapper[4733]: I1206 05:48:09.474010 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 06 05:48:09 crc kubenswrapper[4733]: I1206 05:48:09.977081 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 06 05:48:10 crc kubenswrapper[4733]: I1206 05:48:10.211179 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 06 05:48:10 crc kubenswrapper[4733]: I1206 05:48:10.781378 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 06 05:48:10 crc kubenswrapper[4733]: I1206 05:48:10.781782 4733 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="7c302e38a634570988acc4b3ff2cece0ef3b37b38b5365f441d438f1590856b4" exitCode=137 Dec 06 05:48:11 crc kubenswrapper[4733]: I1206 05:48:11.593211 4733 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 06 05:48:11 crc kubenswrapper[4733]: I1206 05:48:11.593294 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 05:48:11 crc kubenswrapper[4733]: I1206 05:48:11.689267 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 06 05:48:11 crc kubenswrapper[4733]: I1206 05:48:11.689475 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 06 05:48:11 crc kubenswrapper[4733]: I1206 05:48:11.689597 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 06 05:48:11 crc kubenswrapper[4733]: I1206 05:48:11.689734 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 06 05:48:11 crc kubenswrapper[4733]: I1206 05:48:11.689916 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 06 05:48:11 crc kubenswrapper[4733]: I1206 05:48:11.689408 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 05:48:11 crc kubenswrapper[4733]: I1206 05:48:11.689540 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 05:48:11 crc kubenswrapper[4733]: I1206 05:48:11.689788 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 05:48:11 crc kubenswrapper[4733]: I1206 05:48:11.689939 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 05:48:11 crc kubenswrapper[4733]: I1206 05:48:11.690540 4733 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Dec 06 05:48:11 crc kubenswrapper[4733]: I1206 05:48:11.690617 4733 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 06 05:48:11 crc kubenswrapper[4733]: I1206 05:48:11.690677 4733 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 06 05:48:11 crc kubenswrapper[4733]: I1206 05:48:11.690736 4733 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Dec 06 05:48:11 crc kubenswrapper[4733]: I1206 05:48:11.698506 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 05:48:11 crc kubenswrapper[4733]: I1206 05:48:11.788611 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 06 05:48:11 crc kubenswrapper[4733]: I1206 05:48:11.788682 4733 scope.go:117] "RemoveContainer" containerID="7c302e38a634570988acc4b3ff2cece0ef3b37b38b5365f441d438f1590856b4" Dec 06 05:48:11 crc kubenswrapper[4733]: I1206 05:48:11.788748 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 05:48:11 crc kubenswrapper[4733]: I1206 05:48:11.791157 4733 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 06 05:48:12 crc kubenswrapper[4733]: I1206 05:48:12.490153 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Dec 06 05:48:15 crc kubenswrapper[4733]: I1206 05:48:15.485114 4733 scope.go:117] "RemoveContainer" containerID="7604b6a686002ec721651f3eb90e24f6e6483a54e3fa53117b92c3a8fb1d3a47" Dec 06 05:48:15 crc kubenswrapper[4733]: I1206 05:48:15.813451 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-xq44c_477154e1-6166-41c9-beb3-1248e1583324/marketplace-operator/2.log" Dec 06 05:48:15 crc kubenswrapper[4733]: I1206 05:48:15.813774 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xq44c" event={"ID":"477154e1-6166-41c9-beb3-1248e1583324","Type":"ContainerStarted","Data":"278eb6cc1d3a219d5eda2fa6b3ad3a368eaaa443d14148a7a578bdbc176bb24f"} Dec 06 05:48:15 crc 
kubenswrapper[4733]: I1206 05:48:15.814185 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-xq44c" Dec 06 05:48:15 crc kubenswrapper[4733]: I1206 05:48:15.815005 4733 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-xq44c container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.57:8080/healthz\": dial tcp 10.217.0.57:8080: connect: connection refused" start-of-body= Dec 06 05:48:15 crc kubenswrapper[4733]: I1206 05:48:15.815057 4733 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-xq44c" podUID="477154e1-6166-41c9-beb3-1248e1583324" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.57:8080/healthz\": dial tcp 10.217.0.57:8080: connect: connection refused" Dec 06 05:48:15 crc kubenswrapper[4733]: I1206 05:48:15.831440 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-xq44c" podStartSLOduration=55.831428185 podStartE2EDuration="55.831428185s" podCreationTimestamp="2025-12-06 05:47:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:47:46.668663647 +0000 UTC m=+250.533874757" watchObservedRunningTime="2025-12-06 05:48:15.831428185 +0000 UTC m=+279.696639297" Dec 06 05:48:16 crc kubenswrapper[4733]: I1206 05:48:16.823051 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-xq44c" Dec 06 05:48:23 crc kubenswrapper[4733]: I1206 05:48:23.577158 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 06 05:48:26 crc kubenswrapper[4733]: I1206 05:48:26.206481 4733 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 06 05:48:26 crc kubenswrapper[4733]: I1206 05:48:26.507613 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 06 05:48:27 crc kubenswrapper[4733]: I1206 05:48:27.021406 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 06 05:48:28 crc kubenswrapper[4733]: I1206 05:48:28.581245 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 06 05:48:31 crc kubenswrapper[4733]: I1206 05:48:31.567202 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 06 05:48:32 crc kubenswrapper[4733]: I1206 05:48:32.158923 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-jpw8l"] Dec 06 05:48:32 crc kubenswrapper[4733]: I1206 05:48:32.159201 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-jpw8l" podUID="bb0fb709-5a66-42a8-aad4-c405502ce542" containerName="controller-manager" containerID="cri-o://c273b5dfc7dc7e2158e274d68422ca3cfaeb85c6520c585738a6e88df138e8db" gracePeriod=30 Dec 06 05:48:32 crc kubenswrapper[4733]: I1206 05:48:32.258381 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-fx652"] Dec 06 05:48:32 crc kubenswrapper[4733]: I1206 05:48:32.258744 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fx652" podUID="197ee618-405f-4f94-a618-da74488f0d23" containerName="route-controller-manager" 
containerID="cri-o://e294653a23e8d18b8ef6ae2c3e5f1be65b24de905209163449b9e78cc7b31e8b" gracePeriod=30 Dec 06 05:48:32 crc kubenswrapper[4733]: I1206 05:48:32.460023 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-jpw8l" Dec 06 05:48:32 crc kubenswrapper[4733]: I1206 05:48:32.555383 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fx652" Dec 06 05:48:32 crc kubenswrapper[4733]: I1206 05:48:32.622969 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb0fb709-5a66-42a8-aad4-c405502ce542-serving-cert\") pod \"bb0fb709-5a66-42a8-aad4-c405502ce542\" (UID: \"bb0fb709-5a66-42a8-aad4-c405502ce542\") " Dec 06 05:48:32 crc kubenswrapper[4733]: I1206 05:48:32.623101 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bb0fb709-5a66-42a8-aad4-c405502ce542-client-ca\") pod \"bb0fb709-5a66-42a8-aad4-c405502ce542\" (UID: \"bb0fb709-5a66-42a8-aad4-c405502ce542\") " Dec 06 05:48:32 crc kubenswrapper[4733]: I1206 05:48:32.623190 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bb0fb709-5a66-42a8-aad4-c405502ce542-proxy-ca-bundles\") pod \"bb0fb709-5a66-42a8-aad4-c405502ce542\" (UID: \"bb0fb709-5a66-42a8-aad4-c405502ce542\") " Dec 06 05:48:32 crc kubenswrapper[4733]: I1206 05:48:32.623244 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkvp8\" (UniqueName: \"kubernetes.io/projected/bb0fb709-5a66-42a8-aad4-c405502ce542-kube-api-access-qkvp8\") pod \"bb0fb709-5a66-42a8-aad4-c405502ce542\" (UID: \"bb0fb709-5a66-42a8-aad4-c405502ce542\") " Dec 06 05:48:32 crc 
kubenswrapper[4733]: I1206 05:48:32.623267 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb0fb709-5a66-42a8-aad4-c405502ce542-config\") pod \"bb0fb709-5a66-42a8-aad4-c405502ce542\" (UID: \"bb0fb709-5a66-42a8-aad4-c405502ce542\") " Dec 06 05:48:32 crc kubenswrapper[4733]: I1206 05:48:32.624196 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb0fb709-5a66-42a8-aad4-c405502ce542-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "bb0fb709-5a66-42a8-aad4-c405502ce542" (UID: "bb0fb709-5a66-42a8-aad4-c405502ce542"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:48:32 crc kubenswrapper[4733]: I1206 05:48:32.624545 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb0fb709-5a66-42a8-aad4-c405502ce542-client-ca" (OuterVolumeSpecName: "client-ca") pod "bb0fb709-5a66-42a8-aad4-c405502ce542" (UID: "bb0fb709-5a66-42a8-aad4-c405502ce542"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:48:32 crc kubenswrapper[4733]: I1206 05:48:32.624647 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb0fb709-5a66-42a8-aad4-c405502ce542-config" (OuterVolumeSpecName: "config") pod "bb0fb709-5a66-42a8-aad4-c405502ce542" (UID: "bb0fb709-5a66-42a8-aad4-c405502ce542"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:48:32 crc kubenswrapper[4733]: I1206 05:48:32.629064 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb0fb709-5a66-42a8-aad4-c405502ce542-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bb0fb709-5a66-42a8-aad4-c405502ce542" (UID: "bb0fb709-5a66-42a8-aad4-c405502ce542"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:48:32 crc kubenswrapper[4733]: I1206 05:48:32.629241 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb0fb709-5a66-42a8-aad4-c405502ce542-kube-api-access-qkvp8" (OuterVolumeSpecName: "kube-api-access-qkvp8") pod "bb0fb709-5a66-42a8-aad4-c405502ce542" (UID: "bb0fb709-5a66-42a8-aad4-c405502ce542"). InnerVolumeSpecName "kube-api-access-qkvp8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:48:32 crc kubenswrapper[4733]: I1206 05:48:32.725001 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4km5q\" (UniqueName: \"kubernetes.io/projected/197ee618-405f-4f94-a618-da74488f0d23-kube-api-access-4km5q\") pod \"197ee618-405f-4f94-a618-da74488f0d23\" (UID: \"197ee618-405f-4f94-a618-da74488f0d23\") " Dec 06 05:48:32 crc kubenswrapper[4733]: I1206 05:48:32.725408 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/197ee618-405f-4f94-a618-da74488f0d23-serving-cert\") pod \"197ee618-405f-4f94-a618-da74488f0d23\" (UID: \"197ee618-405f-4f94-a618-da74488f0d23\") " Dec 06 05:48:32 crc kubenswrapper[4733]: I1206 05:48:32.725444 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/197ee618-405f-4f94-a618-da74488f0d23-config\") pod \"197ee618-405f-4f94-a618-da74488f0d23\" (UID: \"197ee618-405f-4f94-a618-da74488f0d23\") " Dec 06 05:48:32 crc kubenswrapper[4733]: I1206 05:48:32.725486 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/197ee618-405f-4f94-a618-da74488f0d23-client-ca\") pod \"197ee618-405f-4f94-a618-da74488f0d23\" (UID: \"197ee618-405f-4f94-a618-da74488f0d23\") " Dec 06 05:48:32 crc kubenswrapper[4733]: I1206 05:48:32.725782 4733 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb0fb709-5a66-42a8-aad4-c405502ce542-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 05:48:32 crc kubenswrapper[4733]: I1206 05:48:32.725795 4733 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bb0fb709-5a66-42a8-aad4-c405502ce542-client-ca\") on node \"crc\" DevicePath \"\"" Dec 06 05:48:32 crc kubenswrapper[4733]: I1206 05:48:32.725806 4733 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bb0fb709-5a66-42a8-aad4-c405502ce542-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 06 05:48:32 crc kubenswrapper[4733]: I1206 05:48:32.725817 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkvp8\" (UniqueName: \"kubernetes.io/projected/bb0fb709-5a66-42a8-aad4-c405502ce542-kube-api-access-qkvp8\") on node \"crc\" DevicePath \"\"" Dec 06 05:48:32 crc kubenswrapper[4733]: I1206 05:48:32.725827 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb0fb709-5a66-42a8-aad4-c405502ce542-config\") on node \"crc\" DevicePath \"\"" Dec 06 05:48:32 crc kubenswrapper[4733]: I1206 05:48:32.726159 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/197ee618-405f-4f94-a618-da74488f0d23-config" (OuterVolumeSpecName: "config") pod "197ee618-405f-4f94-a618-da74488f0d23" (UID: "197ee618-405f-4f94-a618-da74488f0d23"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:48:32 crc kubenswrapper[4733]: I1206 05:48:32.726193 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/197ee618-405f-4f94-a618-da74488f0d23-client-ca" (OuterVolumeSpecName: "client-ca") pod "197ee618-405f-4f94-a618-da74488f0d23" (UID: "197ee618-405f-4f94-a618-da74488f0d23"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:48:32 crc kubenswrapper[4733]: I1206 05:48:32.729757 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/197ee618-405f-4f94-a618-da74488f0d23-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "197ee618-405f-4f94-a618-da74488f0d23" (UID: "197ee618-405f-4f94-a618-da74488f0d23"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:48:32 crc kubenswrapper[4733]: I1206 05:48:32.729997 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/197ee618-405f-4f94-a618-da74488f0d23-kube-api-access-4km5q" (OuterVolumeSpecName: "kube-api-access-4km5q") pod "197ee618-405f-4f94-a618-da74488f0d23" (UID: "197ee618-405f-4f94-a618-da74488f0d23"). InnerVolumeSpecName "kube-api-access-4km5q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:48:32 crc kubenswrapper[4733]: I1206 05:48:32.827331 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/197ee618-405f-4f94-a618-da74488f0d23-config\") on node \"crc\" DevicePath \"\"" Dec 06 05:48:32 crc kubenswrapper[4733]: I1206 05:48:32.827355 4733 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/197ee618-405f-4f94-a618-da74488f0d23-client-ca\") on node \"crc\" DevicePath \"\"" Dec 06 05:48:32 crc kubenswrapper[4733]: I1206 05:48:32.827368 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4km5q\" (UniqueName: \"kubernetes.io/projected/197ee618-405f-4f94-a618-da74488f0d23-kube-api-access-4km5q\") on node \"crc\" DevicePath \"\"" Dec 06 05:48:32 crc kubenswrapper[4733]: I1206 05:48:32.827380 4733 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/197ee618-405f-4f94-a618-da74488f0d23-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 05:48:32 crc kubenswrapper[4733]: I1206 05:48:32.902531 4733 generic.go:334] "Generic (PLEG): container finished" podID="197ee618-405f-4f94-a618-da74488f0d23" containerID="e294653a23e8d18b8ef6ae2c3e5f1be65b24de905209163449b9e78cc7b31e8b" exitCode=0 Dec 06 05:48:32 crc kubenswrapper[4733]: I1206 05:48:32.902610 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fx652" event={"ID":"197ee618-405f-4f94-a618-da74488f0d23","Type":"ContainerDied","Data":"e294653a23e8d18b8ef6ae2c3e5f1be65b24de905209163449b9e78cc7b31e8b"} Dec 06 05:48:32 crc kubenswrapper[4733]: I1206 05:48:32.902654 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fx652" 
event={"ID":"197ee618-405f-4f94-a618-da74488f0d23","Type":"ContainerDied","Data":"1179cb4b594b5ea8f8010b670e768c79f3a339a0fbd543b3caca6620a99711f2"} Dec 06 05:48:32 crc kubenswrapper[4733]: I1206 05:48:32.902683 4733 scope.go:117] "RemoveContainer" containerID="e294653a23e8d18b8ef6ae2c3e5f1be65b24de905209163449b9e78cc7b31e8b" Dec 06 05:48:32 crc kubenswrapper[4733]: I1206 05:48:32.902827 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fx652" Dec 06 05:48:32 crc kubenswrapper[4733]: I1206 05:48:32.905182 4733 generic.go:334] "Generic (PLEG): container finished" podID="bb0fb709-5a66-42a8-aad4-c405502ce542" containerID="c273b5dfc7dc7e2158e274d68422ca3cfaeb85c6520c585738a6e88df138e8db" exitCode=0 Dec 06 05:48:32 crc kubenswrapper[4733]: I1206 05:48:32.905218 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-jpw8l" event={"ID":"bb0fb709-5a66-42a8-aad4-c405502ce542","Type":"ContainerDied","Data":"c273b5dfc7dc7e2158e274d68422ca3cfaeb85c6520c585738a6e88df138e8db"} Dec 06 05:48:32 crc kubenswrapper[4733]: I1206 05:48:32.905238 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-jpw8l" event={"ID":"bb0fb709-5a66-42a8-aad4-c405502ce542","Type":"ContainerDied","Data":"f2df924a5b34baec6c589c50a5142bbdd939dc160b2c1446d9c8885693fe6924"} Dec 06 05:48:32 crc kubenswrapper[4733]: I1206 05:48:32.905244 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-jpw8l" Dec 06 05:48:32 crc kubenswrapper[4733]: I1206 05:48:32.921627 4733 scope.go:117] "RemoveContainer" containerID="e294653a23e8d18b8ef6ae2c3e5f1be65b24de905209163449b9e78cc7b31e8b" Dec 06 05:48:32 crc kubenswrapper[4733]: E1206 05:48:32.922181 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e294653a23e8d18b8ef6ae2c3e5f1be65b24de905209163449b9e78cc7b31e8b\": container with ID starting with e294653a23e8d18b8ef6ae2c3e5f1be65b24de905209163449b9e78cc7b31e8b not found: ID does not exist" containerID="e294653a23e8d18b8ef6ae2c3e5f1be65b24de905209163449b9e78cc7b31e8b" Dec 06 05:48:32 crc kubenswrapper[4733]: I1206 05:48:32.922220 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e294653a23e8d18b8ef6ae2c3e5f1be65b24de905209163449b9e78cc7b31e8b"} err="failed to get container status \"e294653a23e8d18b8ef6ae2c3e5f1be65b24de905209163449b9e78cc7b31e8b\": rpc error: code = NotFound desc = could not find container \"e294653a23e8d18b8ef6ae2c3e5f1be65b24de905209163449b9e78cc7b31e8b\": container with ID starting with e294653a23e8d18b8ef6ae2c3e5f1be65b24de905209163449b9e78cc7b31e8b not found: ID does not exist" Dec 06 05:48:32 crc kubenswrapper[4733]: I1206 05:48:32.922247 4733 scope.go:117] "RemoveContainer" containerID="c273b5dfc7dc7e2158e274d68422ca3cfaeb85c6520c585738a6e88df138e8db" Dec 06 05:48:32 crc kubenswrapper[4733]: I1206 05:48:32.933529 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-fx652"] Dec 06 05:48:32 crc kubenswrapper[4733]: I1206 05:48:32.936217 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-fx652"] Dec 06 05:48:32 crc kubenswrapper[4733]: I1206 05:48:32.938079 4733 scope.go:117] 
"RemoveContainer" containerID="c273b5dfc7dc7e2158e274d68422ca3cfaeb85c6520c585738a6e88df138e8db" Dec 06 05:48:32 crc kubenswrapper[4733]: E1206 05:48:32.939177 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c273b5dfc7dc7e2158e274d68422ca3cfaeb85c6520c585738a6e88df138e8db\": container with ID starting with c273b5dfc7dc7e2158e274d68422ca3cfaeb85c6520c585738a6e88df138e8db not found: ID does not exist" containerID="c273b5dfc7dc7e2158e274d68422ca3cfaeb85c6520c585738a6e88df138e8db" Dec 06 05:48:32 crc kubenswrapper[4733]: I1206 05:48:32.939215 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c273b5dfc7dc7e2158e274d68422ca3cfaeb85c6520c585738a6e88df138e8db"} err="failed to get container status \"c273b5dfc7dc7e2158e274d68422ca3cfaeb85c6520c585738a6e88df138e8db\": rpc error: code = NotFound desc = could not find container \"c273b5dfc7dc7e2158e274d68422ca3cfaeb85c6520c585738a6e88df138e8db\": container with ID starting with c273b5dfc7dc7e2158e274d68422ca3cfaeb85c6520c585738a6e88df138e8db not found: ID does not exist" Dec 06 05:48:32 crc kubenswrapper[4733]: I1206 05:48:32.941514 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-jpw8l"] Dec 06 05:48:32 crc kubenswrapper[4733]: I1206 05:48:32.943919 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-jpw8l"] Dec 06 05:48:33 crc kubenswrapper[4733]: I1206 05:48:33.195571 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-64f4b76bd8-t7hs7"] Dec 06 05:48:33 crc kubenswrapper[4733]: E1206 05:48:33.195746 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="197ee618-405f-4f94-a618-da74488f0d23" containerName="route-controller-manager" Dec 06 05:48:33 crc kubenswrapper[4733]: I1206 05:48:33.195761 
4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="197ee618-405f-4f94-a618-da74488f0d23" containerName="route-controller-manager" Dec 06 05:48:33 crc kubenswrapper[4733]: E1206 05:48:33.195774 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb0fb709-5a66-42a8-aad4-c405502ce542" containerName="controller-manager" Dec 06 05:48:33 crc kubenswrapper[4733]: I1206 05:48:33.195780 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb0fb709-5a66-42a8-aad4-c405502ce542" containerName="controller-manager" Dec 06 05:48:33 crc kubenswrapper[4733]: E1206 05:48:33.195794 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 06 05:48:33 crc kubenswrapper[4733]: I1206 05:48:33.195802 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 06 05:48:33 crc kubenswrapper[4733]: E1206 05:48:33.195809 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2eccc5b-1f33-4372-83e4-23e30f607d68" containerName="installer" Dec 06 05:48:33 crc kubenswrapper[4733]: I1206 05:48:33.195815 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2eccc5b-1f33-4372-83e4-23e30f607d68" containerName="installer" Dec 06 05:48:33 crc kubenswrapper[4733]: I1206 05:48:33.195889 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="197ee618-405f-4f94-a618-da74488f0d23" containerName="route-controller-manager" Dec 06 05:48:33 crc kubenswrapper[4733]: I1206 05:48:33.195900 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2eccc5b-1f33-4372-83e4-23e30f607d68" containerName="installer" Dec 06 05:48:33 crc kubenswrapper[4733]: I1206 05:48:33.195910 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb0fb709-5a66-42a8-aad4-c405502ce542" containerName="controller-manager" Dec 06 05:48:33 crc kubenswrapper[4733]: I1206 05:48:33.195920 4733 
memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 06 05:48:33 crc kubenswrapper[4733]: I1206 05:48:33.196248 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-64f4b76bd8-t7hs7" Dec 06 05:48:33 crc kubenswrapper[4733]: I1206 05:48:33.197979 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 06 05:48:33 crc kubenswrapper[4733]: I1206 05:48:33.198146 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 06 05:48:33 crc kubenswrapper[4733]: I1206 05:48:33.198247 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 06 05:48:33 crc kubenswrapper[4733]: I1206 05:48:33.198611 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 06 05:48:33 crc kubenswrapper[4733]: I1206 05:48:33.198863 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 06 05:48:33 crc kubenswrapper[4733]: I1206 05:48:33.200220 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 06 05:48:33 crc kubenswrapper[4733]: I1206 05:48:33.203994 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 06 05:48:33 crc kubenswrapper[4733]: I1206 05:48:33.205383 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-64f4b76bd8-t7hs7"] Dec 06 05:48:33 crc kubenswrapper[4733]: I1206 05:48:33.334695 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/8cddd8b4-f90f-4659-aa19-ab58dc5bd910-config\") pod \"controller-manager-64f4b76bd8-t7hs7\" (UID: \"8cddd8b4-f90f-4659-aa19-ab58dc5bd910\") " pod="openshift-controller-manager/controller-manager-64f4b76bd8-t7hs7" Dec 06 05:48:33 crc kubenswrapper[4733]: I1206 05:48:33.334879 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8cddd8b4-f90f-4659-aa19-ab58dc5bd910-client-ca\") pod \"controller-manager-64f4b76bd8-t7hs7\" (UID: \"8cddd8b4-f90f-4659-aa19-ab58dc5bd910\") " pod="openshift-controller-manager/controller-manager-64f4b76bd8-t7hs7" Dec 06 05:48:33 crc kubenswrapper[4733]: I1206 05:48:33.334938 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8cddd8b4-f90f-4659-aa19-ab58dc5bd910-proxy-ca-bundles\") pod \"controller-manager-64f4b76bd8-t7hs7\" (UID: \"8cddd8b4-f90f-4659-aa19-ab58dc5bd910\") " pod="openshift-controller-manager/controller-manager-64f4b76bd8-t7hs7" Dec 06 05:48:33 crc kubenswrapper[4733]: I1206 05:48:33.334977 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqhx6\" (UniqueName: \"kubernetes.io/projected/8cddd8b4-f90f-4659-aa19-ab58dc5bd910-kube-api-access-kqhx6\") pod \"controller-manager-64f4b76bd8-t7hs7\" (UID: \"8cddd8b4-f90f-4659-aa19-ab58dc5bd910\") " pod="openshift-controller-manager/controller-manager-64f4b76bd8-t7hs7" Dec 06 05:48:33 crc kubenswrapper[4733]: I1206 05:48:33.335199 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cddd8b4-f90f-4659-aa19-ab58dc5bd910-serving-cert\") pod \"controller-manager-64f4b76bd8-t7hs7\" (UID: \"8cddd8b4-f90f-4659-aa19-ab58dc5bd910\") " 
pod="openshift-controller-manager/controller-manager-64f4b76bd8-t7hs7" Dec 06 05:48:33 crc kubenswrapper[4733]: I1206 05:48:33.436389 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8cddd8b4-f90f-4659-aa19-ab58dc5bd910-client-ca\") pod \"controller-manager-64f4b76bd8-t7hs7\" (UID: \"8cddd8b4-f90f-4659-aa19-ab58dc5bd910\") " pod="openshift-controller-manager/controller-manager-64f4b76bd8-t7hs7" Dec 06 05:48:33 crc kubenswrapper[4733]: I1206 05:48:33.436444 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8cddd8b4-f90f-4659-aa19-ab58dc5bd910-proxy-ca-bundles\") pod \"controller-manager-64f4b76bd8-t7hs7\" (UID: \"8cddd8b4-f90f-4659-aa19-ab58dc5bd910\") " pod="openshift-controller-manager/controller-manager-64f4b76bd8-t7hs7" Dec 06 05:48:33 crc kubenswrapper[4733]: I1206 05:48:33.436471 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqhx6\" (UniqueName: \"kubernetes.io/projected/8cddd8b4-f90f-4659-aa19-ab58dc5bd910-kube-api-access-kqhx6\") pod \"controller-manager-64f4b76bd8-t7hs7\" (UID: \"8cddd8b4-f90f-4659-aa19-ab58dc5bd910\") " pod="openshift-controller-manager/controller-manager-64f4b76bd8-t7hs7" Dec 06 05:48:33 crc kubenswrapper[4733]: I1206 05:48:33.436506 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cddd8b4-f90f-4659-aa19-ab58dc5bd910-serving-cert\") pod \"controller-manager-64f4b76bd8-t7hs7\" (UID: \"8cddd8b4-f90f-4659-aa19-ab58dc5bd910\") " pod="openshift-controller-manager/controller-manager-64f4b76bd8-t7hs7" Dec 06 05:48:33 crc kubenswrapper[4733]: I1206 05:48:33.436540 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8cddd8b4-f90f-4659-aa19-ab58dc5bd910-config\") pod \"controller-manager-64f4b76bd8-t7hs7\" (UID: \"8cddd8b4-f90f-4659-aa19-ab58dc5bd910\") " pod="openshift-controller-manager/controller-manager-64f4b76bd8-t7hs7" Dec 06 05:48:33 crc kubenswrapper[4733]: I1206 05:48:33.437471 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8cddd8b4-f90f-4659-aa19-ab58dc5bd910-client-ca\") pod \"controller-manager-64f4b76bd8-t7hs7\" (UID: \"8cddd8b4-f90f-4659-aa19-ab58dc5bd910\") " pod="openshift-controller-manager/controller-manager-64f4b76bd8-t7hs7" Dec 06 05:48:33 crc kubenswrapper[4733]: I1206 05:48:33.438001 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8cddd8b4-f90f-4659-aa19-ab58dc5bd910-proxy-ca-bundles\") pod \"controller-manager-64f4b76bd8-t7hs7\" (UID: \"8cddd8b4-f90f-4659-aa19-ab58dc5bd910\") " pod="openshift-controller-manager/controller-manager-64f4b76bd8-t7hs7" Dec 06 05:48:33 crc kubenswrapper[4733]: I1206 05:48:33.438054 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cddd8b4-f90f-4659-aa19-ab58dc5bd910-config\") pod \"controller-manager-64f4b76bd8-t7hs7\" (UID: \"8cddd8b4-f90f-4659-aa19-ab58dc5bd910\") " pod="openshift-controller-manager/controller-manager-64f4b76bd8-t7hs7" Dec 06 05:48:33 crc kubenswrapper[4733]: I1206 05:48:33.441047 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cddd8b4-f90f-4659-aa19-ab58dc5bd910-serving-cert\") pod \"controller-manager-64f4b76bd8-t7hs7\" (UID: \"8cddd8b4-f90f-4659-aa19-ab58dc5bd910\") " pod="openshift-controller-manager/controller-manager-64f4b76bd8-t7hs7" Dec 06 05:48:33 crc kubenswrapper[4733]: I1206 05:48:33.451406 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-kqhx6\" (UniqueName: \"kubernetes.io/projected/8cddd8b4-f90f-4659-aa19-ab58dc5bd910-kube-api-access-kqhx6\") pod \"controller-manager-64f4b76bd8-t7hs7\" (UID: \"8cddd8b4-f90f-4659-aa19-ab58dc5bd910\") " pod="openshift-controller-manager/controller-manager-64f4b76bd8-t7hs7" Dec 06 05:48:33 crc kubenswrapper[4733]: I1206 05:48:33.507638 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-64f4b76bd8-t7hs7" Dec 06 05:48:33 crc kubenswrapper[4733]: I1206 05:48:33.870323 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-64f4b76bd8-t7hs7"] Dec 06 05:48:33 crc kubenswrapper[4733]: I1206 05:48:33.912331 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64f4b76bd8-t7hs7" event={"ID":"8cddd8b4-f90f-4659-aa19-ab58dc5bd910","Type":"ContainerStarted","Data":"bb2270a6889cad16541679a1ff08573c9d07d9f8358dd2c4ddb37ecfcc934802"} Dec 06 05:48:34 crc kubenswrapper[4733]: I1206 05:48:34.192705 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c59b74f88-w7kb8"] Dec 06 05:48:34 crc kubenswrapper[4733]: I1206 05:48:34.193462 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c59b74f88-w7kb8" Dec 06 05:48:34 crc kubenswrapper[4733]: I1206 05:48:34.195580 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 06 05:48:34 crc kubenswrapper[4733]: I1206 05:48:34.196030 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 06 05:48:34 crc kubenswrapper[4733]: I1206 05:48:34.196997 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 06 05:48:34 crc kubenswrapper[4733]: I1206 05:48:34.197034 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 06 05:48:34 crc kubenswrapper[4733]: I1206 05:48:34.197827 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 06 05:48:34 crc kubenswrapper[4733]: I1206 05:48:34.198385 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 06 05:48:34 crc kubenswrapper[4733]: I1206 05:48:34.203017 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c59b74f88-w7kb8"] Dec 06 05:48:34 crc kubenswrapper[4733]: I1206 05:48:34.349835 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xdh4\" (UniqueName: \"kubernetes.io/projected/bd3eb9f8-3172-460c-b45d-972701423497-kube-api-access-8xdh4\") pod \"route-controller-manager-c59b74f88-w7kb8\" (UID: \"bd3eb9f8-3172-460c-b45d-972701423497\") " pod="openshift-route-controller-manager/route-controller-manager-c59b74f88-w7kb8" Dec 06 05:48:34 crc kubenswrapper[4733]: I1206 05:48:34.349923 4733 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd3eb9f8-3172-460c-b45d-972701423497-config\") pod \"route-controller-manager-c59b74f88-w7kb8\" (UID: \"bd3eb9f8-3172-460c-b45d-972701423497\") " pod="openshift-route-controller-manager/route-controller-manager-c59b74f88-w7kb8" Dec 06 05:48:34 crc kubenswrapper[4733]: I1206 05:48:34.350066 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd3eb9f8-3172-460c-b45d-972701423497-serving-cert\") pod \"route-controller-manager-c59b74f88-w7kb8\" (UID: \"bd3eb9f8-3172-460c-b45d-972701423497\") " pod="openshift-route-controller-manager/route-controller-manager-c59b74f88-w7kb8" Dec 06 05:48:34 crc kubenswrapper[4733]: I1206 05:48:34.350174 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bd3eb9f8-3172-460c-b45d-972701423497-client-ca\") pod \"route-controller-manager-c59b74f88-w7kb8\" (UID: \"bd3eb9f8-3172-460c-b45d-972701423497\") " pod="openshift-route-controller-manager/route-controller-manager-c59b74f88-w7kb8" Dec 06 05:48:34 crc kubenswrapper[4733]: I1206 05:48:34.451107 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bd3eb9f8-3172-460c-b45d-972701423497-client-ca\") pod \"route-controller-manager-c59b74f88-w7kb8\" (UID: \"bd3eb9f8-3172-460c-b45d-972701423497\") " pod="openshift-route-controller-manager/route-controller-manager-c59b74f88-w7kb8" Dec 06 05:48:34 crc kubenswrapper[4733]: I1206 05:48:34.451171 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xdh4\" (UniqueName: \"kubernetes.io/projected/bd3eb9f8-3172-460c-b45d-972701423497-kube-api-access-8xdh4\") pod 
\"route-controller-manager-c59b74f88-w7kb8\" (UID: \"bd3eb9f8-3172-460c-b45d-972701423497\") " pod="openshift-route-controller-manager/route-controller-manager-c59b74f88-w7kb8" Dec 06 05:48:34 crc kubenswrapper[4733]: I1206 05:48:34.451232 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd3eb9f8-3172-460c-b45d-972701423497-config\") pod \"route-controller-manager-c59b74f88-w7kb8\" (UID: \"bd3eb9f8-3172-460c-b45d-972701423497\") " pod="openshift-route-controller-manager/route-controller-manager-c59b74f88-w7kb8" Dec 06 05:48:34 crc kubenswrapper[4733]: I1206 05:48:34.451256 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd3eb9f8-3172-460c-b45d-972701423497-serving-cert\") pod \"route-controller-manager-c59b74f88-w7kb8\" (UID: \"bd3eb9f8-3172-460c-b45d-972701423497\") " pod="openshift-route-controller-manager/route-controller-manager-c59b74f88-w7kb8" Dec 06 05:48:34 crc kubenswrapper[4733]: I1206 05:48:34.452538 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bd3eb9f8-3172-460c-b45d-972701423497-client-ca\") pod \"route-controller-manager-c59b74f88-w7kb8\" (UID: \"bd3eb9f8-3172-460c-b45d-972701423497\") " pod="openshift-route-controller-manager/route-controller-manager-c59b74f88-w7kb8" Dec 06 05:48:34 crc kubenswrapper[4733]: I1206 05:48:34.452565 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd3eb9f8-3172-460c-b45d-972701423497-config\") pod \"route-controller-manager-c59b74f88-w7kb8\" (UID: \"bd3eb9f8-3172-460c-b45d-972701423497\") " pod="openshift-route-controller-manager/route-controller-manager-c59b74f88-w7kb8" Dec 06 05:48:34 crc kubenswrapper[4733]: I1206 05:48:34.459720 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd3eb9f8-3172-460c-b45d-972701423497-serving-cert\") pod \"route-controller-manager-c59b74f88-w7kb8\" (UID: \"bd3eb9f8-3172-460c-b45d-972701423497\") " pod="openshift-route-controller-manager/route-controller-manager-c59b74f88-w7kb8" Dec 06 05:48:34 crc kubenswrapper[4733]: I1206 05:48:34.471612 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xdh4\" (UniqueName: \"kubernetes.io/projected/bd3eb9f8-3172-460c-b45d-972701423497-kube-api-access-8xdh4\") pod \"route-controller-manager-c59b74f88-w7kb8\" (UID: \"bd3eb9f8-3172-460c-b45d-972701423497\") " pod="openshift-route-controller-manager/route-controller-manager-c59b74f88-w7kb8" Dec 06 05:48:34 crc kubenswrapper[4733]: I1206 05:48:34.491241 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="197ee618-405f-4f94-a618-da74488f0d23" path="/var/lib/kubelet/pods/197ee618-405f-4f94-a618-da74488f0d23/volumes" Dec 06 05:48:34 crc kubenswrapper[4733]: I1206 05:48:34.492338 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb0fb709-5a66-42a8-aad4-c405502ce542" path="/var/lib/kubelet/pods/bb0fb709-5a66-42a8-aad4-c405502ce542/volumes" Dec 06 05:48:34 crc kubenswrapper[4733]: I1206 05:48:34.506825 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c59b74f88-w7kb8" Dec 06 05:48:34 crc kubenswrapper[4733]: I1206 05:48:34.668407 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c59b74f88-w7kb8"] Dec 06 05:48:34 crc kubenswrapper[4733]: W1206 05:48:34.674622 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd3eb9f8_3172_460c_b45d_972701423497.slice/crio-4b1a4ba86e6e347fcad0dd4403fd0d8a70672180419ab8898a4a292dac241eef WatchSource:0}: Error finding container 4b1a4ba86e6e347fcad0dd4403fd0d8a70672180419ab8898a4a292dac241eef: Status 404 returned error can't find the container with id 4b1a4ba86e6e347fcad0dd4403fd0d8a70672180419ab8898a4a292dac241eef Dec 06 05:48:34 crc kubenswrapper[4733]: I1206 05:48:34.922703 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c59b74f88-w7kb8" event={"ID":"bd3eb9f8-3172-460c-b45d-972701423497","Type":"ContainerStarted","Data":"808fa839924bfe6d83f97ad032ef941d5558a1095a5958cd25b4a118f8ee1cbf"} Dec 06 05:48:34 crc kubenswrapper[4733]: I1206 05:48:34.922756 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c59b74f88-w7kb8" event={"ID":"bd3eb9f8-3172-460c-b45d-972701423497","Type":"ContainerStarted","Data":"4b1a4ba86e6e347fcad0dd4403fd0d8a70672180419ab8898a4a292dac241eef"} Dec 06 05:48:34 crc kubenswrapper[4733]: I1206 05:48:34.923023 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-c59b74f88-w7kb8" Dec 06 05:48:34 crc kubenswrapper[4733]: I1206 05:48:34.925363 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64f4b76bd8-t7hs7" 
event={"ID":"8cddd8b4-f90f-4659-aa19-ab58dc5bd910","Type":"ContainerStarted","Data":"b7ccf458fdc5fbd44aaf52e791ab36f443683589f9c224d89bb40368f8fcc832"} Dec 06 05:48:34 crc kubenswrapper[4733]: I1206 05:48:34.925589 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-64f4b76bd8-t7hs7" Dec 06 05:48:34 crc kubenswrapper[4733]: I1206 05:48:34.930350 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-64f4b76bd8-t7hs7" Dec 06 05:48:34 crc kubenswrapper[4733]: I1206 05:48:34.943678 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-c59b74f88-w7kb8" podStartSLOduration=2.943629241 podStartE2EDuration="2.943629241s" podCreationTimestamp="2025-12-06 05:48:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:48:34.941292533 +0000 UTC m=+298.806503644" watchObservedRunningTime="2025-12-06 05:48:34.943629241 +0000 UTC m=+298.808840352" Dec 06 05:48:34 crc kubenswrapper[4733]: I1206 05:48:34.958418 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-64f4b76bd8-t7hs7" podStartSLOduration=2.958403282 podStartE2EDuration="2.958403282s" podCreationTimestamp="2025-12-06 05:48:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:48:34.955470092 +0000 UTC m=+298.820681203" watchObservedRunningTime="2025-12-06 05:48:34.958403282 +0000 UTC m=+298.823614383" Dec 06 05:48:35 crc kubenswrapper[4733]: I1206 05:48:35.088500 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-64f4b76bd8-t7hs7"] Dec 06 05:48:35 crc kubenswrapper[4733]: I1206 
05:48:35.104646 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c59b74f88-w7kb8"] Dec 06 05:48:35 crc kubenswrapper[4733]: I1206 05:48:35.312629 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-c59b74f88-w7kb8" Dec 06 05:48:36 crc kubenswrapper[4733]: I1206 05:48:36.941114 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-c59b74f88-w7kb8" podUID="bd3eb9f8-3172-460c-b45d-972701423497" containerName="route-controller-manager" containerID="cri-o://808fa839924bfe6d83f97ad032ef941d5558a1095a5958cd25b4a118f8ee1cbf" gracePeriod=30 Dec 06 05:48:36 crc kubenswrapper[4733]: I1206 05:48:36.942634 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-64f4b76bd8-t7hs7" podUID="8cddd8b4-f90f-4659-aa19-ab58dc5bd910" containerName="controller-manager" containerID="cri-o://b7ccf458fdc5fbd44aaf52e791ab36f443683589f9c224d89bb40368f8fcc832" gracePeriod=30 Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.322639 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c59b74f88-w7kb8" Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.326754 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-64f4b76bd8-t7hs7" Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.348031 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-657648d9fc-m2wkj"] Dec 06 05:48:37 crc kubenswrapper[4733]: E1206 05:48:37.348442 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cddd8b4-f90f-4659-aa19-ab58dc5bd910" containerName="controller-manager" Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.348460 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cddd8b4-f90f-4659-aa19-ab58dc5bd910" containerName="controller-manager" Dec 06 05:48:37 crc kubenswrapper[4733]: E1206 05:48:37.348470 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd3eb9f8-3172-460c-b45d-972701423497" containerName="route-controller-manager" Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.348480 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd3eb9f8-3172-460c-b45d-972701423497" containerName="route-controller-manager" Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.348605 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cddd8b4-f90f-4659-aa19-ab58dc5bd910" containerName="controller-manager" Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.348622 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd3eb9f8-3172-460c-b45d-972701423497" containerName="route-controller-manager" Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.349251 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-657648d9fc-m2wkj" Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.369816 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-657648d9fc-m2wkj"] Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.493851 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cddd8b4-f90f-4659-aa19-ab58dc5bd910-serving-cert\") pod \"8cddd8b4-f90f-4659-aa19-ab58dc5bd910\" (UID: \"8cddd8b4-f90f-4659-aa19-ab58dc5bd910\") " Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.493931 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xdh4\" (UniqueName: \"kubernetes.io/projected/bd3eb9f8-3172-460c-b45d-972701423497-kube-api-access-8xdh4\") pod \"bd3eb9f8-3172-460c-b45d-972701423497\" (UID: \"bd3eb9f8-3172-460c-b45d-972701423497\") " Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.493983 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqhx6\" (UniqueName: \"kubernetes.io/projected/8cddd8b4-f90f-4659-aa19-ab58dc5bd910-kube-api-access-kqhx6\") pod \"8cddd8b4-f90f-4659-aa19-ab58dc5bd910\" (UID: \"8cddd8b4-f90f-4659-aa19-ab58dc5bd910\") " Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.494023 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cddd8b4-f90f-4659-aa19-ab58dc5bd910-config\") pod \"8cddd8b4-f90f-4659-aa19-ab58dc5bd910\" (UID: \"8cddd8b4-f90f-4659-aa19-ab58dc5bd910\") " Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.494153 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8cddd8b4-f90f-4659-aa19-ab58dc5bd910-client-ca\") pod 
\"8cddd8b4-f90f-4659-aa19-ab58dc5bd910\" (UID: \"8cddd8b4-f90f-4659-aa19-ab58dc5bd910\") " Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.494170 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd3eb9f8-3172-460c-b45d-972701423497-serving-cert\") pod \"bd3eb9f8-3172-460c-b45d-972701423497\" (UID: \"bd3eb9f8-3172-460c-b45d-972701423497\") " Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.494196 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bd3eb9f8-3172-460c-b45d-972701423497-client-ca\") pod \"bd3eb9f8-3172-460c-b45d-972701423497\" (UID: \"bd3eb9f8-3172-460c-b45d-972701423497\") " Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.494222 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd3eb9f8-3172-460c-b45d-972701423497-config\") pod \"bd3eb9f8-3172-460c-b45d-972701423497\" (UID: \"bd3eb9f8-3172-460c-b45d-972701423497\") " Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.494258 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8cddd8b4-f90f-4659-aa19-ab58dc5bd910-proxy-ca-bundles\") pod \"8cddd8b4-f90f-4659-aa19-ab58dc5bd910\" (UID: \"8cddd8b4-f90f-4659-aa19-ab58dc5bd910\") " Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.494513 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/76016c66-7378-49d9-a32d-91ec0bb616d3-client-ca\") pod \"route-controller-manager-657648d9fc-m2wkj\" (UID: \"76016c66-7378-49d9-a32d-91ec0bb616d3\") " pod="openshift-route-controller-manager/route-controller-manager-657648d9fc-m2wkj" Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.494586 4733 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76016c66-7378-49d9-a32d-91ec0bb616d3-config\") pod \"route-controller-manager-657648d9fc-m2wkj\" (UID: \"76016c66-7378-49d9-a32d-91ec0bb616d3\") " pod="openshift-route-controller-manager/route-controller-manager-657648d9fc-m2wkj" Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.494606 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4cgk\" (UniqueName: \"kubernetes.io/projected/76016c66-7378-49d9-a32d-91ec0bb616d3-kube-api-access-w4cgk\") pod \"route-controller-manager-657648d9fc-m2wkj\" (UID: \"76016c66-7378-49d9-a32d-91ec0bb616d3\") " pod="openshift-route-controller-manager/route-controller-manager-657648d9fc-m2wkj" Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.494635 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76016c66-7378-49d9-a32d-91ec0bb616d3-serving-cert\") pod \"route-controller-manager-657648d9fc-m2wkj\" (UID: \"76016c66-7378-49d9-a32d-91ec0bb616d3\") " pod="openshift-route-controller-manager/route-controller-manager-657648d9fc-m2wkj" Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.495085 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cddd8b4-f90f-4659-aa19-ab58dc5bd910-client-ca" (OuterVolumeSpecName: "client-ca") pod "8cddd8b4-f90f-4659-aa19-ab58dc5bd910" (UID: "8cddd8b4-f90f-4659-aa19-ab58dc5bd910"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.495192 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cddd8b4-f90f-4659-aa19-ab58dc5bd910-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "8cddd8b4-f90f-4659-aa19-ab58dc5bd910" (UID: "8cddd8b4-f90f-4659-aa19-ab58dc5bd910"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.495422 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cddd8b4-f90f-4659-aa19-ab58dc5bd910-config" (OuterVolumeSpecName: "config") pod "8cddd8b4-f90f-4659-aa19-ab58dc5bd910" (UID: "8cddd8b4-f90f-4659-aa19-ab58dc5bd910"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.495401 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd3eb9f8-3172-460c-b45d-972701423497-client-ca" (OuterVolumeSpecName: "client-ca") pod "bd3eb9f8-3172-460c-b45d-972701423497" (UID: "bd3eb9f8-3172-460c-b45d-972701423497"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.495497 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd3eb9f8-3172-460c-b45d-972701423497-config" (OuterVolumeSpecName: "config") pod "bd3eb9f8-3172-460c-b45d-972701423497" (UID: "bd3eb9f8-3172-460c-b45d-972701423497"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.500252 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cddd8b4-f90f-4659-aa19-ab58dc5bd910-kube-api-access-kqhx6" (OuterVolumeSpecName: "kube-api-access-kqhx6") pod "8cddd8b4-f90f-4659-aa19-ab58dc5bd910" (UID: "8cddd8b4-f90f-4659-aa19-ab58dc5bd910"). InnerVolumeSpecName "kube-api-access-kqhx6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.500550 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd3eb9f8-3172-460c-b45d-972701423497-kube-api-access-8xdh4" (OuterVolumeSpecName: "kube-api-access-8xdh4") pod "bd3eb9f8-3172-460c-b45d-972701423497" (UID: "bd3eb9f8-3172-460c-b45d-972701423497"). InnerVolumeSpecName "kube-api-access-8xdh4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.500952 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd3eb9f8-3172-460c-b45d-972701423497-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bd3eb9f8-3172-460c-b45d-972701423497" (UID: "bd3eb9f8-3172-460c-b45d-972701423497"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.501056 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cddd8b4-f90f-4659-aa19-ab58dc5bd910-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cddd8b4-f90f-4659-aa19-ab58dc5bd910" (UID: "8cddd8b4-f90f-4659-aa19-ab58dc5bd910"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.595799 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76016c66-7378-49d9-a32d-91ec0bb616d3-config\") pod \"route-controller-manager-657648d9fc-m2wkj\" (UID: \"76016c66-7378-49d9-a32d-91ec0bb616d3\") " pod="openshift-route-controller-manager/route-controller-manager-657648d9fc-m2wkj" Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.595983 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4cgk\" (UniqueName: \"kubernetes.io/projected/76016c66-7378-49d9-a32d-91ec0bb616d3-kube-api-access-w4cgk\") pod \"route-controller-manager-657648d9fc-m2wkj\" (UID: \"76016c66-7378-49d9-a32d-91ec0bb616d3\") " pod="openshift-route-controller-manager/route-controller-manager-657648d9fc-m2wkj" Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.596117 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76016c66-7378-49d9-a32d-91ec0bb616d3-serving-cert\") pod \"route-controller-manager-657648d9fc-m2wkj\" (UID: \"76016c66-7378-49d9-a32d-91ec0bb616d3\") " pod="openshift-route-controller-manager/route-controller-manager-657648d9fc-m2wkj" Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.596777 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/76016c66-7378-49d9-a32d-91ec0bb616d3-client-ca\") pod \"route-controller-manager-657648d9fc-m2wkj\" (UID: \"76016c66-7378-49d9-a32d-91ec0bb616d3\") " pod="openshift-route-controller-manager/route-controller-manager-657648d9fc-m2wkj" Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.596796 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/76016c66-7378-49d9-a32d-91ec0bb616d3-config\") pod \"route-controller-manager-657648d9fc-m2wkj\" (UID: \"76016c66-7378-49d9-a32d-91ec0bb616d3\") " pod="openshift-route-controller-manager/route-controller-manager-657648d9fc-m2wkj" Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.597029 4733 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd3eb9f8-3172-460c-b45d-972701423497-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.597115 4733 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8cddd8b4-f90f-4659-aa19-ab58dc5bd910-client-ca\") on node \"crc\" DevicePath \"\"" Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.597173 4733 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bd3eb9f8-3172-460c-b45d-972701423497-client-ca\") on node \"crc\" DevicePath \"\"" Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.597223 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd3eb9f8-3172-460c-b45d-972701423497-config\") on node \"crc\" DevicePath \"\"" Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.597286 4733 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8cddd8b4-f90f-4659-aa19-ab58dc5bd910-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.597412 4733 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cddd8b4-f90f-4659-aa19-ab58dc5bd910-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.597483 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xdh4\" (UniqueName: 
\"kubernetes.io/projected/bd3eb9f8-3172-460c-b45d-972701423497-kube-api-access-8xdh4\") on node \"crc\" DevicePath \"\"" Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.597543 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqhx6\" (UniqueName: \"kubernetes.io/projected/8cddd8b4-f90f-4659-aa19-ab58dc5bd910-kube-api-access-kqhx6\") on node \"crc\" DevicePath \"\"" Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.597600 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cddd8b4-f90f-4659-aa19-ab58dc5bd910-config\") on node \"crc\" DevicePath \"\"" Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.597373 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/76016c66-7378-49d9-a32d-91ec0bb616d3-client-ca\") pod \"route-controller-manager-657648d9fc-m2wkj\" (UID: \"76016c66-7378-49d9-a32d-91ec0bb616d3\") " pod="openshift-route-controller-manager/route-controller-manager-657648d9fc-m2wkj" Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.600117 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76016c66-7378-49d9-a32d-91ec0bb616d3-serving-cert\") pod \"route-controller-manager-657648d9fc-m2wkj\" (UID: \"76016c66-7378-49d9-a32d-91ec0bb616d3\") " pod="openshift-route-controller-manager/route-controller-manager-657648d9fc-m2wkj" Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.611321 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4cgk\" (UniqueName: \"kubernetes.io/projected/76016c66-7378-49d9-a32d-91ec0bb616d3-kube-api-access-w4cgk\") pod \"route-controller-manager-657648d9fc-m2wkj\" (UID: \"76016c66-7378-49d9-a32d-91ec0bb616d3\") " pod="openshift-route-controller-manager/route-controller-manager-657648d9fc-m2wkj" Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 
05:48:37.664866 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-657648d9fc-m2wkj" Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.826413 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-657648d9fc-m2wkj"] Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.946554 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-657648d9fc-m2wkj" event={"ID":"76016c66-7378-49d9-a32d-91ec0bb616d3","Type":"ContainerStarted","Data":"df3e13e4bde5cf56251e79b01f0ab8ec9c1807d01a96f2b24069d42b77f2d273"} Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.946596 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-657648d9fc-m2wkj" event={"ID":"76016c66-7378-49d9-a32d-91ec0bb616d3","Type":"ContainerStarted","Data":"6fd3934d205b1c2d995eed855ddf2d47aaba025af589a8a0fc14f589ce9f182f"} Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.946816 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-657648d9fc-m2wkj" Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.947850 4733 patch_prober.go:28] interesting pod/route-controller-manager-657648d9fc-m2wkj container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.61:8443/healthz\": dial tcp 10.217.0.61:8443: connect: connection refused" start-of-body= Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.948085 4733 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-657648d9fc-m2wkj" podUID="76016c66-7378-49d9-a32d-91ec0bb616d3" containerName="route-controller-manager" probeResult="failure" 
output="Get \"https://10.217.0.61:8443/healthz\": dial tcp 10.217.0.61:8443: connect: connection refused" Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.947955 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64f4b76bd8-t7hs7" event={"ID":"8cddd8b4-f90f-4659-aa19-ab58dc5bd910","Type":"ContainerDied","Data":"b7ccf458fdc5fbd44aaf52e791ab36f443683589f9c224d89bb40368f8fcc832"} Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.947932 4733 generic.go:334] "Generic (PLEG): container finished" podID="8cddd8b4-f90f-4659-aa19-ab58dc5bd910" containerID="b7ccf458fdc5fbd44aaf52e791ab36f443683589f9c224d89bb40368f8fcc832" exitCode=0 Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.948153 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64f4b76bd8-t7hs7" event={"ID":"8cddd8b4-f90f-4659-aa19-ab58dc5bd910","Type":"ContainerDied","Data":"bb2270a6889cad16541679a1ff08573c9d07d9f8358dd2c4ddb37ecfcc934802"} Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.948002 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-64f4b76bd8-t7hs7" Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.948179 4733 scope.go:117] "RemoveContainer" containerID="b7ccf458fdc5fbd44aaf52e791ab36f443683589f9c224d89bb40368f8fcc832" Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.951802 4733 generic.go:334] "Generic (PLEG): container finished" podID="bd3eb9f8-3172-460c-b45d-972701423497" containerID="808fa839924bfe6d83f97ad032ef941d5558a1095a5958cd25b4a118f8ee1cbf" exitCode=0 Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.951841 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c59b74f88-w7kb8" event={"ID":"bd3eb9f8-3172-460c-b45d-972701423497","Type":"ContainerDied","Data":"808fa839924bfe6d83f97ad032ef941d5558a1095a5958cd25b4a118f8ee1cbf"} Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.951848 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c59b74f88-w7kb8" Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.951864 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c59b74f88-w7kb8" event={"ID":"bd3eb9f8-3172-460c-b45d-972701423497","Type":"ContainerDied","Data":"4b1a4ba86e6e347fcad0dd4403fd0d8a70672180419ab8898a4a292dac241eef"} Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.960877 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-657648d9fc-m2wkj" podStartSLOduration=2.96085415 podStartE2EDuration="2.96085415s" podCreationTimestamp="2025-12-06 05:48:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:48:37.958094698 +0000 UTC m=+301.823305808" 
watchObservedRunningTime="2025-12-06 05:48:37.96085415 +0000 UTC m=+301.826065261" Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.962966 4733 scope.go:117] "RemoveContainer" containerID="b7ccf458fdc5fbd44aaf52e791ab36f443683589f9c224d89bb40368f8fcc832" Dec 06 05:48:37 crc kubenswrapper[4733]: E1206 05:48:37.963580 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7ccf458fdc5fbd44aaf52e791ab36f443683589f9c224d89bb40368f8fcc832\": container with ID starting with b7ccf458fdc5fbd44aaf52e791ab36f443683589f9c224d89bb40368f8fcc832 not found: ID does not exist" containerID="b7ccf458fdc5fbd44aaf52e791ab36f443683589f9c224d89bb40368f8fcc832" Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.963610 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7ccf458fdc5fbd44aaf52e791ab36f443683589f9c224d89bb40368f8fcc832"} err="failed to get container status \"b7ccf458fdc5fbd44aaf52e791ab36f443683589f9c224d89bb40368f8fcc832\": rpc error: code = NotFound desc = could not find container \"b7ccf458fdc5fbd44aaf52e791ab36f443683589f9c224d89bb40368f8fcc832\": container with ID starting with b7ccf458fdc5fbd44aaf52e791ab36f443683589f9c224d89bb40368f8fcc832 not found: ID does not exist" Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.963635 4733 scope.go:117] "RemoveContainer" containerID="808fa839924bfe6d83f97ad032ef941d5558a1095a5958cd25b4a118f8ee1cbf" Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.977457 4733 scope.go:117] "RemoveContainer" containerID="808fa839924bfe6d83f97ad032ef941d5558a1095a5958cd25b4a118f8ee1cbf" Dec 06 05:48:37 crc kubenswrapper[4733]: E1206 05:48:37.977739 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"808fa839924bfe6d83f97ad032ef941d5558a1095a5958cd25b4a118f8ee1cbf\": container with ID starting with 
808fa839924bfe6d83f97ad032ef941d5558a1095a5958cd25b4a118f8ee1cbf not found: ID does not exist" containerID="808fa839924bfe6d83f97ad032ef941d5558a1095a5958cd25b4a118f8ee1cbf" Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.977766 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-64f4b76bd8-t7hs7"] Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.977766 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"808fa839924bfe6d83f97ad032ef941d5558a1095a5958cd25b4a118f8ee1cbf"} err="failed to get container status \"808fa839924bfe6d83f97ad032ef941d5558a1095a5958cd25b4a118f8ee1cbf\": rpc error: code = NotFound desc = could not find container \"808fa839924bfe6d83f97ad032ef941d5558a1095a5958cd25b4a118f8ee1cbf\": container with ID starting with 808fa839924bfe6d83f97ad032ef941d5558a1095a5958cd25b4a118f8ee1cbf not found: ID does not exist" Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.981439 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-64f4b76bd8-t7hs7"] Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.984582 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c59b74f88-w7kb8"] Dec 06 05:48:37 crc kubenswrapper[4733]: I1206 05:48:37.986694 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c59b74f88-w7kb8"] Dec 06 05:48:38 crc kubenswrapper[4733]: I1206 05:48:38.491600 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cddd8b4-f90f-4659-aa19-ab58dc5bd910" path="/var/lib/kubelet/pods/8cddd8b4-f90f-4659-aa19-ab58dc5bd910/volumes" Dec 06 05:48:38 crc kubenswrapper[4733]: I1206 05:48:38.492096 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd3eb9f8-3172-460c-b45d-972701423497" 
path="/var/lib/kubelet/pods/bd3eb9f8-3172-460c-b45d-972701423497/volumes" Dec 06 05:48:38 crc kubenswrapper[4733]: I1206 05:48:38.965510 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-657648d9fc-m2wkj" Dec 06 05:48:39 crc kubenswrapper[4733]: I1206 05:48:39.833553 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 06 05:48:40 crc kubenswrapper[4733]: I1206 05:48:40.202801 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5f784d6689-h694b"] Dec 06 05:48:40 crc kubenswrapper[4733]: I1206 05:48:40.203994 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5f784d6689-h694b" Dec 06 05:48:40 crc kubenswrapper[4733]: I1206 05:48:40.207471 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 06 05:48:40 crc kubenswrapper[4733]: I1206 05:48:40.207538 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 06 05:48:40 crc kubenswrapper[4733]: I1206 05:48:40.207630 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 06 05:48:40 crc kubenswrapper[4733]: I1206 05:48:40.207635 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 06 05:48:40 crc kubenswrapper[4733]: I1206 05:48:40.207638 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 06 05:48:40 crc kubenswrapper[4733]: I1206 05:48:40.210884 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 06 05:48:40 crc 
kubenswrapper[4733]: I1206 05:48:40.215693 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5f784d6689-h694b"] Dec 06 05:48:40 crc kubenswrapper[4733]: I1206 05:48:40.215808 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 06 05:48:40 crc kubenswrapper[4733]: I1206 05:48:40.333926 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/784bb05e-7593-4b25-88da-7131c4f89466-proxy-ca-bundles\") pod \"controller-manager-5f784d6689-h694b\" (UID: \"784bb05e-7593-4b25-88da-7131c4f89466\") " pod="openshift-controller-manager/controller-manager-5f784d6689-h694b" Dec 06 05:48:40 crc kubenswrapper[4733]: I1206 05:48:40.334008 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/784bb05e-7593-4b25-88da-7131c4f89466-client-ca\") pod \"controller-manager-5f784d6689-h694b\" (UID: \"784bb05e-7593-4b25-88da-7131c4f89466\") " pod="openshift-controller-manager/controller-manager-5f784d6689-h694b" Dec 06 05:48:40 crc kubenswrapper[4733]: I1206 05:48:40.334230 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/784bb05e-7593-4b25-88da-7131c4f89466-serving-cert\") pod \"controller-manager-5f784d6689-h694b\" (UID: \"784bb05e-7593-4b25-88da-7131c4f89466\") " pod="openshift-controller-manager/controller-manager-5f784d6689-h694b" Dec 06 05:48:40 crc kubenswrapper[4733]: I1206 05:48:40.334274 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bc5c\" (UniqueName: \"kubernetes.io/projected/784bb05e-7593-4b25-88da-7131c4f89466-kube-api-access-6bc5c\") pod \"controller-manager-5f784d6689-h694b\" 
(UID: \"784bb05e-7593-4b25-88da-7131c4f89466\") " pod="openshift-controller-manager/controller-manager-5f784d6689-h694b" Dec 06 05:48:40 crc kubenswrapper[4733]: I1206 05:48:40.334419 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/784bb05e-7593-4b25-88da-7131c4f89466-config\") pod \"controller-manager-5f784d6689-h694b\" (UID: \"784bb05e-7593-4b25-88da-7131c4f89466\") " pod="openshift-controller-manager/controller-manager-5f784d6689-h694b" Dec 06 05:48:40 crc kubenswrapper[4733]: I1206 05:48:40.435725 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/784bb05e-7593-4b25-88da-7131c4f89466-config\") pod \"controller-manager-5f784d6689-h694b\" (UID: \"784bb05e-7593-4b25-88da-7131c4f89466\") " pod="openshift-controller-manager/controller-manager-5f784d6689-h694b" Dec 06 05:48:40 crc kubenswrapper[4733]: I1206 05:48:40.435833 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/784bb05e-7593-4b25-88da-7131c4f89466-proxy-ca-bundles\") pod \"controller-manager-5f784d6689-h694b\" (UID: \"784bb05e-7593-4b25-88da-7131c4f89466\") " pod="openshift-controller-manager/controller-manager-5f784d6689-h694b" Dec 06 05:48:40 crc kubenswrapper[4733]: I1206 05:48:40.435867 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/784bb05e-7593-4b25-88da-7131c4f89466-client-ca\") pod \"controller-manager-5f784d6689-h694b\" (UID: \"784bb05e-7593-4b25-88da-7131c4f89466\") " pod="openshift-controller-manager/controller-manager-5f784d6689-h694b" Dec 06 05:48:40 crc kubenswrapper[4733]: I1206 05:48:40.435905 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/784bb05e-7593-4b25-88da-7131c4f89466-serving-cert\") pod \"controller-manager-5f784d6689-h694b\" (UID: \"784bb05e-7593-4b25-88da-7131c4f89466\") " pod="openshift-controller-manager/controller-manager-5f784d6689-h694b" Dec 06 05:48:40 crc kubenswrapper[4733]: I1206 05:48:40.435929 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bc5c\" (UniqueName: \"kubernetes.io/projected/784bb05e-7593-4b25-88da-7131c4f89466-kube-api-access-6bc5c\") pod \"controller-manager-5f784d6689-h694b\" (UID: \"784bb05e-7593-4b25-88da-7131c4f89466\") " pod="openshift-controller-manager/controller-manager-5f784d6689-h694b" Dec 06 05:48:40 crc kubenswrapper[4733]: I1206 05:48:40.437423 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/784bb05e-7593-4b25-88da-7131c4f89466-proxy-ca-bundles\") pod \"controller-manager-5f784d6689-h694b\" (UID: \"784bb05e-7593-4b25-88da-7131c4f89466\") " pod="openshift-controller-manager/controller-manager-5f784d6689-h694b" Dec 06 05:48:40 crc kubenswrapper[4733]: I1206 05:48:40.437423 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/784bb05e-7593-4b25-88da-7131c4f89466-client-ca\") pod \"controller-manager-5f784d6689-h694b\" (UID: \"784bb05e-7593-4b25-88da-7131c4f89466\") " pod="openshift-controller-manager/controller-manager-5f784d6689-h694b" Dec 06 05:48:40 crc kubenswrapper[4733]: I1206 05:48:40.437627 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/784bb05e-7593-4b25-88da-7131c4f89466-config\") pod \"controller-manager-5f784d6689-h694b\" (UID: \"784bb05e-7593-4b25-88da-7131c4f89466\") " pod="openshift-controller-manager/controller-manager-5f784d6689-h694b" Dec 06 05:48:40 crc kubenswrapper[4733]: I1206 05:48:40.442292 4733 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/784bb05e-7593-4b25-88da-7131c4f89466-serving-cert\") pod \"controller-manager-5f784d6689-h694b\" (UID: \"784bb05e-7593-4b25-88da-7131c4f89466\") " pod="openshift-controller-manager/controller-manager-5f784d6689-h694b" Dec 06 05:48:40 crc kubenswrapper[4733]: I1206 05:48:40.452853 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bc5c\" (UniqueName: \"kubernetes.io/projected/784bb05e-7593-4b25-88da-7131c4f89466-kube-api-access-6bc5c\") pod \"controller-manager-5f784d6689-h694b\" (UID: \"784bb05e-7593-4b25-88da-7131c4f89466\") " pod="openshift-controller-manager/controller-manager-5f784d6689-h694b" Dec 06 05:48:40 crc kubenswrapper[4733]: I1206 05:48:40.519463 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5f784d6689-h694b" Dec 06 05:48:40 crc kubenswrapper[4733]: I1206 05:48:40.672950 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5f784d6689-h694b"] Dec 06 05:48:40 crc kubenswrapper[4733]: I1206 05:48:40.974782 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5f784d6689-h694b" event={"ID":"784bb05e-7593-4b25-88da-7131c4f89466","Type":"ContainerStarted","Data":"1ab80b089bbe7ed44d82b5a18582b983b0e7f2124e1335cf596594b4346e9449"} Dec 06 05:48:40 crc kubenswrapper[4733]: I1206 05:48:40.974840 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5f784d6689-h694b" event={"ID":"784bb05e-7593-4b25-88da-7131c4f89466","Type":"ContainerStarted","Data":"5bb8b0bcabad950581edb1382178a112ac220d818e265737bf74ee411020eedf"} Dec 06 05:48:40 crc kubenswrapper[4733]: I1206 05:48:40.974995 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-controller-manager/controller-manager-5f784d6689-h694b" Dec 06 05:48:40 crc kubenswrapper[4733]: I1206 05:48:40.988362 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5f784d6689-h694b" podStartSLOduration=5.988350658 podStartE2EDuration="5.988350658s" podCreationTimestamp="2025-12-06 05:48:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:48:40.985971621 +0000 UTC m=+304.851182733" watchObservedRunningTime="2025-12-06 05:48:40.988350658 +0000 UTC m=+304.853561769" Dec 06 05:48:40 crc kubenswrapper[4733]: I1206 05:48:40.989146 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5f784d6689-h694b" Dec 06 05:48:41 crc kubenswrapper[4733]: I1206 05:48:41.288355 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 06 05:48:43 crc kubenswrapper[4733]: I1206 05:48:43.310864 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 06 05:48:44 crc kubenswrapper[4733]: I1206 05:48:44.710940 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 06 05:48:48 crc kubenswrapper[4733]: I1206 05:48:48.937772 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 06 05:48:49 crc kubenswrapper[4733]: I1206 05:48:49.087817 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 06 05:48:52 crc kubenswrapper[4733]: I1206 05:48:52.163946 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-657648d9fc-m2wkj"] Dec 06 05:48:52 crc 
kubenswrapper[4733]: I1206 05:48:52.164225 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-657648d9fc-m2wkj" podUID="76016c66-7378-49d9-a32d-91ec0bb616d3" containerName="route-controller-manager" containerID="cri-o://df3e13e4bde5cf56251e79b01f0ab8ec9c1807d01a96f2b24069d42b77f2d273" gracePeriod=30 Dec 06 05:48:52 crc kubenswrapper[4733]: I1206 05:48:52.552280 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-657648d9fc-m2wkj" Dec 06 05:48:52 crc kubenswrapper[4733]: I1206 05:48:52.693330 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/76016c66-7378-49d9-a32d-91ec0bb616d3-client-ca\") pod \"76016c66-7378-49d9-a32d-91ec0bb616d3\" (UID: \"76016c66-7378-49d9-a32d-91ec0bb616d3\") " Dec 06 05:48:52 crc kubenswrapper[4733]: I1206 05:48:52.693450 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4cgk\" (UniqueName: \"kubernetes.io/projected/76016c66-7378-49d9-a32d-91ec0bb616d3-kube-api-access-w4cgk\") pod \"76016c66-7378-49d9-a32d-91ec0bb616d3\" (UID: \"76016c66-7378-49d9-a32d-91ec0bb616d3\") " Dec 06 05:48:52 crc kubenswrapper[4733]: I1206 05:48:52.693524 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76016c66-7378-49d9-a32d-91ec0bb616d3-config\") pod \"76016c66-7378-49d9-a32d-91ec0bb616d3\" (UID: \"76016c66-7378-49d9-a32d-91ec0bb616d3\") " Dec 06 05:48:52 crc kubenswrapper[4733]: I1206 05:48:52.693620 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76016c66-7378-49d9-a32d-91ec0bb616d3-serving-cert\") pod \"76016c66-7378-49d9-a32d-91ec0bb616d3\" (UID: 
\"76016c66-7378-49d9-a32d-91ec0bb616d3\") " Dec 06 05:48:52 crc kubenswrapper[4733]: I1206 05:48:52.694044 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76016c66-7378-49d9-a32d-91ec0bb616d3-client-ca" (OuterVolumeSpecName: "client-ca") pod "76016c66-7378-49d9-a32d-91ec0bb616d3" (UID: "76016c66-7378-49d9-a32d-91ec0bb616d3"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:48:52 crc kubenswrapper[4733]: I1206 05:48:52.694161 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76016c66-7378-49d9-a32d-91ec0bb616d3-config" (OuterVolumeSpecName: "config") pod "76016c66-7378-49d9-a32d-91ec0bb616d3" (UID: "76016c66-7378-49d9-a32d-91ec0bb616d3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:48:52 crc kubenswrapper[4733]: I1206 05:48:52.698919 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76016c66-7378-49d9-a32d-91ec0bb616d3-kube-api-access-w4cgk" (OuterVolumeSpecName: "kube-api-access-w4cgk") pod "76016c66-7378-49d9-a32d-91ec0bb616d3" (UID: "76016c66-7378-49d9-a32d-91ec0bb616d3"). InnerVolumeSpecName "kube-api-access-w4cgk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:48:52 crc kubenswrapper[4733]: I1206 05:48:52.699464 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76016c66-7378-49d9-a32d-91ec0bb616d3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "76016c66-7378-49d9-a32d-91ec0bb616d3" (UID: "76016c66-7378-49d9-a32d-91ec0bb616d3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:48:52 crc kubenswrapper[4733]: I1206 05:48:52.795220 4733 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76016c66-7378-49d9-a32d-91ec0bb616d3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 05:48:52 crc kubenswrapper[4733]: I1206 05:48:52.795265 4733 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/76016c66-7378-49d9-a32d-91ec0bb616d3-client-ca\") on node \"crc\" DevicePath \"\"" Dec 06 05:48:52 crc kubenswrapper[4733]: I1206 05:48:52.795282 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4cgk\" (UniqueName: \"kubernetes.io/projected/76016c66-7378-49d9-a32d-91ec0bb616d3-kube-api-access-w4cgk\") on node \"crc\" DevicePath \"\"" Dec 06 05:48:52 crc kubenswrapper[4733]: I1206 05:48:52.795297 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76016c66-7378-49d9-a32d-91ec0bb616d3-config\") on node \"crc\" DevicePath \"\"" Dec 06 05:48:53 crc kubenswrapper[4733]: I1206 05:48:53.047922 4733 generic.go:334] "Generic (PLEG): container finished" podID="76016c66-7378-49d9-a32d-91ec0bb616d3" containerID="df3e13e4bde5cf56251e79b01f0ab8ec9c1807d01a96f2b24069d42b77f2d273" exitCode=0 Dec 06 05:48:53 crc kubenswrapper[4733]: I1206 05:48:53.047973 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-657648d9fc-m2wkj" Dec 06 05:48:53 crc kubenswrapper[4733]: I1206 05:48:53.047974 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-657648d9fc-m2wkj" event={"ID":"76016c66-7378-49d9-a32d-91ec0bb616d3","Type":"ContainerDied","Data":"df3e13e4bde5cf56251e79b01f0ab8ec9c1807d01a96f2b24069d42b77f2d273"} Dec 06 05:48:53 crc kubenswrapper[4733]: I1206 05:48:53.048024 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-657648d9fc-m2wkj" event={"ID":"76016c66-7378-49d9-a32d-91ec0bb616d3","Type":"ContainerDied","Data":"6fd3934d205b1c2d995eed855ddf2d47aaba025af589a8a0fc14f589ce9f182f"} Dec 06 05:48:53 crc kubenswrapper[4733]: I1206 05:48:53.048042 4733 scope.go:117] "RemoveContainer" containerID="df3e13e4bde5cf56251e79b01f0ab8ec9c1807d01a96f2b24069d42b77f2d273" Dec 06 05:48:53 crc kubenswrapper[4733]: I1206 05:48:53.065551 4733 scope.go:117] "RemoveContainer" containerID="df3e13e4bde5cf56251e79b01f0ab8ec9c1807d01a96f2b24069d42b77f2d273" Dec 06 05:48:53 crc kubenswrapper[4733]: E1206 05:48:53.066441 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df3e13e4bde5cf56251e79b01f0ab8ec9c1807d01a96f2b24069d42b77f2d273\": container with ID starting with df3e13e4bde5cf56251e79b01f0ab8ec9c1807d01a96f2b24069d42b77f2d273 not found: ID does not exist" containerID="df3e13e4bde5cf56251e79b01f0ab8ec9c1807d01a96f2b24069d42b77f2d273" Dec 06 05:48:53 crc kubenswrapper[4733]: I1206 05:48:53.066476 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df3e13e4bde5cf56251e79b01f0ab8ec9c1807d01a96f2b24069d42b77f2d273"} err="failed to get container status \"df3e13e4bde5cf56251e79b01f0ab8ec9c1807d01a96f2b24069d42b77f2d273\": rpc error: code = NotFound desc 
= could not find container \"df3e13e4bde5cf56251e79b01f0ab8ec9c1807d01a96f2b24069d42b77f2d273\": container with ID starting with df3e13e4bde5cf56251e79b01f0ab8ec9c1807d01a96f2b24069d42b77f2d273 not found: ID does not exist" Dec 06 05:48:53 crc kubenswrapper[4733]: I1206 05:48:53.070845 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-657648d9fc-m2wkj"] Dec 06 05:48:53 crc kubenswrapper[4733]: I1206 05:48:53.073264 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-657648d9fc-m2wkj"] Dec 06 05:48:53 crc kubenswrapper[4733]: I1206 05:48:53.206949 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6857bc9b8-cbk89"] Dec 06 05:48:53 crc kubenswrapper[4733]: E1206 05:48:53.207208 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76016c66-7378-49d9-a32d-91ec0bb616d3" containerName="route-controller-manager" Dec 06 05:48:53 crc kubenswrapper[4733]: I1206 05:48:53.207221 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="76016c66-7378-49d9-a32d-91ec0bb616d3" containerName="route-controller-manager" Dec 06 05:48:53 crc kubenswrapper[4733]: I1206 05:48:53.207388 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="76016c66-7378-49d9-a32d-91ec0bb616d3" containerName="route-controller-manager" Dec 06 05:48:53 crc kubenswrapper[4733]: I1206 05:48:53.207779 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6857bc9b8-cbk89" Dec 06 05:48:53 crc kubenswrapper[4733]: I1206 05:48:53.209261 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 06 05:48:53 crc kubenswrapper[4733]: I1206 05:48:53.209346 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 06 05:48:53 crc kubenswrapper[4733]: I1206 05:48:53.209618 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 06 05:48:53 crc kubenswrapper[4733]: I1206 05:48:53.210327 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 06 05:48:53 crc kubenswrapper[4733]: I1206 05:48:53.210389 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 06 05:48:53 crc kubenswrapper[4733]: I1206 05:48:53.210780 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 06 05:48:53 crc kubenswrapper[4733]: I1206 05:48:53.216222 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6857bc9b8-cbk89"] Dec 06 05:48:53 crc kubenswrapper[4733]: I1206 05:48:53.302522 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7d7b7e37-bf09-4729-a029-99d37514defa-client-ca\") pod \"route-controller-manager-6857bc9b8-cbk89\" (UID: \"7d7b7e37-bf09-4729-a029-99d37514defa\") " pod="openshift-route-controller-manager/route-controller-manager-6857bc9b8-cbk89" Dec 06 05:48:53 crc kubenswrapper[4733]: I1206 05:48:53.302584 4733 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d7b7e37-bf09-4729-a029-99d37514defa-config\") pod \"route-controller-manager-6857bc9b8-cbk89\" (UID: \"7d7b7e37-bf09-4729-a029-99d37514defa\") " pod="openshift-route-controller-manager/route-controller-manager-6857bc9b8-cbk89" Dec 06 05:48:53 crc kubenswrapper[4733]: I1206 05:48:53.302655 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d7b7e37-bf09-4729-a029-99d37514defa-serving-cert\") pod \"route-controller-manager-6857bc9b8-cbk89\" (UID: \"7d7b7e37-bf09-4729-a029-99d37514defa\") " pod="openshift-route-controller-manager/route-controller-manager-6857bc9b8-cbk89" Dec 06 05:48:53 crc kubenswrapper[4733]: I1206 05:48:53.302685 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87jwt\" (UniqueName: \"kubernetes.io/projected/7d7b7e37-bf09-4729-a029-99d37514defa-kube-api-access-87jwt\") pod \"route-controller-manager-6857bc9b8-cbk89\" (UID: \"7d7b7e37-bf09-4729-a029-99d37514defa\") " pod="openshift-route-controller-manager/route-controller-manager-6857bc9b8-cbk89" Dec 06 05:48:53 crc kubenswrapper[4733]: I1206 05:48:53.403825 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7d7b7e37-bf09-4729-a029-99d37514defa-client-ca\") pod \"route-controller-manager-6857bc9b8-cbk89\" (UID: \"7d7b7e37-bf09-4729-a029-99d37514defa\") " pod="openshift-route-controller-manager/route-controller-manager-6857bc9b8-cbk89" Dec 06 05:48:53 crc kubenswrapper[4733]: I1206 05:48:53.403880 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d7b7e37-bf09-4729-a029-99d37514defa-config\") pod \"route-controller-manager-6857bc9b8-cbk89\" (UID: 
\"7d7b7e37-bf09-4729-a029-99d37514defa\") " pod="openshift-route-controller-manager/route-controller-manager-6857bc9b8-cbk89" Dec 06 05:48:53 crc kubenswrapper[4733]: I1206 05:48:53.403916 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d7b7e37-bf09-4729-a029-99d37514defa-serving-cert\") pod \"route-controller-manager-6857bc9b8-cbk89\" (UID: \"7d7b7e37-bf09-4729-a029-99d37514defa\") " pod="openshift-route-controller-manager/route-controller-manager-6857bc9b8-cbk89" Dec 06 05:48:53 crc kubenswrapper[4733]: I1206 05:48:53.403940 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87jwt\" (UniqueName: \"kubernetes.io/projected/7d7b7e37-bf09-4729-a029-99d37514defa-kube-api-access-87jwt\") pod \"route-controller-manager-6857bc9b8-cbk89\" (UID: \"7d7b7e37-bf09-4729-a029-99d37514defa\") " pod="openshift-route-controller-manager/route-controller-manager-6857bc9b8-cbk89" Dec 06 05:48:53 crc kubenswrapper[4733]: I1206 05:48:53.404840 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7d7b7e37-bf09-4729-a029-99d37514defa-client-ca\") pod \"route-controller-manager-6857bc9b8-cbk89\" (UID: \"7d7b7e37-bf09-4729-a029-99d37514defa\") " pod="openshift-route-controller-manager/route-controller-manager-6857bc9b8-cbk89" Dec 06 05:48:53 crc kubenswrapper[4733]: I1206 05:48:53.405266 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d7b7e37-bf09-4729-a029-99d37514defa-config\") pod \"route-controller-manager-6857bc9b8-cbk89\" (UID: \"7d7b7e37-bf09-4729-a029-99d37514defa\") " pod="openshift-route-controller-manager/route-controller-manager-6857bc9b8-cbk89" Dec 06 05:48:53 crc kubenswrapper[4733]: I1206 05:48:53.408700 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/7d7b7e37-bf09-4729-a029-99d37514defa-serving-cert\") pod \"route-controller-manager-6857bc9b8-cbk89\" (UID: \"7d7b7e37-bf09-4729-a029-99d37514defa\") " pod="openshift-route-controller-manager/route-controller-manager-6857bc9b8-cbk89" Dec 06 05:48:53 crc kubenswrapper[4733]: I1206 05:48:53.418904 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87jwt\" (UniqueName: \"kubernetes.io/projected/7d7b7e37-bf09-4729-a029-99d37514defa-kube-api-access-87jwt\") pod \"route-controller-manager-6857bc9b8-cbk89\" (UID: \"7d7b7e37-bf09-4729-a029-99d37514defa\") " pod="openshift-route-controller-manager/route-controller-manager-6857bc9b8-cbk89" Dec 06 05:48:53 crc kubenswrapper[4733]: I1206 05:48:53.520348 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6857bc9b8-cbk89" Dec 06 05:48:53 crc kubenswrapper[4733]: I1206 05:48:53.885968 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6857bc9b8-cbk89"] Dec 06 05:48:54 crc kubenswrapper[4733]: I1206 05:48:54.055718 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6857bc9b8-cbk89" event={"ID":"7d7b7e37-bf09-4729-a029-99d37514defa","Type":"ContainerStarted","Data":"cececa1012c3aa55cd07050baa951fe2db367eefde52202ea2c44cadd998d5b5"} Dec 06 05:48:54 crc kubenswrapper[4733]: I1206 05:48:54.055763 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6857bc9b8-cbk89" event={"ID":"7d7b7e37-bf09-4729-a029-99d37514defa","Type":"ContainerStarted","Data":"10fff9edfe6fc376bd34b7cd8a30e7feb7076d9ef45f48e19efa6d23ffd54f68"} Dec 06 05:48:54 crc kubenswrapper[4733]: I1206 05:48:54.056536 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-route-controller-manager/route-controller-manager-6857bc9b8-cbk89" Dec 06 05:48:54 crc kubenswrapper[4733]: I1206 05:48:54.072598 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6857bc9b8-cbk89" podStartSLOduration=2.072583643 podStartE2EDuration="2.072583643s" podCreationTimestamp="2025-12-06 05:48:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:48:54.070585734 +0000 UTC m=+317.935796844" watchObservedRunningTime="2025-12-06 05:48:54.072583643 +0000 UTC m=+317.937794754" Dec 06 05:48:54 crc kubenswrapper[4733]: I1206 05:48:54.490819 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76016c66-7378-49d9-a32d-91ec0bb616d3" path="/var/lib/kubelet/pods/76016c66-7378-49d9-a32d-91ec0bb616d3/volumes" Dec 06 05:48:54 crc kubenswrapper[4733]: I1206 05:48:54.531913 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6857bc9b8-cbk89" Dec 06 05:49:09 crc kubenswrapper[4733]: I1206 05:49:09.564351 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-76k6r"] Dec 06 05:49:09 crc kubenswrapper[4733]: I1206 05:49:09.565901 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-76k6r" Dec 06 05:49:09 crc kubenswrapper[4733]: I1206 05:49:09.567807 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 06 05:49:09 crc kubenswrapper[4733]: I1206 05:49:09.570903 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-76k6r"] Dec 06 05:49:09 crc kubenswrapper[4733]: I1206 05:49:09.694236 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhsp4\" (UniqueName: \"kubernetes.io/projected/87626d39-c79f-487c-819c-95eec3d5d5a3-kube-api-access-mhsp4\") pod \"redhat-marketplace-76k6r\" (UID: \"87626d39-c79f-487c-819c-95eec3d5d5a3\") " pod="openshift-marketplace/redhat-marketplace-76k6r" Dec 06 05:49:09 crc kubenswrapper[4733]: I1206 05:49:09.694284 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87626d39-c79f-487c-819c-95eec3d5d5a3-catalog-content\") pod \"redhat-marketplace-76k6r\" (UID: \"87626d39-c79f-487c-819c-95eec3d5d5a3\") " pod="openshift-marketplace/redhat-marketplace-76k6r" Dec 06 05:49:09 crc kubenswrapper[4733]: I1206 05:49:09.694374 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87626d39-c79f-487c-819c-95eec3d5d5a3-utilities\") pod \"redhat-marketplace-76k6r\" (UID: \"87626d39-c79f-487c-819c-95eec3d5d5a3\") " pod="openshift-marketplace/redhat-marketplace-76k6r" Dec 06 05:49:09 crc kubenswrapper[4733]: I1206 05:49:09.762035 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tzhq7"] Dec 06 05:49:09 crc kubenswrapper[4733]: I1206 05:49:09.763048 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tzhq7" Dec 06 05:49:09 crc kubenswrapper[4733]: I1206 05:49:09.766834 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 06 05:49:09 crc kubenswrapper[4733]: I1206 05:49:09.770139 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tzhq7"] Dec 06 05:49:09 crc kubenswrapper[4733]: I1206 05:49:09.796087 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87626d39-c79f-487c-819c-95eec3d5d5a3-catalog-content\") pod \"redhat-marketplace-76k6r\" (UID: \"87626d39-c79f-487c-819c-95eec3d5d5a3\") " pod="openshift-marketplace/redhat-marketplace-76k6r" Dec 06 05:49:09 crc kubenswrapper[4733]: I1206 05:49:09.796154 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4be7c3a-dabf-4f6d-8488-17f680198610-catalog-content\") pod \"redhat-operators-tzhq7\" (UID: \"f4be7c3a-dabf-4f6d-8488-17f680198610\") " pod="openshift-marketplace/redhat-operators-tzhq7" Dec 06 05:49:09 crc kubenswrapper[4733]: I1206 05:49:09.796214 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nljh\" (UniqueName: \"kubernetes.io/projected/f4be7c3a-dabf-4f6d-8488-17f680198610-kube-api-access-4nljh\") pod \"redhat-operators-tzhq7\" (UID: \"f4be7c3a-dabf-4f6d-8488-17f680198610\") " pod="openshift-marketplace/redhat-operators-tzhq7" Dec 06 05:49:09 crc kubenswrapper[4733]: I1206 05:49:09.796237 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87626d39-c79f-487c-819c-95eec3d5d5a3-utilities\") pod \"redhat-marketplace-76k6r\" (UID: \"87626d39-c79f-487c-819c-95eec3d5d5a3\") " 
pod="openshift-marketplace/redhat-marketplace-76k6r" Dec 06 05:49:09 crc kubenswrapper[4733]: I1206 05:49:09.796360 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4be7c3a-dabf-4f6d-8488-17f680198610-utilities\") pod \"redhat-operators-tzhq7\" (UID: \"f4be7c3a-dabf-4f6d-8488-17f680198610\") " pod="openshift-marketplace/redhat-operators-tzhq7" Dec 06 05:49:09 crc kubenswrapper[4733]: I1206 05:49:09.796402 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhsp4\" (UniqueName: \"kubernetes.io/projected/87626d39-c79f-487c-819c-95eec3d5d5a3-kube-api-access-mhsp4\") pod \"redhat-marketplace-76k6r\" (UID: \"87626d39-c79f-487c-819c-95eec3d5d5a3\") " pod="openshift-marketplace/redhat-marketplace-76k6r" Dec 06 05:49:09 crc kubenswrapper[4733]: I1206 05:49:09.796492 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87626d39-c79f-487c-819c-95eec3d5d5a3-catalog-content\") pod \"redhat-marketplace-76k6r\" (UID: \"87626d39-c79f-487c-819c-95eec3d5d5a3\") " pod="openshift-marketplace/redhat-marketplace-76k6r" Dec 06 05:49:09 crc kubenswrapper[4733]: I1206 05:49:09.796525 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87626d39-c79f-487c-819c-95eec3d5d5a3-utilities\") pod \"redhat-marketplace-76k6r\" (UID: \"87626d39-c79f-487c-819c-95eec3d5d5a3\") " pod="openshift-marketplace/redhat-marketplace-76k6r" Dec 06 05:49:09 crc kubenswrapper[4733]: I1206 05:49:09.812900 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhsp4\" (UniqueName: \"kubernetes.io/projected/87626d39-c79f-487c-819c-95eec3d5d5a3-kube-api-access-mhsp4\") pod \"redhat-marketplace-76k6r\" (UID: \"87626d39-c79f-487c-819c-95eec3d5d5a3\") " 
pod="openshift-marketplace/redhat-marketplace-76k6r" Dec 06 05:49:09 crc kubenswrapper[4733]: I1206 05:49:09.885473 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-76k6r" Dec 06 05:49:09 crc kubenswrapper[4733]: I1206 05:49:09.897324 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nljh\" (UniqueName: \"kubernetes.io/projected/f4be7c3a-dabf-4f6d-8488-17f680198610-kube-api-access-4nljh\") pod \"redhat-operators-tzhq7\" (UID: \"f4be7c3a-dabf-4f6d-8488-17f680198610\") " pod="openshift-marketplace/redhat-operators-tzhq7" Dec 06 05:49:09 crc kubenswrapper[4733]: I1206 05:49:09.897469 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4be7c3a-dabf-4f6d-8488-17f680198610-utilities\") pod \"redhat-operators-tzhq7\" (UID: \"f4be7c3a-dabf-4f6d-8488-17f680198610\") " pod="openshift-marketplace/redhat-operators-tzhq7" Dec 06 05:49:09 crc kubenswrapper[4733]: I1206 05:49:09.897592 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4be7c3a-dabf-4f6d-8488-17f680198610-catalog-content\") pod \"redhat-operators-tzhq7\" (UID: \"f4be7c3a-dabf-4f6d-8488-17f680198610\") " pod="openshift-marketplace/redhat-operators-tzhq7" Dec 06 05:49:09 crc kubenswrapper[4733]: I1206 05:49:09.897822 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4be7c3a-dabf-4f6d-8488-17f680198610-utilities\") pod \"redhat-operators-tzhq7\" (UID: \"f4be7c3a-dabf-4f6d-8488-17f680198610\") " pod="openshift-marketplace/redhat-operators-tzhq7" Dec 06 05:49:09 crc kubenswrapper[4733]: I1206 05:49:09.897966 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/f4be7c3a-dabf-4f6d-8488-17f680198610-catalog-content\") pod \"redhat-operators-tzhq7\" (UID: \"f4be7c3a-dabf-4f6d-8488-17f680198610\") " pod="openshift-marketplace/redhat-operators-tzhq7" Dec 06 05:49:09 crc kubenswrapper[4733]: I1206 05:49:09.911126 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nljh\" (UniqueName: \"kubernetes.io/projected/f4be7c3a-dabf-4f6d-8488-17f680198610-kube-api-access-4nljh\") pod \"redhat-operators-tzhq7\" (UID: \"f4be7c3a-dabf-4f6d-8488-17f680198610\") " pod="openshift-marketplace/redhat-operators-tzhq7" Dec 06 05:49:10 crc kubenswrapper[4733]: I1206 05:49:10.077136 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tzhq7" Dec 06 05:49:10 crc kubenswrapper[4733]: I1206 05:49:10.258046 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-76k6r"] Dec 06 05:49:10 crc kubenswrapper[4733]: W1206 05:49:10.261712 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87626d39_c79f_487c_819c_95eec3d5d5a3.slice/crio-6ecfdab10dec44d6ea20a455509f221eefecdd475f5d42b885b3390976b83419 WatchSource:0}: Error finding container 6ecfdab10dec44d6ea20a455509f221eefecdd475f5d42b885b3390976b83419: Status 404 returned error can't find the container with id 6ecfdab10dec44d6ea20a455509f221eefecdd475f5d42b885b3390976b83419 Dec 06 05:49:10 crc kubenswrapper[4733]: I1206 05:49:10.437267 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tzhq7"] Dec 06 05:49:10 crc kubenswrapper[4733]: W1206 05:49:10.442072 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4be7c3a_dabf_4f6d_8488_17f680198610.slice/crio-d7b962bd1aa34d37a0ecdcfa0f3c04b6db3e5b1bb5eb87fb955c9fd8f5ff83b3 
WatchSource:0}: Error finding container d7b962bd1aa34d37a0ecdcfa0f3c04b6db3e5b1bb5eb87fb955c9fd8f5ff83b3: Status 404 returned error can't find the container with id d7b962bd1aa34d37a0ecdcfa0f3c04b6db3e5b1bb5eb87fb955c9fd8f5ff83b3 Dec 06 05:49:11 crc kubenswrapper[4733]: I1206 05:49:11.151191 4733 generic.go:334] "Generic (PLEG): container finished" podID="f4be7c3a-dabf-4f6d-8488-17f680198610" containerID="e29b27675ffba6d5ab8b7a2b6fa306a6f472f66d1890dd83e192ac57db33ae0f" exitCode=0 Dec 06 05:49:11 crc kubenswrapper[4733]: I1206 05:49:11.151234 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tzhq7" event={"ID":"f4be7c3a-dabf-4f6d-8488-17f680198610","Type":"ContainerDied","Data":"e29b27675ffba6d5ab8b7a2b6fa306a6f472f66d1890dd83e192ac57db33ae0f"} Dec 06 05:49:11 crc kubenswrapper[4733]: I1206 05:49:11.151590 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tzhq7" event={"ID":"f4be7c3a-dabf-4f6d-8488-17f680198610","Type":"ContainerStarted","Data":"d7b962bd1aa34d37a0ecdcfa0f3c04b6db3e5b1bb5eb87fb955c9fd8f5ff83b3"} Dec 06 05:49:11 crc kubenswrapper[4733]: I1206 05:49:11.153346 4733 generic.go:334] "Generic (PLEG): container finished" podID="87626d39-c79f-487c-819c-95eec3d5d5a3" containerID="32c670603a4617b34ff582536e42eece2b2432b5fb12d782f5d3c9cc19665158" exitCode=0 Dec 06 05:49:11 crc kubenswrapper[4733]: I1206 05:49:11.153377 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-76k6r" event={"ID":"87626d39-c79f-487c-819c-95eec3d5d5a3","Type":"ContainerDied","Data":"32c670603a4617b34ff582536e42eece2b2432b5fb12d782f5d3c9cc19665158"} Dec 06 05:49:11 crc kubenswrapper[4733]: I1206 05:49:11.153409 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-76k6r" 
event={"ID":"87626d39-c79f-487c-819c-95eec3d5d5a3","Type":"ContainerStarted","Data":"6ecfdab10dec44d6ea20a455509f221eefecdd475f5d42b885b3390976b83419"} Dec 06 05:49:12 crc kubenswrapper[4733]: I1206 05:49:11.966048 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6czlb"] Dec 06 05:49:12 crc kubenswrapper[4733]: I1206 05:49:11.968702 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6czlb" Dec 06 05:49:12 crc kubenswrapper[4733]: I1206 05:49:11.972080 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 06 05:49:12 crc kubenswrapper[4733]: I1206 05:49:11.973444 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6czlb"] Dec 06 05:49:12 crc kubenswrapper[4733]: I1206 05:49:12.028983 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp4zx\" (UniqueName: \"kubernetes.io/projected/a462e608-8ce9-449b-897d-c5fd47649a86-kube-api-access-fp4zx\") pod \"community-operators-6czlb\" (UID: \"a462e608-8ce9-449b-897d-c5fd47649a86\") " pod="openshift-marketplace/community-operators-6czlb" Dec 06 05:49:12 crc kubenswrapper[4733]: I1206 05:49:12.029082 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a462e608-8ce9-449b-897d-c5fd47649a86-utilities\") pod \"community-operators-6czlb\" (UID: \"a462e608-8ce9-449b-897d-c5fd47649a86\") " pod="openshift-marketplace/community-operators-6czlb" Dec 06 05:49:12 crc kubenswrapper[4733]: I1206 05:49:12.029122 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a462e608-8ce9-449b-897d-c5fd47649a86-catalog-content\") pod 
\"community-operators-6czlb\" (UID: \"a462e608-8ce9-449b-897d-c5fd47649a86\") " pod="openshift-marketplace/community-operators-6czlb" Dec 06 05:49:12 crc kubenswrapper[4733]: I1206 05:49:12.130920 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a462e608-8ce9-449b-897d-c5fd47649a86-utilities\") pod \"community-operators-6czlb\" (UID: \"a462e608-8ce9-449b-897d-c5fd47649a86\") " pod="openshift-marketplace/community-operators-6czlb" Dec 06 05:49:12 crc kubenswrapper[4733]: I1206 05:49:12.130957 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a462e608-8ce9-449b-897d-c5fd47649a86-catalog-content\") pod \"community-operators-6czlb\" (UID: \"a462e608-8ce9-449b-897d-c5fd47649a86\") " pod="openshift-marketplace/community-operators-6czlb" Dec 06 05:49:12 crc kubenswrapper[4733]: I1206 05:49:12.131014 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fp4zx\" (UniqueName: \"kubernetes.io/projected/a462e608-8ce9-449b-897d-c5fd47649a86-kube-api-access-fp4zx\") pod \"community-operators-6czlb\" (UID: \"a462e608-8ce9-449b-897d-c5fd47649a86\") " pod="openshift-marketplace/community-operators-6czlb" Dec 06 05:49:12 crc kubenswrapper[4733]: I1206 05:49:12.131686 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a462e608-8ce9-449b-897d-c5fd47649a86-utilities\") pod \"community-operators-6czlb\" (UID: \"a462e608-8ce9-449b-897d-c5fd47649a86\") " pod="openshift-marketplace/community-operators-6czlb" Dec 06 05:49:12 crc kubenswrapper[4733]: I1206 05:49:12.131904 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a462e608-8ce9-449b-897d-c5fd47649a86-catalog-content\") pod \"community-operators-6czlb\" (UID: 
\"a462e608-8ce9-449b-897d-c5fd47649a86\") " pod="openshift-marketplace/community-operators-6czlb" Dec 06 05:49:12 crc kubenswrapper[4733]: I1206 05:49:12.162484 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp4zx\" (UniqueName: \"kubernetes.io/projected/a462e608-8ce9-449b-897d-c5fd47649a86-kube-api-access-fp4zx\") pod \"community-operators-6czlb\" (UID: \"a462e608-8ce9-449b-897d-c5fd47649a86\") " pod="openshift-marketplace/community-operators-6czlb" Dec 06 05:49:12 crc kubenswrapper[4733]: I1206 05:49:12.166266 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-g8pgx"] Dec 06 05:49:12 crc kubenswrapper[4733]: I1206 05:49:12.167135 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-g8pgx" Dec 06 05:49:12 crc kubenswrapper[4733]: I1206 05:49:12.167496 4733 generic.go:334] "Generic (PLEG): container finished" podID="87626d39-c79f-487c-819c-95eec3d5d5a3" containerID="90db602114a1b7aa6c5de707477a53df4f5769eae86cdc3c4c90910e5dbd645e" exitCode=0 Dec 06 05:49:12 crc kubenswrapper[4733]: I1206 05:49:12.167538 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-76k6r" event={"ID":"87626d39-c79f-487c-819c-95eec3d5d5a3","Type":"ContainerDied","Data":"90db602114a1b7aa6c5de707477a53df4f5769eae86cdc3c4c90910e5dbd645e"} Dec 06 05:49:12 crc kubenswrapper[4733]: I1206 05:49:12.169454 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 06 05:49:12 crc kubenswrapper[4733]: I1206 05:49:12.173591 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tzhq7" event={"ID":"f4be7c3a-dabf-4f6d-8488-17f680198610","Type":"ContainerStarted","Data":"7b6072143d269ad194de5523288f28eec9be8ed2e32810119179532e14ed2d3e"} Dec 06 05:49:12 crc kubenswrapper[4733]: I1206 
05:49:12.177535 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g8pgx"] Dec 06 05:49:12 crc kubenswrapper[4733]: I1206 05:49:12.288039 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6czlb" Dec 06 05:49:12 crc kubenswrapper[4733]: I1206 05:49:12.333484 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db6bf699-eb17-41f2-a2be-e30f7a341840-catalog-content\") pod \"certified-operators-g8pgx\" (UID: \"db6bf699-eb17-41f2-a2be-e30f7a341840\") " pod="openshift-marketplace/certified-operators-g8pgx" Dec 06 05:49:12 crc kubenswrapper[4733]: I1206 05:49:12.333546 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwlnp\" (UniqueName: \"kubernetes.io/projected/db6bf699-eb17-41f2-a2be-e30f7a341840-kube-api-access-zwlnp\") pod \"certified-operators-g8pgx\" (UID: \"db6bf699-eb17-41f2-a2be-e30f7a341840\") " pod="openshift-marketplace/certified-operators-g8pgx" Dec 06 05:49:12 crc kubenswrapper[4733]: I1206 05:49:12.333590 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db6bf699-eb17-41f2-a2be-e30f7a341840-utilities\") pod \"certified-operators-g8pgx\" (UID: \"db6bf699-eb17-41f2-a2be-e30f7a341840\") " pod="openshift-marketplace/certified-operators-g8pgx" Dec 06 05:49:12 crc kubenswrapper[4733]: I1206 05:49:12.435752 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db6bf699-eb17-41f2-a2be-e30f7a341840-utilities\") pod \"certified-operators-g8pgx\" (UID: \"db6bf699-eb17-41f2-a2be-e30f7a341840\") " pod="openshift-marketplace/certified-operators-g8pgx" Dec 06 05:49:12 crc kubenswrapper[4733]: I1206 
05:49:12.436658 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db6bf699-eb17-41f2-a2be-e30f7a341840-catalog-content\") pod \"certified-operators-g8pgx\" (UID: \"db6bf699-eb17-41f2-a2be-e30f7a341840\") " pod="openshift-marketplace/certified-operators-g8pgx" Dec 06 05:49:12 crc kubenswrapper[4733]: I1206 05:49:12.437556 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwlnp\" (UniqueName: \"kubernetes.io/projected/db6bf699-eb17-41f2-a2be-e30f7a341840-kube-api-access-zwlnp\") pod \"certified-operators-g8pgx\" (UID: \"db6bf699-eb17-41f2-a2be-e30f7a341840\") " pod="openshift-marketplace/certified-operators-g8pgx" Dec 06 05:49:12 crc kubenswrapper[4733]: I1206 05:49:12.439398 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db6bf699-eb17-41f2-a2be-e30f7a341840-utilities\") pod \"certified-operators-g8pgx\" (UID: \"db6bf699-eb17-41f2-a2be-e30f7a341840\") " pod="openshift-marketplace/certified-operators-g8pgx" Dec 06 05:49:12 crc kubenswrapper[4733]: I1206 05:49:12.440451 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db6bf699-eb17-41f2-a2be-e30f7a341840-catalog-content\") pod \"certified-operators-g8pgx\" (UID: \"db6bf699-eb17-41f2-a2be-e30f7a341840\") " pod="openshift-marketplace/certified-operators-g8pgx" Dec 06 05:49:12 crc kubenswrapper[4733]: I1206 05:49:12.455906 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwlnp\" (UniqueName: \"kubernetes.io/projected/db6bf699-eb17-41f2-a2be-e30f7a341840-kube-api-access-zwlnp\") pod \"certified-operators-g8pgx\" (UID: \"db6bf699-eb17-41f2-a2be-e30f7a341840\") " pod="openshift-marketplace/certified-operators-g8pgx" Dec 06 05:49:12 crc kubenswrapper[4733]: I1206 05:49:12.490149 4733 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-g8pgx" Dec 06 05:49:12 crc kubenswrapper[4733]: I1206 05:49:12.641422 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6czlb"] Dec 06 05:49:12 crc kubenswrapper[4733]: W1206 05:49:12.647054 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda462e608_8ce9_449b_897d_c5fd47649a86.slice/crio-3bd098c8d7b254c6091d19466b58e2089913be7eec19436b186b52421164c0b5 WatchSource:0}: Error finding container 3bd098c8d7b254c6091d19466b58e2089913be7eec19436b186b52421164c0b5: Status 404 returned error can't find the container with id 3bd098c8d7b254c6091d19466b58e2089913be7eec19436b186b52421164c0b5 Dec 06 05:49:12 crc kubenswrapper[4733]: I1206 05:49:12.869700 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g8pgx"] Dec 06 05:49:13 crc kubenswrapper[4733]: I1206 05:49:13.181272 4733 generic.go:334] "Generic (PLEG): container finished" podID="f4be7c3a-dabf-4f6d-8488-17f680198610" containerID="7b6072143d269ad194de5523288f28eec9be8ed2e32810119179532e14ed2d3e" exitCode=0 Dec 06 05:49:13 crc kubenswrapper[4733]: I1206 05:49:13.181338 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tzhq7" event={"ID":"f4be7c3a-dabf-4f6d-8488-17f680198610","Type":"ContainerDied","Data":"7b6072143d269ad194de5523288f28eec9be8ed2e32810119179532e14ed2d3e"} Dec 06 05:49:13 crc kubenswrapper[4733]: I1206 05:49:13.184846 4733 generic.go:334] "Generic (PLEG): container finished" podID="db6bf699-eb17-41f2-a2be-e30f7a341840" containerID="1f2a2b2a7e6b3031bbc9e742c31533c88d26e1456313996999f9d78aa1556552" exitCode=0 Dec 06 05:49:13 crc kubenswrapper[4733]: I1206 05:49:13.184947 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-g8pgx" event={"ID":"db6bf699-eb17-41f2-a2be-e30f7a341840","Type":"ContainerDied","Data":"1f2a2b2a7e6b3031bbc9e742c31533c88d26e1456313996999f9d78aa1556552"} Dec 06 05:49:13 crc kubenswrapper[4733]: I1206 05:49:13.185008 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g8pgx" event={"ID":"db6bf699-eb17-41f2-a2be-e30f7a341840","Type":"ContainerStarted","Data":"0a61d8e48ea0b55419c8e135670ecf870943466cf333b5f94bf97ad1a38bf712"} Dec 06 05:49:13 crc kubenswrapper[4733]: I1206 05:49:13.190376 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-76k6r" event={"ID":"87626d39-c79f-487c-819c-95eec3d5d5a3","Type":"ContainerStarted","Data":"89ba3b14776c6e7336b9802647dddbf58b78bbf07eb562e3b6516afb73f1a42d"} Dec 06 05:49:13 crc kubenswrapper[4733]: I1206 05:49:13.193725 4733 generic.go:334] "Generic (PLEG): container finished" podID="a462e608-8ce9-449b-897d-c5fd47649a86" containerID="02d92cf6735c21e661e4bbf7c1630e92cfc904cc2aec2af7531179a6a717d3c0" exitCode=0 Dec 06 05:49:13 crc kubenswrapper[4733]: I1206 05:49:13.193778 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6czlb" event={"ID":"a462e608-8ce9-449b-897d-c5fd47649a86","Type":"ContainerDied","Data":"02d92cf6735c21e661e4bbf7c1630e92cfc904cc2aec2af7531179a6a717d3c0"} Dec 06 05:49:13 crc kubenswrapper[4733]: I1206 05:49:13.193803 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6czlb" event={"ID":"a462e608-8ce9-449b-897d-c5fd47649a86","Type":"ContainerStarted","Data":"3bd098c8d7b254c6091d19466b58e2089913be7eec19436b186b52421164c0b5"} Dec 06 05:49:13 crc kubenswrapper[4733]: I1206 05:49:13.237814 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-76k6r" podStartSLOduration=2.70485614 
podStartE2EDuration="4.237793055s" podCreationTimestamp="2025-12-06 05:49:09 +0000 UTC" firstStartedPulling="2025-12-06 05:49:11.155095869 +0000 UTC m=+335.020306980" lastFinishedPulling="2025-12-06 05:49:12.688032784 +0000 UTC m=+336.553243895" observedRunningTime="2025-12-06 05:49:13.233560462 +0000 UTC m=+337.098771573" watchObservedRunningTime="2025-12-06 05:49:13.237793055 +0000 UTC m=+337.103004166" Dec 06 05:49:14 crc kubenswrapper[4733]: I1206 05:49:14.201035 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tzhq7" event={"ID":"f4be7c3a-dabf-4f6d-8488-17f680198610","Type":"ContainerStarted","Data":"d0d82e04cc8076ff904dbab15f669a0018cb6455454d340c894742aad671db76"} Dec 06 05:49:14 crc kubenswrapper[4733]: I1206 05:49:14.202676 4733 generic.go:334] "Generic (PLEG): container finished" podID="db6bf699-eb17-41f2-a2be-e30f7a341840" containerID="fde0806602d36fb07a04bc3a74f08e39aeff669ce36c4f330598f0fa618a9b8d" exitCode=0 Dec 06 05:49:14 crc kubenswrapper[4733]: I1206 05:49:14.202742 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g8pgx" event={"ID":"db6bf699-eb17-41f2-a2be-e30f7a341840","Type":"ContainerDied","Data":"fde0806602d36fb07a04bc3a74f08e39aeff669ce36c4f330598f0fa618a9b8d"} Dec 06 05:49:14 crc kubenswrapper[4733]: I1206 05:49:14.208367 4733 generic.go:334] "Generic (PLEG): container finished" podID="a462e608-8ce9-449b-897d-c5fd47649a86" containerID="ead42e030a979177c423b11cd4cd50e23b704e284d7021309cb491f53d91de5f" exitCode=0 Dec 06 05:49:14 crc kubenswrapper[4733]: I1206 05:49:14.208449 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6czlb" event={"ID":"a462e608-8ce9-449b-897d-c5fd47649a86","Type":"ContainerDied","Data":"ead42e030a979177c423b11cd4cd50e23b704e284d7021309cb491f53d91de5f"} Dec 06 05:49:14 crc kubenswrapper[4733]: I1206 05:49:14.220954 4733 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-marketplace/redhat-operators-tzhq7" podStartSLOduration=2.687759166 podStartE2EDuration="5.22093859s" podCreationTimestamp="2025-12-06 05:49:09 +0000 UTC" firstStartedPulling="2025-12-06 05:49:11.152993044 +0000 UTC m=+335.018204154" lastFinishedPulling="2025-12-06 05:49:13.686172468 +0000 UTC m=+337.551383578" observedRunningTime="2025-12-06 05:49:14.219507448 +0000 UTC m=+338.084718559" watchObservedRunningTime="2025-12-06 05:49:14.22093859 +0000 UTC m=+338.086149701" Dec 06 05:49:15 crc kubenswrapper[4733]: I1206 05:49:15.216644 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g8pgx" event={"ID":"db6bf699-eb17-41f2-a2be-e30f7a341840","Type":"ContainerStarted","Data":"37bc1292718a1d7a9b38b087488b5cfd211d663ed0093867d52295e268ed4f71"} Dec 06 05:49:15 crc kubenswrapper[4733]: I1206 05:49:15.219858 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6czlb" event={"ID":"a462e608-8ce9-449b-897d-c5fd47649a86","Type":"ContainerStarted","Data":"8e0e2cc0da107b1933851219df4412bddc50b4e5e64e5f668512181c5b619a56"} Dec 06 05:49:15 crc kubenswrapper[4733]: I1206 05:49:15.246803 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-g8pgx" podStartSLOduration=1.741829394 podStartE2EDuration="3.246787763s" podCreationTimestamp="2025-12-06 05:49:12 +0000 UTC" firstStartedPulling="2025-12-06 05:49:13.186224057 +0000 UTC m=+337.051435168" lastFinishedPulling="2025-12-06 05:49:14.691182426 +0000 UTC m=+338.556393537" observedRunningTime="2025-12-06 05:49:15.242133638 +0000 UTC m=+339.107344748" watchObservedRunningTime="2025-12-06 05:49:15.246787763 +0000 UTC m=+339.111998875" Dec 06 05:49:15 crc kubenswrapper[4733]: I1206 05:49:15.257764 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6czlb" 
podStartSLOduration=2.756025934 podStartE2EDuration="4.257745278s" podCreationTimestamp="2025-12-06 05:49:11 +0000 UTC" firstStartedPulling="2025-12-06 05:49:13.195453041 +0000 UTC m=+337.060664152" lastFinishedPulling="2025-12-06 05:49:14.697172385 +0000 UTC m=+338.562383496" observedRunningTime="2025-12-06 05:49:15.2553916 +0000 UTC m=+339.120602711" watchObservedRunningTime="2025-12-06 05:49:15.257745278 +0000 UTC m=+339.122956379" Dec 06 05:49:19 crc kubenswrapper[4733]: I1206 05:49:19.309988 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-ncprc"] Dec 06 05:49:19 crc kubenswrapper[4733]: I1206 05:49:19.311187 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-ncprc" Dec 06 05:49:19 crc kubenswrapper[4733]: I1206 05:49:19.325908 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-ncprc"] Dec 06 05:49:19 crc kubenswrapper[4733]: I1206 05:49:19.439033 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a68e81ea-d56b-414e-8a92-106bf00cfe43-trusted-ca\") pod \"image-registry-66df7c8f76-ncprc\" (UID: \"a68e81ea-d56b-414e-8a92-106bf00cfe43\") " pod="openshift-image-registry/image-registry-66df7c8f76-ncprc" Dec 06 05:49:19 crc kubenswrapper[4733]: I1206 05:49:19.439211 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a68e81ea-d56b-414e-8a92-106bf00cfe43-installation-pull-secrets\") pod \"image-registry-66df7c8f76-ncprc\" (UID: \"a68e81ea-d56b-414e-8a92-106bf00cfe43\") " pod="openshift-image-registry/image-registry-66df7c8f76-ncprc" Dec 06 05:49:19 crc kubenswrapper[4733]: I1206 05:49:19.439258 4733 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a68e81ea-d56b-414e-8a92-106bf00cfe43-bound-sa-token\") pod \"image-registry-66df7c8f76-ncprc\" (UID: \"a68e81ea-d56b-414e-8a92-106bf00cfe43\") " pod="openshift-image-registry/image-registry-66df7c8f76-ncprc" Dec 06 05:49:19 crc kubenswrapper[4733]: I1206 05:49:19.439379 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a68e81ea-d56b-414e-8a92-106bf00cfe43-registry-tls\") pod \"image-registry-66df7c8f76-ncprc\" (UID: \"a68e81ea-d56b-414e-8a92-106bf00cfe43\") " pod="openshift-image-registry/image-registry-66df7c8f76-ncprc" Dec 06 05:49:19 crc kubenswrapper[4733]: I1206 05:49:19.439420 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wt2l9\" (UniqueName: \"kubernetes.io/projected/a68e81ea-d56b-414e-8a92-106bf00cfe43-kube-api-access-wt2l9\") pod \"image-registry-66df7c8f76-ncprc\" (UID: \"a68e81ea-d56b-414e-8a92-106bf00cfe43\") " pod="openshift-image-registry/image-registry-66df7c8f76-ncprc" Dec 06 05:49:19 crc kubenswrapper[4733]: I1206 05:49:19.439600 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a68e81ea-d56b-414e-8a92-106bf00cfe43-registry-certificates\") pod \"image-registry-66df7c8f76-ncprc\" (UID: \"a68e81ea-d56b-414e-8a92-106bf00cfe43\") " pod="openshift-image-registry/image-registry-66df7c8f76-ncprc" Dec 06 05:49:19 crc kubenswrapper[4733]: I1206 05:49:19.439658 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a68e81ea-d56b-414e-8a92-106bf00cfe43-ca-trust-extracted\") pod \"image-registry-66df7c8f76-ncprc\" (UID: 
\"a68e81ea-d56b-414e-8a92-106bf00cfe43\") " pod="openshift-image-registry/image-registry-66df7c8f76-ncprc" Dec 06 05:49:19 crc kubenswrapper[4733]: I1206 05:49:19.439752 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-ncprc\" (UID: \"a68e81ea-d56b-414e-8a92-106bf00cfe43\") " pod="openshift-image-registry/image-registry-66df7c8f76-ncprc" Dec 06 05:49:19 crc kubenswrapper[4733]: I1206 05:49:19.477814 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-ncprc\" (UID: \"a68e81ea-d56b-414e-8a92-106bf00cfe43\") " pod="openshift-image-registry/image-registry-66df7c8f76-ncprc" Dec 06 05:49:19 crc kubenswrapper[4733]: I1206 05:49:19.541150 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a68e81ea-d56b-414e-8a92-106bf00cfe43-trusted-ca\") pod \"image-registry-66df7c8f76-ncprc\" (UID: \"a68e81ea-d56b-414e-8a92-106bf00cfe43\") " pod="openshift-image-registry/image-registry-66df7c8f76-ncprc" Dec 06 05:49:19 crc kubenswrapper[4733]: I1206 05:49:19.541208 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a68e81ea-d56b-414e-8a92-106bf00cfe43-installation-pull-secrets\") pod \"image-registry-66df7c8f76-ncprc\" (UID: \"a68e81ea-d56b-414e-8a92-106bf00cfe43\") " pod="openshift-image-registry/image-registry-66df7c8f76-ncprc" Dec 06 05:49:19 crc kubenswrapper[4733]: I1206 05:49:19.541226 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" 
(UniqueName: \"kubernetes.io/projected/a68e81ea-d56b-414e-8a92-106bf00cfe43-bound-sa-token\") pod \"image-registry-66df7c8f76-ncprc\" (UID: \"a68e81ea-d56b-414e-8a92-106bf00cfe43\") " pod="openshift-image-registry/image-registry-66df7c8f76-ncprc" Dec 06 05:49:19 crc kubenswrapper[4733]: I1206 05:49:19.541249 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a68e81ea-d56b-414e-8a92-106bf00cfe43-registry-tls\") pod \"image-registry-66df7c8f76-ncprc\" (UID: \"a68e81ea-d56b-414e-8a92-106bf00cfe43\") " pod="openshift-image-registry/image-registry-66df7c8f76-ncprc" Dec 06 05:49:19 crc kubenswrapper[4733]: I1206 05:49:19.541267 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wt2l9\" (UniqueName: \"kubernetes.io/projected/a68e81ea-d56b-414e-8a92-106bf00cfe43-kube-api-access-wt2l9\") pod \"image-registry-66df7c8f76-ncprc\" (UID: \"a68e81ea-d56b-414e-8a92-106bf00cfe43\") " pod="openshift-image-registry/image-registry-66df7c8f76-ncprc" Dec 06 05:49:19 crc kubenswrapper[4733]: I1206 05:49:19.541290 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a68e81ea-d56b-414e-8a92-106bf00cfe43-registry-certificates\") pod \"image-registry-66df7c8f76-ncprc\" (UID: \"a68e81ea-d56b-414e-8a92-106bf00cfe43\") " pod="openshift-image-registry/image-registry-66df7c8f76-ncprc" Dec 06 05:49:19 crc kubenswrapper[4733]: I1206 05:49:19.541330 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a68e81ea-d56b-414e-8a92-106bf00cfe43-ca-trust-extracted\") pod \"image-registry-66df7c8f76-ncprc\" (UID: \"a68e81ea-d56b-414e-8a92-106bf00cfe43\") " pod="openshift-image-registry/image-registry-66df7c8f76-ncprc" Dec 06 05:49:19 crc kubenswrapper[4733]: I1206 05:49:19.541984 4733 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a68e81ea-d56b-414e-8a92-106bf00cfe43-ca-trust-extracted\") pod \"image-registry-66df7c8f76-ncprc\" (UID: \"a68e81ea-d56b-414e-8a92-106bf00cfe43\") " pod="openshift-image-registry/image-registry-66df7c8f76-ncprc" Dec 06 05:49:19 crc kubenswrapper[4733]: I1206 05:49:19.542524 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a68e81ea-d56b-414e-8a92-106bf00cfe43-trusted-ca\") pod \"image-registry-66df7c8f76-ncprc\" (UID: \"a68e81ea-d56b-414e-8a92-106bf00cfe43\") " pod="openshift-image-registry/image-registry-66df7c8f76-ncprc" Dec 06 05:49:19 crc kubenswrapper[4733]: I1206 05:49:19.542530 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a68e81ea-d56b-414e-8a92-106bf00cfe43-registry-certificates\") pod \"image-registry-66df7c8f76-ncprc\" (UID: \"a68e81ea-d56b-414e-8a92-106bf00cfe43\") " pod="openshift-image-registry/image-registry-66df7c8f76-ncprc" Dec 06 05:49:19 crc kubenswrapper[4733]: I1206 05:49:19.546657 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a68e81ea-d56b-414e-8a92-106bf00cfe43-installation-pull-secrets\") pod \"image-registry-66df7c8f76-ncprc\" (UID: \"a68e81ea-d56b-414e-8a92-106bf00cfe43\") " pod="openshift-image-registry/image-registry-66df7c8f76-ncprc" Dec 06 05:49:19 crc kubenswrapper[4733]: I1206 05:49:19.548714 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a68e81ea-d56b-414e-8a92-106bf00cfe43-registry-tls\") pod \"image-registry-66df7c8f76-ncprc\" (UID: \"a68e81ea-d56b-414e-8a92-106bf00cfe43\") " pod="openshift-image-registry/image-registry-66df7c8f76-ncprc" Dec 06 05:49:19 crc 
kubenswrapper[4733]: I1206 05:49:19.554455 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a68e81ea-d56b-414e-8a92-106bf00cfe43-bound-sa-token\") pod \"image-registry-66df7c8f76-ncprc\" (UID: \"a68e81ea-d56b-414e-8a92-106bf00cfe43\") " pod="openshift-image-registry/image-registry-66df7c8f76-ncprc" Dec 06 05:49:19 crc kubenswrapper[4733]: I1206 05:49:19.555729 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wt2l9\" (UniqueName: \"kubernetes.io/projected/a68e81ea-d56b-414e-8a92-106bf00cfe43-kube-api-access-wt2l9\") pod \"image-registry-66df7c8f76-ncprc\" (UID: \"a68e81ea-d56b-414e-8a92-106bf00cfe43\") " pod="openshift-image-registry/image-registry-66df7c8f76-ncprc" Dec 06 05:49:19 crc kubenswrapper[4733]: I1206 05:49:19.623891 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-ncprc" Dec 06 05:49:19 crc kubenswrapper[4733]: I1206 05:49:19.886073 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-76k6r" Dec 06 05:49:19 crc kubenswrapper[4733]: I1206 05:49:19.886538 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-76k6r" Dec 06 05:49:19 crc kubenswrapper[4733]: I1206 05:49:19.924486 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-76k6r" Dec 06 05:49:20 crc kubenswrapper[4733]: I1206 05:49:20.076238 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-ncprc"] Dec 06 05:49:20 crc kubenswrapper[4733]: I1206 05:49:20.077270 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tzhq7" Dec 06 05:49:20 crc kubenswrapper[4733]: I1206 05:49:20.077315 4733 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tzhq7" Dec 06 05:49:20 crc kubenswrapper[4733]: I1206 05:49:20.113851 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tzhq7" Dec 06 05:49:20 crc kubenswrapper[4733]: I1206 05:49:20.245350 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-ncprc" event={"ID":"a68e81ea-d56b-414e-8a92-106bf00cfe43","Type":"ContainerStarted","Data":"9aab9c0af98c05f2d8447db55aa670c8664e5e1473597b577e6ab6fcd7f32d32"} Dec 06 05:49:20 crc kubenswrapper[4733]: I1206 05:49:20.245407 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-ncprc" event={"ID":"a68e81ea-d56b-414e-8a92-106bf00cfe43","Type":"ContainerStarted","Data":"8b3868bb36f855e82f5bb421c6dd0ecfb01a9bcf8bf95d576f24ebb157632319"} Dec 06 05:49:20 crc kubenswrapper[4733]: I1206 05:49:20.281796 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-ncprc" podStartSLOduration=1.2817773350000001 podStartE2EDuration="1.281777335s" podCreationTimestamp="2025-12-06 05:49:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:49:20.275029921 +0000 UTC m=+344.140241032" watchObservedRunningTime="2025-12-06 05:49:20.281777335 +0000 UTC m=+344.146988446" Dec 06 05:49:20 crc kubenswrapper[4733]: I1206 05:49:20.284066 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tzhq7" Dec 06 05:49:20 crc kubenswrapper[4733]: I1206 05:49:20.291815 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-76k6r" Dec 06 05:49:21 crc kubenswrapper[4733]: I1206 
05:49:21.250564 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-ncprc" Dec 06 05:49:22 crc kubenswrapper[4733]: I1206 05:49:22.288627 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6czlb" Dec 06 05:49:22 crc kubenswrapper[4733]: I1206 05:49:22.288670 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6czlb" Dec 06 05:49:22 crc kubenswrapper[4733]: I1206 05:49:22.325845 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6czlb" Dec 06 05:49:22 crc kubenswrapper[4733]: I1206 05:49:22.491327 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-g8pgx" Dec 06 05:49:22 crc kubenswrapper[4733]: I1206 05:49:22.491377 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-g8pgx" Dec 06 05:49:22 crc kubenswrapper[4733]: I1206 05:49:22.521171 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-g8pgx" Dec 06 05:49:23 crc kubenswrapper[4733]: I1206 05:49:23.288745 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6czlb" Dec 06 05:49:23 crc kubenswrapper[4733]: I1206 05:49:23.288807 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-g8pgx" Dec 06 05:49:32 crc kubenswrapper[4733]: I1206 05:49:32.162067 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5f784d6689-h694b"] Dec 06 05:49:32 crc kubenswrapper[4733]: I1206 05:49:32.162797 4733 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-controller-manager/controller-manager-5f784d6689-h694b" podUID="784bb05e-7593-4b25-88da-7131c4f89466" containerName="controller-manager" containerID="cri-o://1ab80b089bbe7ed44d82b5a18582b983b0e7f2124e1335cf596594b4346e9449" gracePeriod=30 Dec 06 05:49:32 crc kubenswrapper[4733]: I1206 05:49:32.312686 4733 generic.go:334] "Generic (PLEG): container finished" podID="784bb05e-7593-4b25-88da-7131c4f89466" containerID="1ab80b089bbe7ed44d82b5a18582b983b0e7f2124e1335cf596594b4346e9449" exitCode=0 Dec 06 05:49:32 crc kubenswrapper[4733]: I1206 05:49:32.312743 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5f784d6689-h694b" event={"ID":"784bb05e-7593-4b25-88da-7131c4f89466","Type":"ContainerDied","Data":"1ab80b089bbe7ed44d82b5a18582b983b0e7f2124e1335cf596594b4346e9449"} Dec 06 05:49:32 crc kubenswrapper[4733]: I1206 05:49:32.503334 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5f784d6689-h694b" Dec 06 05:49:32 crc kubenswrapper[4733]: I1206 05:49:32.613345 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/784bb05e-7593-4b25-88da-7131c4f89466-serving-cert\") pod \"784bb05e-7593-4b25-88da-7131c4f89466\" (UID: \"784bb05e-7593-4b25-88da-7131c4f89466\") " Dec 06 05:49:32 crc kubenswrapper[4733]: I1206 05:49:32.613399 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/784bb05e-7593-4b25-88da-7131c4f89466-client-ca\") pod \"784bb05e-7593-4b25-88da-7131c4f89466\" (UID: \"784bb05e-7593-4b25-88da-7131c4f89466\") " Dec 06 05:49:32 crc kubenswrapper[4733]: I1206 05:49:32.613485 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bc5c\" (UniqueName: 
\"kubernetes.io/projected/784bb05e-7593-4b25-88da-7131c4f89466-kube-api-access-6bc5c\") pod \"784bb05e-7593-4b25-88da-7131c4f89466\" (UID: \"784bb05e-7593-4b25-88da-7131c4f89466\") " Dec 06 05:49:32 crc kubenswrapper[4733]: I1206 05:49:32.613540 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/784bb05e-7593-4b25-88da-7131c4f89466-config\") pod \"784bb05e-7593-4b25-88da-7131c4f89466\" (UID: \"784bb05e-7593-4b25-88da-7131c4f89466\") " Dec 06 05:49:32 crc kubenswrapper[4733]: I1206 05:49:32.613558 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/784bb05e-7593-4b25-88da-7131c4f89466-proxy-ca-bundles\") pod \"784bb05e-7593-4b25-88da-7131c4f89466\" (UID: \"784bb05e-7593-4b25-88da-7131c4f89466\") " Dec 06 05:49:32 crc kubenswrapper[4733]: I1206 05:49:32.614212 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/784bb05e-7593-4b25-88da-7131c4f89466-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "784bb05e-7593-4b25-88da-7131c4f89466" (UID: "784bb05e-7593-4b25-88da-7131c4f89466"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:49:32 crc kubenswrapper[4733]: I1206 05:49:32.614230 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/784bb05e-7593-4b25-88da-7131c4f89466-client-ca" (OuterVolumeSpecName: "client-ca") pod "784bb05e-7593-4b25-88da-7131c4f89466" (UID: "784bb05e-7593-4b25-88da-7131c4f89466"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:49:32 crc kubenswrapper[4733]: I1206 05:49:32.614241 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/784bb05e-7593-4b25-88da-7131c4f89466-config" (OuterVolumeSpecName: "config") pod "784bb05e-7593-4b25-88da-7131c4f89466" (UID: "784bb05e-7593-4b25-88da-7131c4f89466"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:49:32 crc kubenswrapper[4733]: I1206 05:49:32.614578 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/784bb05e-7593-4b25-88da-7131c4f89466-config\") on node \"crc\" DevicePath \"\"" Dec 06 05:49:32 crc kubenswrapper[4733]: I1206 05:49:32.614597 4733 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/784bb05e-7593-4b25-88da-7131c4f89466-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 06 05:49:32 crc kubenswrapper[4733]: I1206 05:49:32.614609 4733 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/784bb05e-7593-4b25-88da-7131c4f89466-client-ca\") on node \"crc\" DevicePath \"\"" Dec 06 05:49:32 crc kubenswrapper[4733]: I1206 05:49:32.618349 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/784bb05e-7593-4b25-88da-7131c4f89466-kube-api-access-6bc5c" (OuterVolumeSpecName: "kube-api-access-6bc5c") pod "784bb05e-7593-4b25-88da-7131c4f89466" (UID: "784bb05e-7593-4b25-88da-7131c4f89466"). InnerVolumeSpecName "kube-api-access-6bc5c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:49:32 crc kubenswrapper[4733]: I1206 05:49:32.619193 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/784bb05e-7593-4b25-88da-7131c4f89466-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "784bb05e-7593-4b25-88da-7131c4f89466" (UID: "784bb05e-7593-4b25-88da-7131c4f89466"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:49:32 crc kubenswrapper[4733]: I1206 05:49:32.716196 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6bc5c\" (UniqueName: \"kubernetes.io/projected/784bb05e-7593-4b25-88da-7131c4f89466-kube-api-access-6bc5c\") on node \"crc\" DevicePath \"\"" Dec 06 05:49:32 crc kubenswrapper[4733]: I1206 05:49:32.716238 4733 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/784bb05e-7593-4b25-88da-7131c4f89466-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 05:49:33 crc kubenswrapper[4733]: I1206 05:49:33.237276 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5b46b67f87-lb879"] Dec 06 05:49:33 crc kubenswrapper[4733]: E1206 05:49:33.238213 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="784bb05e-7593-4b25-88da-7131c4f89466" containerName="controller-manager" Dec 06 05:49:33 crc kubenswrapper[4733]: I1206 05:49:33.238246 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="784bb05e-7593-4b25-88da-7131c4f89466" containerName="controller-manager" Dec 06 05:49:33 crc kubenswrapper[4733]: I1206 05:49:33.238402 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="784bb05e-7593-4b25-88da-7131c4f89466" containerName="controller-manager" Dec 06 05:49:33 crc kubenswrapper[4733]: I1206 05:49:33.238876 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5b46b67f87-lb879" Dec 06 05:49:33 crc kubenswrapper[4733]: I1206 05:49:33.245905 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5b46b67f87-lb879"] Dec 06 05:49:33 crc kubenswrapper[4733]: I1206 05:49:33.322553 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5f784d6689-h694b" event={"ID":"784bb05e-7593-4b25-88da-7131c4f89466","Type":"ContainerDied","Data":"5bb8b0bcabad950581edb1382178a112ac220d818e265737bf74ee411020eedf"} Dec 06 05:49:33 crc kubenswrapper[4733]: I1206 05:49:33.322596 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5f784d6689-h694b" Dec 06 05:49:33 crc kubenswrapper[4733]: I1206 05:49:33.322631 4733 scope.go:117] "RemoveContainer" containerID="1ab80b089bbe7ed44d82b5a18582b983b0e7f2124e1335cf596594b4346e9449" Dec 06 05:49:33 crc kubenswrapper[4733]: I1206 05:49:33.336763 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fknds\" (UniqueName: \"kubernetes.io/projected/e7c50ff5-5420-4d4f-80f7-1087931cbd96-kube-api-access-fknds\") pod \"controller-manager-5b46b67f87-lb879\" (UID: \"e7c50ff5-5420-4d4f-80f7-1087931cbd96\") " pod="openshift-controller-manager/controller-manager-5b46b67f87-lb879" Dec 06 05:49:33 crc kubenswrapper[4733]: I1206 05:49:33.336858 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7c50ff5-5420-4d4f-80f7-1087931cbd96-serving-cert\") pod \"controller-manager-5b46b67f87-lb879\" (UID: \"e7c50ff5-5420-4d4f-80f7-1087931cbd96\") " pod="openshift-controller-manager/controller-manager-5b46b67f87-lb879" Dec 06 05:49:33 crc kubenswrapper[4733]: I1206 05:49:33.336887 4733 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e7c50ff5-5420-4d4f-80f7-1087931cbd96-client-ca\") pod \"controller-manager-5b46b67f87-lb879\" (UID: \"e7c50ff5-5420-4d4f-80f7-1087931cbd96\") " pod="openshift-controller-manager/controller-manager-5b46b67f87-lb879" Dec 06 05:49:33 crc kubenswrapper[4733]: I1206 05:49:33.337075 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7c50ff5-5420-4d4f-80f7-1087931cbd96-config\") pod \"controller-manager-5b46b67f87-lb879\" (UID: \"e7c50ff5-5420-4d4f-80f7-1087931cbd96\") " pod="openshift-controller-manager/controller-manager-5b46b67f87-lb879" Dec 06 05:49:33 crc kubenswrapper[4733]: I1206 05:49:33.337243 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e7c50ff5-5420-4d4f-80f7-1087931cbd96-proxy-ca-bundles\") pod \"controller-manager-5b46b67f87-lb879\" (UID: \"e7c50ff5-5420-4d4f-80f7-1087931cbd96\") " pod="openshift-controller-manager/controller-manager-5b46b67f87-lb879" Dec 06 05:49:33 crc kubenswrapper[4733]: I1206 05:49:33.350860 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5f784d6689-h694b"] Dec 06 05:49:33 crc kubenswrapper[4733]: I1206 05:49:33.354949 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5f784d6689-h694b"] Dec 06 05:49:33 crc kubenswrapper[4733]: I1206 05:49:33.438502 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7c50ff5-5420-4d4f-80f7-1087931cbd96-config\") pod \"controller-manager-5b46b67f87-lb879\" (UID: \"e7c50ff5-5420-4d4f-80f7-1087931cbd96\") " 
pod="openshift-controller-manager/controller-manager-5b46b67f87-lb879" Dec 06 05:49:33 crc kubenswrapper[4733]: I1206 05:49:33.438551 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e7c50ff5-5420-4d4f-80f7-1087931cbd96-proxy-ca-bundles\") pod \"controller-manager-5b46b67f87-lb879\" (UID: \"e7c50ff5-5420-4d4f-80f7-1087931cbd96\") " pod="openshift-controller-manager/controller-manager-5b46b67f87-lb879" Dec 06 05:49:33 crc kubenswrapper[4733]: I1206 05:49:33.438592 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fknds\" (UniqueName: \"kubernetes.io/projected/e7c50ff5-5420-4d4f-80f7-1087931cbd96-kube-api-access-fknds\") pod \"controller-manager-5b46b67f87-lb879\" (UID: \"e7c50ff5-5420-4d4f-80f7-1087931cbd96\") " pod="openshift-controller-manager/controller-manager-5b46b67f87-lb879" Dec 06 05:49:33 crc kubenswrapper[4733]: I1206 05:49:33.438620 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7c50ff5-5420-4d4f-80f7-1087931cbd96-serving-cert\") pod \"controller-manager-5b46b67f87-lb879\" (UID: \"e7c50ff5-5420-4d4f-80f7-1087931cbd96\") " pod="openshift-controller-manager/controller-manager-5b46b67f87-lb879" Dec 06 05:49:33 crc kubenswrapper[4733]: I1206 05:49:33.438646 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e7c50ff5-5420-4d4f-80f7-1087931cbd96-client-ca\") pod \"controller-manager-5b46b67f87-lb879\" (UID: \"e7c50ff5-5420-4d4f-80f7-1087931cbd96\") " pod="openshift-controller-manager/controller-manager-5b46b67f87-lb879" Dec 06 05:49:33 crc kubenswrapper[4733]: I1206 05:49:33.439820 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e7c50ff5-5420-4d4f-80f7-1087931cbd96-client-ca\") 
pod \"controller-manager-5b46b67f87-lb879\" (UID: \"e7c50ff5-5420-4d4f-80f7-1087931cbd96\") " pod="openshift-controller-manager/controller-manager-5b46b67f87-lb879" Dec 06 05:49:33 crc kubenswrapper[4733]: I1206 05:49:33.440188 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7c50ff5-5420-4d4f-80f7-1087931cbd96-config\") pod \"controller-manager-5b46b67f87-lb879\" (UID: \"e7c50ff5-5420-4d4f-80f7-1087931cbd96\") " pod="openshift-controller-manager/controller-manager-5b46b67f87-lb879" Dec 06 05:49:33 crc kubenswrapper[4733]: I1206 05:49:33.440567 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e7c50ff5-5420-4d4f-80f7-1087931cbd96-proxy-ca-bundles\") pod \"controller-manager-5b46b67f87-lb879\" (UID: \"e7c50ff5-5420-4d4f-80f7-1087931cbd96\") " pod="openshift-controller-manager/controller-manager-5b46b67f87-lb879" Dec 06 05:49:33 crc kubenswrapper[4733]: I1206 05:49:33.445429 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7c50ff5-5420-4d4f-80f7-1087931cbd96-serving-cert\") pod \"controller-manager-5b46b67f87-lb879\" (UID: \"e7c50ff5-5420-4d4f-80f7-1087931cbd96\") " pod="openshift-controller-manager/controller-manager-5b46b67f87-lb879" Dec 06 05:49:33 crc kubenswrapper[4733]: I1206 05:49:33.452596 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fknds\" (UniqueName: \"kubernetes.io/projected/e7c50ff5-5420-4d4f-80f7-1087931cbd96-kube-api-access-fknds\") pod \"controller-manager-5b46b67f87-lb879\" (UID: \"e7c50ff5-5420-4d4f-80f7-1087931cbd96\") " pod="openshift-controller-manager/controller-manager-5b46b67f87-lb879" Dec 06 05:49:33 crc kubenswrapper[4733]: I1206 05:49:33.551898 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5b46b67f87-lb879" Dec 06 05:49:33 crc kubenswrapper[4733]: I1206 05:49:33.912978 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5b46b67f87-lb879"] Dec 06 05:49:33 crc kubenswrapper[4733]: W1206 05:49:33.919492 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7c50ff5_5420_4d4f_80f7_1087931cbd96.slice/crio-72d65ecdb5b8f326d18fdd1c2155f51404f29beb1dcfd584e7e5ea7ff1711e1e WatchSource:0}: Error finding container 72d65ecdb5b8f326d18fdd1c2155f51404f29beb1dcfd584e7e5ea7ff1711e1e: Status 404 returned error can't find the container with id 72d65ecdb5b8f326d18fdd1c2155f51404f29beb1dcfd584e7e5ea7ff1711e1e Dec 06 05:49:34 crc kubenswrapper[4733]: I1206 05:49:34.330834 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b46b67f87-lb879" event={"ID":"e7c50ff5-5420-4d4f-80f7-1087931cbd96","Type":"ContainerStarted","Data":"c77b22b6fff8e04c505868e9af577f222467c4fbac741d12ec3e77821d5fccc9"} Dec 06 05:49:34 crc kubenswrapper[4733]: I1206 05:49:34.330896 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b46b67f87-lb879" event={"ID":"e7c50ff5-5420-4d4f-80f7-1087931cbd96","Type":"ContainerStarted","Data":"72d65ecdb5b8f326d18fdd1c2155f51404f29beb1dcfd584e7e5ea7ff1711e1e"} Dec 06 05:49:34 crc kubenswrapper[4733]: I1206 05:49:34.331218 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5b46b67f87-lb879" Dec 06 05:49:34 crc kubenswrapper[4733]: I1206 05:49:34.335164 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5b46b67f87-lb879" Dec 06 05:49:34 crc kubenswrapper[4733]: I1206 05:49:34.352670 4733 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5b46b67f87-lb879" podStartSLOduration=2.3526484610000002 podStartE2EDuration="2.352648461s" podCreationTimestamp="2025-12-06 05:49:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:49:34.347212834 +0000 UTC m=+358.212423945" watchObservedRunningTime="2025-12-06 05:49:34.352648461 +0000 UTC m=+358.217859571" Dec 06 05:49:34 crc kubenswrapper[4733]: I1206 05:49:34.490897 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="784bb05e-7593-4b25-88da-7131c4f89466" path="/var/lib/kubelet/pods/784bb05e-7593-4b25-88da-7131c4f89466/volumes" Dec 06 05:49:39 crc kubenswrapper[4733]: I1206 05:49:39.628896 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-ncprc" Dec 06 05:49:39 crc kubenswrapper[4733]: I1206 05:49:39.674965 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-7rlhm"] Dec 06 05:49:42 crc kubenswrapper[4733]: I1206 05:49:42.989746 4733 patch_prober.go:28] interesting pod/machine-config-daemon-g7qjx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 05:49:42 crc kubenswrapper[4733]: I1206 05:49:42.990981 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 05:50:04 crc kubenswrapper[4733]: I1206 05:50:04.707297 4733 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" podUID="6e073151-939a-4209-8cd7-39116b0165f0" containerName="registry" containerID="cri-o://aa6bd577f5de0d6f277a4191e9a13947bcd2c89d33e8fd1c39b6b31b217e0d31" gracePeriod=30 Dec 06 05:50:05 crc kubenswrapper[4733]: I1206 05:50:05.103537 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:50:05 crc kubenswrapper[4733]: I1206 05:50:05.227706 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6e073151-939a-4209-8cd7-39116b0165f0-registry-certificates\") pod \"6e073151-939a-4209-8cd7-39116b0165f0\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " Dec 06 05:50:05 crc kubenswrapper[4733]: I1206 05:50:05.228114 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6e073151-939a-4209-8cd7-39116b0165f0-installation-pull-secrets\") pod \"6e073151-939a-4209-8cd7-39116b0165f0\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " Dec 06 05:50:05 crc kubenswrapper[4733]: I1206 05:50:05.228172 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6e073151-939a-4209-8cd7-39116b0165f0-trusted-ca\") pod \"6e073151-939a-4209-8cd7-39116b0165f0\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " Dec 06 05:50:05 crc kubenswrapper[4733]: I1206 05:50:05.228215 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6e073151-939a-4209-8cd7-39116b0165f0-registry-tls\") pod \"6e073151-939a-4209-8cd7-39116b0165f0\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " Dec 06 05:50:05 crc kubenswrapper[4733]: I1206 
05:50:05.228378 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"6e073151-939a-4209-8cd7-39116b0165f0\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " Dec 06 05:50:05 crc kubenswrapper[4733]: I1206 05:50:05.228497 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4895\" (UniqueName: \"kubernetes.io/projected/6e073151-939a-4209-8cd7-39116b0165f0-kube-api-access-l4895\") pod \"6e073151-939a-4209-8cd7-39116b0165f0\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " Dec 06 05:50:05 crc kubenswrapper[4733]: I1206 05:50:05.228586 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6e073151-939a-4209-8cd7-39116b0165f0-ca-trust-extracted\") pod \"6e073151-939a-4209-8cd7-39116b0165f0\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " Dec 06 05:50:05 crc kubenswrapper[4733]: I1206 05:50:05.228654 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6e073151-939a-4209-8cd7-39116b0165f0-bound-sa-token\") pod \"6e073151-939a-4209-8cd7-39116b0165f0\" (UID: \"6e073151-939a-4209-8cd7-39116b0165f0\") " Dec 06 05:50:05 crc kubenswrapper[4733]: I1206 05:50:05.229120 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e073151-939a-4209-8cd7-39116b0165f0-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "6e073151-939a-4209-8cd7-39116b0165f0" (UID: "6e073151-939a-4209-8cd7-39116b0165f0"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:50:05 crc kubenswrapper[4733]: I1206 05:50:05.229517 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e073151-939a-4209-8cd7-39116b0165f0-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "6e073151-939a-4209-8cd7-39116b0165f0" (UID: "6e073151-939a-4209-8cd7-39116b0165f0"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:50:05 crc kubenswrapper[4733]: I1206 05:50:05.234794 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e073151-939a-4209-8cd7-39116b0165f0-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "6e073151-939a-4209-8cd7-39116b0165f0" (UID: "6e073151-939a-4209-8cd7-39116b0165f0"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:50:05 crc kubenswrapper[4733]: I1206 05:50:05.236287 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e073151-939a-4209-8cd7-39116b0165f0-kube-api-access-l4895" (OuterVolumeSpecName: "kube-api-access-l4895") pod "6e073151-939a-4209-8cd7-39116b0165f0" (UID: "6e073151-939a-4209-8cd7-39116b0165f0"). InnerVolumeSpecName "kube-api-access-l4895". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:50:05 crc kubenswrapper[4733]: I1206 05:50:05.237031 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e073151-939a-4209-8cd7-39116b0165f0-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "6e073151-939a-4209-8cd7-39116b0165f0" (UID: "6e073151-939a-4209-8cd7-39116b0165f0"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:50:05 crc kubenswrapper[4733]: I1206 05:50:05.237998 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e073151-939a-4209-8cd7-39116b0165f0-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "6e073151-939a-4209-8cd7-39116b0165f0" (UID: "6e073151-939a-4209-8cd7-39116b0165f0"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:50:05 crc kubenswrapper[4733]: I1206 05:50:05.239035 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "6e073151-939a-4209-8cd7-39116b0165f0" (UID: "6e073151-939a-4209-8cd7-39116b0165f0"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 06 05:50:05 crc kubenswrapper[4733]: I1206 05:50:05.241770 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e073151-939a-4209-8cd7-39116b0165f0-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "6e073151-939a-4209-8cd7-39116b0165f0" (UID: "6e073151-939a-4209-8cd7-39116b0165f0"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 05:50:05 crc kubenswrapper[4733]: I1206 05:50:05.330414 4733 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6e073151-939a-4209-8cd7-39116b0165f0-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 06 05:50:05 crc kubenswrapper[4733]: I1206 05:50:05.330450 4733 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6e073151-939a-4209-8cd7-39116b0165f0-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 06 05:50:05 crc kubenswrapper[4733]: I1206 05:50:05.330461 4733 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6e073151-939a-4209-8cd7-39116b0165f0-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 06 05:50:05 crc kubenswrapper[4733]: I1206 05:50:05.330472 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4895\" (UniqueName: \"kubernetes.io/projected/6e073151-939a-4209-8cd7-39116b0165f0-kube-api-access-l4895\") on node \"crc\" DevicePath \"\"" Dec 06 05:50:05 crc kubenswrapper[4733]: I1206 05:50:05.330480 4733 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6e073151-939a-4209-8cd7-39116b0165f0-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 06 05:50:05 crc kubenswrapper[4733]: I1206 05:50:05.330489 4733 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6e073151-939a-4209-8cd7-39116b0165f0-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 06 05:50:05 crc kubenswrapper[4733]: I1206 05:50:05.330499 4733 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6e073151-939a-4209-8cd7-39116b0165f0-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 06 05:50:05 crc 
kubenswrapper[4733]: I1206 05:50:05.514567 4733 generic.go:334] "Generic (PLEG): container finished" podID="6e073151-939a-4209-8cd7-39116b0165f0" containerID="aa6bd577f5de0d6f277a4191e9a13947bcd2c89d33e8fd1c39b6b31b217e0d31" exitCode=0 Dec 06 05:50:05 crc kubenswrapper[4733]: I1206 05:50:05.514609 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" event={"ID":"6e073151-939a-4209-8cd7-39116b0165f0","Type":"ContainerDied","Data":"aa6bd577f5de0d6f277a4191e9a13947bcd2c89d33e8fd1c39b6b31b217e0d31"} Dec 06 05:50:05 crc kubenswrapper[4733]: I1206 05:50:05.514614 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" Dec 06 05:50:05 crc kubenswrapper[4733]: I1206 05:50:05.514635 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-7rlhm" event={"ID":"6e073151-939a-4209-8cd7-39116b0165f0","Type":"ContainerDied","Data":"b3c48e833667e6d8679fd24078e942bc2036bb8847660c7821bd4ef37bcf2cb1"} Dec 06 05:50:05 crc kubenswrapper[4733]: I1206 05:50:05.514659 4733 scope.go:117] "RemoveContainer" containerID="aa6bd577f5de0d6f277a4191e9a13947bcd2c89d33e8fd1c39b6b31b217e0d31" Dec 06 05:50:05 crc kubenswrapper[4733]: I1206 05:50:05.529018 4733 scope.go:117] "RemoveContainer" containerID="aa6bd577f5de0d6f277a4191e9a13947bcd2c89d33e8fd1c39b6b31b217e0d31" Dec 06 05:50:05 crc kubenswrapper[4733]: E1206 05:50:05.529459 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa6bd577f5de0d6f277a4191e9a13947bcd2c89d33e8fd1c39b6b31b217e0d31\": container with ID starting with aa6bd577f5de0d6f277a4191e9a13947bcd2c89d33e8fd1c39b6b31b217e0d31 not found: ID does not exist" containerID="aa6bd577f5de0d6f277a4191e9a13947bcd2c89d33e8fd1c39b6b31b217e0d31" Dec 06 05:50:05 crc kubenswrapper[4733]: I1206 05:50:05.529496 4733 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa6bd577f5de0d6f277a4191e9a13947bcd2c89d33e8fd1c39b6b31b217e0d31"} err="failed to get container status \"aa6bd577f5de0d6f277a4191e9a13947bcd2c89d33e8fd1c39b6b31b217e0d31\": rpc error: code = NotFound desc = could not find container \"aa6bd577f5de0d6f277a4191e9a13947bcd2c89d33e8fd1c39b6b31b217e0d31\": container with ID starting with aa6bd577f5de0d6f277a4191e9a13947bcd2c89d33e8fd1c39b6b31b217e0d31 not found: ID does not exist" Dec 06 05:50:05 crc kubenswrapper[4733]: I1206 05:50:05.540314 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-7rlhm"] Dec 06 05:50:05 crc kubenswrapper[4733]: I1206 05:50:05.546863 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-7rlhm"] Dec 06 05:50:06 crc kubenswrapper[4733]: I1206 05:50:06.492728 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e073151-939a-4209-8cd7-39116b0165f0" path="/var/lib/kubelet/pods/6e073151-939a-4209-8cd7-39116b0165f0/volumes" Dec 06 05:50:12 crc kubenswrapper[4733]: I1206 05:50:12.989120 4733 patch_prober.go:28] interesting pod/machine-config-daemon-g7qjx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 05:50:12 crc kubenswrapper[4733]: I1206 05:50:12.989986 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 05:50:36 crc kubenswrapper[4733]: I1206 05:50:36.654991 4733 scope.go:117] "RemoveContainer" 
containerID="801ea1b9ed221d20f0d729436b8f5f1946df6e66f06aa86db5764f18da3f0b1f" Dec 06 05:50:36 crc kubenswrapper[4733]: I1206 05:50:36.667115 4733 scope.go:117] "RemoveContainer" containerID="7eeebbb46cf11d2306ad457106c3b2179039986bfdd412c4bb64791d86edb4e0" Dec 06 05:50:36 crc kubenswrapper[4733]: I1206 05:50:36.678252 4733 scope.go:117] "RemoveContainer" containerID="9addcd70430289d4b9e51cbab421c76f62dfbc60934130c77b42a3a442adc33f" Dec 06 05:50:36 crc kubenswrapper[4733]: I1206 05:50:36.700942 4733 scope.go:117] "RemoveContainer" containerID="57cbb938bc4ae9b8a71a1e2369a50a243964fc8c683d2d1840f1f3e199f1b923" Dec 06 05:50:36 crc kubenswrapper[4733]: I1206 05:50:36.712025 4733 scope.go:117] "RemoveContainer" containerID="32c4d87738481c8df3d76e820a98f3dacfbc11edc26fab1dfe51b56d207168d2" Dec 06 05:50:36 crc kubenswrapper[4733]: I1206 05:50:36.722647 4733 scope.go:117] "RemoveContainer" containerID="e8edc1fd8220a58b6a3f6d08d6d003c6d350fa69588866d84de63f95ecd4367f" Dec 06 05:50:42 crc kubenswrapper[4733]: I1206 05:50:42.988967 4733 patch_prober.go:28] interesting pod/machine-config-daemon-g7qjx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 05:50:42 crc kubenswrapper[4733]: I1206 05:50:42.990035 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 05:50:42 crc kubenswrapper[4733]: I1206 05:50:42.990081 4733 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" Dec 06 05:50:42 crc kubenswrapper[4733]: I1206 05:50:42.990633 4733 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"50470b50bca695b7d51dc24f892cb10e96f186fcba10fdad5ebd2bd169d01d77"} pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 05:50:42 crc kubenswrapper[4733]: I1206 05:50:42.990688 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerName="machine-config-daemon" containerID="cri-o://50470b50bca695b7d51dc24f892cb10e96f186fcba10fdad5ebd2bd169d01d77" gracePeriod=600 Dec 06 05:50:43 crc kubenswrapper[4733]: I1206 05:50:43.703869 4733 generic.go:334] "Generic (PLEG): container finished" podID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerID="50470b50bca695b7d51dc24f892cb10e96f186fcba10fdad5ebd2bd169d01d77" exitCode=0 Dec 06 05:50:43 crc kubenswrapper[4733]: I1206 05:50:43.704062 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" event={"ID":"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448","Type":"ContainerDied","Data":"50470b50bca695b7d51dc24f892cb10e96f186fcba10fdad5ebd2bd169d01d77"} Dec 06 05:50:43 crc kubenswrapper[4733]: I1206 05:50:43.704199 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" event={"ID":"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448","Type":"ContainerStarted","Data":"1947fc402b33dfad60aaf16335ae0cdb84ceaf24cd429e84ae81d03765f6da10"} Dec 06 05:50:43 crc kubenswrapper[4733]: I1206 05:50:43.704222 4733 scope.go:117] "RemoveContainer" containerID="61a23652af66be599ba9357cb31709e7b4a3f0e4767c758617e6cc5cd9b43941" Dec 06 05:52:39 crc kubenswrapper[4733]: I1206 05:52:39.999249 4733 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["cert-manager/cert-manager-cainjector-7f985d654d-wt468"] Dec 06 05:52:40 crc kubenswrapper[4733]: E1206 05:52:39.999860 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e073151-939a-4209-8cd7-39116b0165f0" containerName="registry" Dec 06 05:52:40 crc kubenswrapper[4733]: I1206 05:52:39.999875 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e073151-939a-4209-8cd7-39116b0165f0" containerName="registry" Dec 06 05:52:40 crc kubenswrapper[4733]: I1206 05:52:39.999994 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e073151-939a-4209-8cd7-39116b0165f0" containerName="registry" Dec 06 05:52:40 crc kubenswrapper[4733]: I1206 05:52:40.000440 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-wt468" Dec 06 05:52:40 crc kubenswrapper[4733]: I1206 05:52:40.002624 4733 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-qn46p" Dec 06 05:52:40 crc kubenswrapper[4733]: I1206 05:52:40.002761 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Dec 06 05:52:40 crc kubenswrapper[4733]: I1206 05:52:40.002818 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Dec 06 05:52:40 crc kubenswrapper[4733]: I1206 05:52:40.007480 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-fr5kh"] Dec 06 05:52:40 crc kubenswrapper[4733]: I1206 05:52:40.008348 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-fr5kh" Dec 06 05:52:40 crc kubenswrapper[4733]: I1206 05:52:40.010385 4733 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-svflg" Dec 06 05:52:40 crc kubenswrapper[4733]: I1206 05:52:40.010796 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-ptmkp"] Dec 06 05:52:40 crc kubenswrapper[4733]: I1206 05:52:40.011360 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-ptmkp" Dec 06 05:52:40 crc kubenswrapper[4733]: I1206 05:52:40.013647 4733 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-c6g4n" Dec 06 05:52:40 crc kubenswrapper[4733]: I1206 05:52:40.024483 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94w8c\" (UniqueName: \"kubernetes.io/projected/69538b45-07e5-4c3d-a653-b10e62688290-kube-api-access-94w8c\") pod \"cert-manager-cainjector-7f985d654d-wt468\" (UID: \"69538b45-07e5-4c3d-a653-b10e62688290\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-wt468" Dec 06 05:52:40 crc kubenswrapper[4733]: I1206 05:52:40.024562 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z95lg\" (UniqueName: \"kubernetes.io/projected/0e8869e6-7869-47c0-a412-4a4cfa676164-kube-api-access-z95lg\") pod \"cert-manager-5b446d88c5-fr5kh\" (UID: \"0e8869e6-7869-47c0-a412-4a4cfa676164\") " pod="cert-manager/cert-manager-5b446d88c5-fr5kh" Dec 06 05:52:40 crc kubenswrapper[4733]: I1206 05:52:40.024596 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrn6s\" (UniqueName: \"kubernetes.io/projected/5d9f07d4-edb9-4fba-8043-dd05fe08afbb-kube-api-access-rrn6s\") pod 
\"cert-manager-webhook-5655c58dd6-ptmkp\" (UID: \"5d9f07d4-edb9-4fba-8043-dd05fe08afbb\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-ptmkp" Dec 06 05:52:40 crc kubenswrapper[4733]: I1206 05:52:40.024899 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-ptmkp"] Dec 06 05:52:40 crc kubenswrapper[4733]: I1206 05:52:40.027244 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-fr5kh"] Dec 06 05:52:40 crc kubenswrapper[4733]: I1206 05:52:40.047533 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-wt468"] Dec 06 05:52:40 crc kubenswrapper[4733]: I1206 05:52:40.125829 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94w8c\" (UniqueName: \"kubernetes.io/projected/69538b45-07e5-4c3d-a653-b10e62688290-kube-api-access-94w8c\") pod \"cert-manager-cainjector-7f985d654d-wt468\" (UID: \"69538b45-07e5-4c3d-a653-b10e62688290\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-wt468" Dec 06 05:52:40 crc kubenswrapper[4733]: I1206 05:52:40.126001 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z95lg\" (UniqueName: \"kubernetes.io/projected/0e8869e6-7869-47c0-a412-4a4cfa676164-kube-api-access-z95lg\") pod \"cert-manager-5b446d88c5-fr5kh\" (UID: \"0e8869e6-7869-47c0-a412-4a4cfa676164\") " pod="cert-manager/cert-manager-5b446d88c5-fr5kh" Dec 06 05:52:40 crc kubenswrapper[4733]: I1206 05:52:40.126101 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrn6s\" (UniqueName: \"kubernetes.io/projected/5d9f07d4-edb9-4fba-8043-dd05fe08afbb-kube-api-access-rrn6s\") pod \"cert-manager-webhook-5655c58dd6-ptmkp\" (UID: \"5d9f07d4-edb9-4fba-8043-dd05fe08afbb\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-ptmkp" Dec 06 05:52:40 crc kubenswrapper[4733]: I1206 05:52:40.141904 
4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z95lg\" (UniqueName: \"kubernetes.io/projected/0e8869e6-7869-47c0-a412-4a4cfa676164-kube-api-access-z95lg\") pod \"cert-manager-5b446d88c5-fr5kh\" (UID: \"0e8869e6-7869-47c0-a412-4a4cfa676164\") " pod="cert-manager/cert-manager-5b446d88c5-fr5kh" Dec 06 05:52:40 crc kubenswrapper[4733]: I1206 05:52:40.142015 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94w8c\" (UniqueName: \"kubernetes.io/projected/69538b45-07e5-4c3d-a653-b10e62688290-kube-api-access-94w8c\") pod \"cert-manager-cainjector-7f985d654d-wt468\" (UID: \"69538b45-07e5-4c3d-a653-b10e62688290\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-wt468" Dec 06 05:52:40 crc kubenswrapper[4733]: I1206 05:52:40.142110 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrn6s\" (UniqueName: \"kubernetes.io/projected/5d9f07d4-edb9-4fba-8043-dd05fe08afbb-kube-api-access-rrn6s\") pod \"cert-manager-webhook-5655c58dd6-ptmkp\" (UID: \"5d9f07d4-edb9-4fba-8043-dd05fe08afbb\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-ptmkp" Dec 06 05:52:40 crc kubenswrapper[4733]: I1206 05:52:40.312693 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-wt468" Dec 06 05:52:40 crc kubenswrapper[4733]: I1206 05:52:40.321527 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-fr5kh" Dec 06 05:52:40 crc kubenswrapper[4733]: I1206 05:52:40.327044 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-ptmkp" Dec 06 05:52:40 crc kubenswrapper[4733]: I1206 05:52:40.497649 4733 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 06 05:52:40 crc kubenswrapper[4733]: I1206 05:52:40.503228 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-fr5kh"] Dec 06 05:52:40 crc kubenswrapper[4733]: I1206 05:52:40.717736 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-wt468"] Dec 06 05:52:40 crc kubenswrapper[4733]: W1206 05:52:40.721416 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69538b45_07e5_4c3d_a653_b10e62688290.slice/crio-1419973f27c3896d4f20b2256d3233b4c59c6a348296278d08ba4acd7a0df24b WatchSource:0}: Error finding container 1419973f27c3896d4f20b2256d3233b4c59c6a348296278d08ba4acd7a0df24b: Status 404 returned error can't find the container with id 1419973f27c3896d4f20b2256d3233b4c59c6a348296278d08ba4acd7a0df24b Dec 06 05:52:40 crc kubenswrapper[4733]: I1206 05:52:40.725272 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-ptmkp"] Dec 06 05:52:40 crc kubenswrapper[4733]: W1206 05:52:40.727502 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d9f07d4_edb9_4fba_8043_dd05fe08afbb.slice/crio-59d41264ac5744b97f37b92666aa859b260d96a8def85ef477f4f2a975e40b07 WatchSource:0}: Error finding container 59d41264ac5744b97f37b92666aa859b260d96a8def85ef477f4f2a975e40b07: Status 404 returned error can't find the container with id 59d41264ac5744b97f37b92666aa859b260d96a8def85ef477f4f2a975e40b07 Dec 06 05:52:41 crc kubenswrapper[4733]: I1206 05:52:41.257191 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="cert-manager/cert-manager-cainjector-7f985d654d-wt468" event={"ID":"69538b45-07e5-4c3d-a653-b10e62688290","Type":"ContainerStarted","Data":"1419973f27c3896d4f20b2256d3233b4c59c6a348296278d08ba4acd7a0df24b"} Dec 06 05:52:41 crc kubenswrapper[4733]: I1206 05:52:41.258086 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-ptmkp" event={"ID":"5d9f07d4-edb9-4fba-8043-dd05fe08afbb","Type":"ContainerStarted","Data":"59d41264ac5744b97f37b92666aa859b260d96a8def85ef477f4f2a975e40b07"} Dec 06 05:52:41 crc kubenswrapper[4733]: I1206 05:52:41.259221 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-fr5kh" event={"ID":"0e8869e6-7869-47c0-a412-4a4cfa676164","Type":"ContainerStarted","Data":"4b9392d6cf33937782140af300bfaa6237fdc1d541078bee7b07a3bb8705eb31"} Dec 06 05:52:43 crc kubenswrapper[4733]: I1206 05:52:43.274835 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-wt468" event={"ID":"69538b45-07e5-4c3d-a653-b10e62688290","Type":"ContainerStarted","Data":"6c56dd7a7ab701f5f3aa34dad5edda61fb0bdfd6b0a8f1d3e669bb57bf131fbe"} Dec 06 05:52:43 crc kubenswrapper[4733]: I1206 05:52:43.277986 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-fr5kh" event={"ID":"0e8869e6-7869-47c0-a412-4a4cfa676164","Type":"ContainerStarted","Data":"b529e068173f64966be647ea0d0669fb85f2ced9ea597fe90f6ecae8e557e409"} Dec 06 05:52:43 crc kubenswrapper[4733]: I1206 05:52:43.289175 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-wt468" podStartSLOduration=2.073229585 podStartE2EDuration="4.289165222s" podCreationTimestamp="2025-12-06 05:52:39 +0000 UTC" firstStartedPulling="2025-12-06 05:52:40.723155817 +0000 UTC m=+544.588366927" lastFinishedPulling="2025-12-06 05:52:42.939091453 +0000 UTC m=+546.804302564" 
observedRunningTime="2025-12-06 05:52:43.287967348 +0000 UTC m=+547.153178459" watchObservedRunningTime="2025-12-06 05:52:43.289165222 +0000 UTC m=+547.154376333" Dec 06 05:52:43 crc kubenswrapper[4733]: I1206 05:52:43.305294 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-fr5kh" podStartSLOduration=2.337432918 podStartE2EDuration="4.305272737s" podCreationTimestamp="2025-12-06 05:52:39 +0000 UTC" firstStartedPulling="2025-12-06 05:52:40.497408922 +0000 UTC m=+544.362620033" lastFinishedPulling="2025-12-06 05:52:42.465248741 +0000 UTC m=+546.330459852" observedRunningTime="2025-12-06 05:52:43.302937875 +0000 UTC m=+547.168148987" watchObservedRunningTime="2025-12-06 05:52:43.305272737 +0000 UTC m=+547.170483847" Dec 06 05:52:44 crc kubenswrapper[4733]: I1206 05:52:44.286396 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-ptmkp" event={"ID":"5d9f07d4-edb9-4fba-8043-dd05fe08afbb","Type":"ContainerStarted","Data":"3a63c730757b4a0163d6ffd687c197234ed34ec8568c05935c26e6655d13edbe"} Dec 06 05:52:44 crc kubenswrapper[4733]: I1206 05:52:44.302515 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-ptmkp" podStartSLOduration=2.338544777 podStartE2EDuration="5.30248446s" podCreationTimestamp="2025-12-06 05:52:39 +0000 UTC" firstStartedPulling="2025-12-06 05:52:40.729345527 +0000 UTC m=+544.594556638" lastFinishedPulling="2025-12-06 05:52:43.69328521 +0000 UTC m=+547.558496321" observedRunningTime="2025-12-06 05:52:44.298029282 +0000 UTC m=+548.163240393" watchObservedRunningTime="2025-12-06 05:52:44.30248446 +0000 UTC m=+548.167695571" Dec 06 05:52:45 crc kubenswrapper[4733]: I1206 05:52:45.291383 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-ptmkp" Dec 06 05:52:50 crc kubenswrapper[4733]: I1206 05:52:50.330235 
4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-ptmkp" Dec 06 05:52:51 crc kubenswrapper[4733]: I1206 05:52:51.732763 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-2gb79"] Dec 06 05:52:51 crc kubenswrapper[4733]: I1206 05:52:51.733351 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" podUID="171aa174-9338-4421-8393-9e23fbab7f1e" containerName="ovn-controller" containerID="cri-o://88a99335c4d7fca93428173f7e0e096e418e0599ab030dfda10d8da0a5dc17a5" gracePeriod=30 Dec 06 05:52:51 crc kubenswrapper[4733]: I1206 05:52:51.733460 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" podUID="171aa174-9338-4421-8393-9e23fbab7f1e" containerName="northd" containerID="cri-o://77216800c2b9bc04724591a5d5c5d4c9ddb9a75fcbc198c60800199a92db6f45" gracePeriod=30 Dec 06 05:52:51 crc kubenswrapper[4733]: I1206 05:52:51.733432 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" podUID="171aa174-9338-4421-8393-9e23fbab7f1e" containerName="nbdb" containerID="cri-o://456b5bd863b30c044246c6c8fe15ee7344ad053861724b5c42b88479578b9adb" gracePeriod=30 Dec 06 05:52:51 crc kubenswrapper[4733]: I1206 05:52:51.733571 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" podUID="171aa174-9338-4421-8393-9e23fbab7f1e" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://532faf6ec4021a35746a236a1ded78eccc9d71728c149f73c4263068b6951490" gracePeriod=30 Dec 06 05:52:51 crc kubenswrapper[4733]: I1206 05:52:51.733615 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" podUID="171aa174-9338-4421-8393-9e23fbab7f1e" 
containerName="kube-rbac-proxy-node" containerID="cri-o://a697c5a28f2c415b6f133c1c3bdaff0915418e3fcf0c889af0a822e1bdcbcc88" gracePeriod=30 Dec 06 05:52:51 crc kubenswrapper[4733]: I1206 05:52:51.733733 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" podUID="171aa174-9338-4421-8393-9e23fbab7f1e" containerName="ovn-acl-logging" containerID="cri-o://d985f342be7dff38ee8a2264a8dae534857e6cb0e7d0cf79b137d2ed6289bf80" gracePeriod=30 Dec 06 05:52:51 crc kubenswrapper[4733]: I1206 05:52:51.738195 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" podUID="171aa174-9338-4421-8393-9e23fbab7f1e" containerName="sbdb" containerID="cri-o://9980ec9b2b1a751a691d1f657a2176d49a7583906d741adbe3754ec4c73b152c" gracePeriod=30 Dec 06 05:52:51 crc kubenswrapper[4733]: I1206 05:52:51.757468 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" podUID="171aa174-9338-4421-8393-9e23fbab7f1e" containerName="ovnkube-controller" containerID="cri-o://d78589d529f70a3ceb873f223ab2f95170d1ec069647c1cbdd258ed48da5d87d" gracePeriod=30 Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.020298 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2gb79_171aa174-9338-4421-8393-9e23fbab7f1e/ovnkube-controller/3.log" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.022592 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2gb79_171aa174-9338-4421-8393-9e23fbab7f1e/ovn-acl-logging/0.log" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.023086 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2gb79_171aa174-9338-4421-8393-9e23fbab7f1e/ovn-controller/0.log" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.023526 4733 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.081030 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bkmks"] Dec 06 05:52:52 crc kubenswrapper[4733]: E1206 05:52:52.081457 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="171aa174-9338-4421-8393-9e23fbab7f1e" containerName="nbdb" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.081531 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="171aa174-9338-4421-8393-9e23fbab7f1e" containerName="nbdb" Dec 06 05:52:52 crc kubenswrapper[4733]: E1206 05:52:52.081591 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="171aa174-9338-4421-8393-9e23fbab7f1e" containerName="kube-rbac-proxy-ovn-metrics" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.081639 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="171aa174-9338-4421-8393-9e23fbab7f1e" containerName="kube-rbac-proxy-ovn-metrics" Dec 06 05:52:52 crc kubenswrapper[4733]: E1206 05:52:52.081689 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="171aa174-9338-4421-8393-9e23fbab7f1e" containerName="sbdb" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.081736 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="171aa174-9338-4421-8393-9e23fbab7f1e" containerName="sbdb" Dec 06 05:52:52 crc kubenswrapper[4733]: E1206 05:52:52.081783 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="171aa174-9338-4421-8393-9e23fbab7f1e" containerName="ovnkube-controller" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.081830 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="171aa174-9338-4421-8393-9e23fbab7f1e" containerName="ovnkube-controller" Dec 06 05:52:52 crc kubenswrapper[4733]: E1206 05:52:52.081881 4733 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="171aa174-9338-4421-8393-9e23fbab7f1e" containerName="ovn-acl-logging" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.081927 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="171aa174-9338-4421-8393-9e23fbab7f1e" containerName="ovn-acl-logging" Dec 06 05:52:52 crc kubenswrapper[4733]: E1206 05:52:52.081974 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="171aa174-9338-4421-8393-9e23fbab7f1e" containerName="ovnkube-controller" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.082022 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="171aa174-9338-4421-8393-9e23fbab7f1e" containerName="ovnkube-controller" Dec 06 05:52:52 crc kubenswrapper[4733]: E1206 05:52:52.082067 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="171aa174-9338-4421-8393-9e23fbab7f1e" containerName="ovnkube-controller" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.082111 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="171aa174-9338-4421-8393-9e23fbab7f1e" containerName="ovnkube-controller" Dec 06 05:52:52 crc kubenswrapper[4733]: E1206 05:52:52.082161 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="171aa174-9338-4421-8393-9e23fbab7f1e" containerName="kubecfg-setup" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.082202 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="171aa174-9338-4421-8393-9e23fbab7f1e" containerName="kubecfg-setup" Dec 06 05:52:52 crc kubenswrapper[4733]: E1206 05:52:52.082248 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="171aa174-9338-4421-8393-9e23fbab7f1e" containerName="ovn-controller" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.082292 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="171aa174-9338-4421-8393-9e23fbab7f1e" containerName="ovn-controller" Dec 06 05:52:52 crc kubenswrapper[4733]: E1206 05:52:52.082376 4733 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="171aa174-9338-4421-8393-9e23fbab7f1e" containerName="kube-rbac-proxy-node" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.082424 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="171aa174-9338-4421-8393-9e23fbab7f1e" containerName="kube-rbac-proxy-node" Dec 06 05:52:52 crc kubenswrapper[4733]: E1206 05:52:52.082472 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="171aa174-9338-4421-8393-9e23fbab7f1e" containerName="northd" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.082514 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="171aa174-9338-4421-8393-9e23fbab7f1e" containerName="northd" Dec 06 05:52:52 crc kubenswrapper[4733]: E1206 05:52:52.082563 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="171aa174-9338-4421-8393-9e23fbab7f1e" containerName="ovnkube-controller" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.082609 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="171aa174-9338-4421-8393-9e23fbab7f1e" containerName="ovnkube-controller" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.082752 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="171aa174-9338-4421-8393-9e23fbab7f1e" containerName="nbdb" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.082801 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="171aa174-9338-4421-8393-9e23fbab7f1e" containerName="kube-rbac-proxy-node" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.082850 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="171aa174-9338-4421-8393-9e23fbab7f1e" containerName="sbdb" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.082900 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="171aa174-9338-4421-8393-9e23fbab7f1e" containerName="ovn-acl-logging" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.082946 4733 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="171aa174-9338-4421-8393-9e23fbab7f1e" containerName="northd" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.082989 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="171aa174-9338-4421-8393-9e23fbab7f1e" containerName="ovnkube-controller" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.083034 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="171aa174-9338-4421-8393-9e23fbab7f1e" containerName="kube-rbac-proxy-ovn-metrics" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.083080 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="171aa174-9338-4421-8393-9e23fbab7f1e" containerName="ovnkube-controller" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.083132 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="171aa174-9338-4421-8393-9e23fbab7f1e" containerName="ovn-controller" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.083243 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="171aa174-9338-4421-8393-9e23fbab7f1e" containerName="ovnkube-controller" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.083294 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="171aa174-9338-4421-8393-9e23fbab7f1e" containerName="ovnkube-controller" Dec 06 05:52:52 crc kubenswrapper[4733]: E1206 05:52:52.083465 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="171aa174-9338-4421-8393-9e23fbab7f1e" containerName="ovnkube-controller" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.083520 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="171aa174-9338-4421-8393-9e23fbab7f1e" containerName="ovnkube-controller" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.083651 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="171aa174-9338-4421-8393-9e23fbab7f1e" containerName="ovnkube-controller" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.085106 4733 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.157434 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-run-openvswitch\") pod \"171aa174-9338-4421-8393-9e23fbab7f1e\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.157483 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9c5fg\" (UniqueName: \"kubernetes.io/projected/171aa174-9338-4421-8393-9e23fbab7f1e-kube-api-access-9c5fg\") pod \"171aa174-9338-4421-8393-9e23fbab7f1e\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.157509 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-systemd-units\") pod \"171aa174-9338-4421-8393-9e23fbab7f1e\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.157525 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-host-kubelet\") pod \"171aa174-9338-4421-8393-9e23fbab7f1e\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.157562 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-host-run-netns\") pod \"171aa174-9338-4421-8393-9e23fbab7f1e\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.157595 4733 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-host-cni-netd\") pod \"171aa174-9338-4421-8393-9e23fbab7f1e\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.157587 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "171aa174-9338-4421-8393-9e23fbab7f1e" (UID: "171aa174-9338-4421-8393-9e23fbab7f1e"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.157610 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-log-socket\") pod \"171aa174-9338-4421-8393-9e23fbab7f1e\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.157647 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-log-socket" (OuterVolumeSpecName: "log-socket") pod "171aa174-9338-4421-8393-9e23fbab7f1e" (UID: "171aa174-9338-4421-8393-9e23fbab7f1e"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.157689 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/171aa174-9338-4421-8393-9e23fbab7f1e-ovnkube-config\") pod \"171aa174-9338-4421-8393-9e23fbab7f1e\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.157717 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-host-run-ovn-kubernetes\") pod \"171aa174-9338-4421-8393-9e23fbab7f1e\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.157743 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-host-cni-bin\") pod \"171aa174-9338-4421-8393-9e23fbab7f1e\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.157765 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-node-log\") pod \"171aa174-9338-4421-8393-9e23fbab7f1e\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.157803 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-host-slash\") pod \"171aa174-9338-4421-8393-9e23fbab7f1e\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.157850 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-var-lib-openvswitch\") pod \"171aa174-9338-4421-8393-9e23fbab7f1e\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.157873 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-run-systemd\") pod \"171aa174-9338-4421-8393-9e23fbab7f1e\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.157915 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/171aa174-9338-4421-8393-9e23fbab7f1e-ovnkube-script-lib\") pod \"171aa174-9338-4421-8393-9e23fbab7f1e\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.157953 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-etc-openvswitch\") pod \"171aa174-9338-4421-8393-9e23fbab7f1e\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.157981 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"171aa174-9338-4421-8393-9e23fbab7f1e\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.158001 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-run-ovn\") pod \"171aa174-9338-4421-8393-9e23fbab7f1e\" (UID: 
\"171aa174-9338-4421-8393-9e23fbab7f1e\") " Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.158020 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/171aa174-9338-4421-8393-9e23fbab7f1e-ovn-node-metrics-cert\") pod \"171aa174-9338-4421-8393-9e23fbab7f1e\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.158040 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/171aa174-9338-4421-8393-9e23fbab7f1e-env-overrides\") pod \"171aa174-9338-4421-8393-9e23fbab7f1e\" (UID: \"171aa174-9338-4421-8393-9e23fbab7f1e\") " Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.158082 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "171aa174-9338-4421-8393-9e23fbab7f1e" (UID: "171aa174-9338-4421-8393-9e23fbab7f1e"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.158111 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "171aa174-9338-4421-8393-9e23fbab7f1e" (UID: "171aa174-9338-4421-8393-9e23fbab7f1e"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.158113 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "171aa174-9338-4421-8393-9e23fbab7f1e" (UID: "171aa174-9338-4421-8393-9e23fbab7f1e"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.158188 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "171aa174-9338-4421-8393-9e23fbab7f1e" (UID: "171aa174-9338-4421-8393-9e23fbab7f1e"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.158569 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "171aa174-9338-4421-8393-9e23fbab7f1e" (UID: "171aa174-9338-4421-8393-9e23fbab7f1e"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.158614 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "171aa174-9338-4421-8393-9e23fbab7f1e" (UID: "171aa174-9338-4421-8393-9e23fbab7f1e"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.158627 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "171aa174-9338-4421-8393-9e23fbab7f1e" (UID: "171aa174-9338-4421-8393-9e23fbab7f1e"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.158632 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-node-log" (OuterVolumeSpecName: "node-log") pod "171aa174-9338-4421-8393-9e23fbab7f1e" (UID: "171aa174-9338-4421-8393-9e23fbab7f1e"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.158677 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "171aa174-9338-4421-8393-9e23fbab7f1e" (UID: "171aa174-9338-4421-8393-9e23fbab7f1e"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.158650 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-host-slash" (OuterVolumeSpecName: "host-slash") pod "171aa174-9338-4421-8393-9e23fbab7f1e" (UID: "171aa174-9338-4421-8393-9e23fbab7f1e"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.158717 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "171aa174-9338-4421-8393-9e23fbab7f1e" (UID: "171aa174-9338-4421-8393-9e23fbab7f1e"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.158739 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "171aa174-9338-4421-8393-9e23fbab7f1e" (UID: "171aa174-9338-4421-8393-9e23fbab7f1e"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.158871 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/171aa174-9338-4421-8393-9e23fbab7f1e-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "171aa174-9338-4421-8393-9e23fbab7f1e" (UID: "171aa174-9338-4421-8393-9e23fbab7f1e"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.158986 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/171aa174-9338-4421-8393-9e23fbab7f1e-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "171aa174-9338-4421-8393-9e23fbab7f1e" (UID: "171aa174-9338-4421-8393-9e23fbab7f1e"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.158993 4733 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.159043 4733 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.159062 4733 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.159078 4733 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/171aa174-9338-4421-8393-9e23fbab7f1e-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.159092 4733 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.159102 4733 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-systemd-units\") on node \"crc\" DevicePath \"\"" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.159114 4733 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-host-kubelet\") on node \"crc\" DevicePath \"\"" Dec 06 05:52:52 crc 
kubenswrapper[4733]: I1206 05:52:52.159123 4733 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-host-run-netns\") on node \"crc\" DevicePath \"\"" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.159132 4733 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-host-cni-netd\") on node \"crc\" DevicePath \"\"" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.159142 4733 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-log-socket\") on node \"crc\" DevicePath \"\"" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.159155 4733 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.159167 4733 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-host-cni-bin\") on node \"crc\" DevicePath \"\"" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.159177 4733 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-node-log\") on node \"crc\" DevicePath \"\"" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.159189 4733 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-host-slash\") on node \"crc\" DevicePath \"\"" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.159200 4733 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.159279 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/171aa174-9338-4421-8393-9e23fbab7f1e-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "171aa174-9338-4421-8393-9e23fbab7f1e" (UID: "171aa174-9338-4421-8393-9e23fbab7f1e"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.166148 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/171aa174-9338-4421-8393-9e23fbab7f1e-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "171aa174-9338-4421-8393-9e23fbab7f1e" (UID: "171aa174-9338-4421-8393-9e23fbab7f1e"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.166924 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/171aa174-9338-4421-8393-9e23fbab7f1e-kube-api-access-9c5fg" (OuterVolumeSpecName: "kube-api-access-9c5fg") pod "171aa174-9338-4421-8393-9e23fbab7f1e" (UID: "171aa174-9338-4421-8393-9e23fbab7f1e"). InnerVolumeSpecName "kube-api-access-9c5fg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.173801 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "171aa174-9338-4421-8393-9e23fbab7f1e" (UID: "171aa174-9338-4421-8393-9e23fbab7f1e"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.260535 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/65ccebbf-20be-4b59-92ce-18a50f1d497c-host-kubelet\") pod \"ovnkube-node-bkmks\" (UID: \"65ccebbf-20be-4b59-92ce-18a50f1d497c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.260586 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/65ccebbf-20be-4b59-92ce-18a50f1d497c-host-cni-bin\") pod \"ovnkube-node-bkmks\" (UID: \"65ccebbf-20be-4b59-92ce-18a50f1d497c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.260613 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/65ccebbf-20be-4b59-92ce-18a50f1d497c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bkmks\" (UID: \"65ccebbf-20be-4b59-92ce-18a50f1d497c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.260637 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/65ccebbf-20be-4b59-92ce-18a50f1d497c-ovn-node-metrics-cert\") pod \"ovnkube-node-bkmks\" (UID: \"65ccebbf-20be-4b59-92ce-18a50f1d497c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.260655 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/65ccebbf-20be-4b59-92ce-18a50f1d497c-host-run-ovn-kubernetes\") pod \"ovnkube-node-bkmks\" (UID: \"65ccebbf-20be-4b59-92ce-18a50f1d497c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.260678 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/65ccebbf-20be-4b59-92ce-18a50f1d497c-host-cni-netd\") pod \"ovnkube-node-bkmks\" (UID: \"65ccebbf-20be-4b59-92ce-18a50f1d497c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.260702 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/65ccebbf-20be-4b59-92ce-18a50f1d497c-host-slash\") pod \"ovnkube-node-bkmks\" (UID: \"65ccebbf-20be-4b59-92ce-18a50f1d497c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.260721 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/65ccebbf-20be-4b59-92ce-18a50f1d497c-host-run-netns\") pod \"ovnkube-node-bkmks\" (UID: \"65ccebbf-20be-4b59-92ce-18a50f1d497c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.260776 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/65ccebbf-20be-4b59-92ce-18a50f1d497c-run-ovn\") pod \"ovnkube-node-bkmks\" (UID: \"65ccebbf-20be-4b59-92ce-18a50f1d497c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.260797 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ms259\" 
(UniqueName: \"kubernetes.io/projected/65ccebbf-20be-4b59-92ce-18a50f1d497c-kube-api-access-ms259\") pod \"ovnkube-node-bkmks\" (UID: \"65ccebbf-20be-4b59-92ce-18a50f1d497c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.260816 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/65ccebbf-20be-4b59-92ce-18a50f1d497c-run-openvswitch\") pod \"ovnkube-node-bkmks\" (UID: \"65ccebbf-20be-4b59-92ce-18a50f1d497c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.260834 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/65ccebbf-20be-4b59-92ce-18a50f1d497c-etc-openvswitch\") pod \"ovnkube-node-bkmks\" (UID: \"65ccebbf-20be-4b59-92ce-18a50f1d497c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.260854 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/65ccebbf-20be-4b59-92ce-18a50f1d497c-log-socket\") pod \"ovnkube-node-bkmks\" (UID: \"65ccebbf-20be-4b59-92ce-18a50f1d497c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.260902 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/65ccebbf-20be-4b59-92ce-18a50f1d497c-ovnkube-config\") pod \"ovnkube-node-bkmks\" (UID: \"65ccebbf-20be-4b59-92ce-18a50f1d497c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.260923 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/65ccebbf-20be-4b59-92ce-18a50f1d497c-ovnkube-script-lib\") pod \"ovnkube-node-bkmks\" (UID: \"65ccebbf-20be-4b59-92ce-18a50f1d497c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.260946 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/65ccebbf-20be-4b59-92ce-18a50f1d497c-systemd-units\") pod \"ovnkube-node-bkmks\" (UID: \"65ccebbf-20be-4b59-92ce-18a50f1d497c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.260990 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/65ccebbf-20be-4b59-92ce-18a50f1d497c-node-log\") pod \"ovnkube-node-bkmks\" (UID: \"65ccebbf-20be-4b59-92ce-18a50f1d497c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.261028 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/65ccebbf-20be-4b59-92ce-18a50f1d497c-env-overrides\") pod \"ovnkube-node-bkmks\" (UID: \"65ccebbf-20be-4b59-92ce-18a50f1d497c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.261045 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/65ccebbf-20be-4b59-92ce-18a50f1d497c-run-systemd\") pod \"ovnkube-node-bkmks\" (UID: \"65ccebbf-20be-4b59-92ce-18a50f1d497c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.261062 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/65ccebbf-20be-4b59-92ce-18a50f1d497c-var-lib-openvswitch\") pod \"ovnkube-node-bkmks\" (UID: \"65ccebbf-20be-4b59-92ce-18a50f1d497c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.261112 4733 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/171aa174-9338-4421-8393-9e23fbab7f1e-run-systemd\") on node \"crc\" DevicePath \"\"" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.261141 4733 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/171aa174-9338-4421-8393-9e23fbab7f1e-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.261152 4733 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/171aa174-9338-4421-8393-9e23fbab7f1e-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.261161 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9c5fg\" (UniqueName: \"kubernetes.io/projected/171aa174-9338-4421-8393-9e23fbab7f1e-kube-api-access-9c5fg\") on node \"crc\" DevicePath \"\"" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.261170 4733 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/171aa174-9338-4421-8393-9e23fbab7f1e-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.331105 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2gb79_171aa174-9338-4421-8393-9e23fbab7f1e/ovnkube-controller/3.log" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.333438 4733 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2gb79_171aa174-9338-4421-8393-9e23fbab7f1e/ovn-acl-logging/0.log" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.333857 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2gb79_171aa174-9338-4421-8393-9e23fbab7f1e/ovn-controller/0.log" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.334173 4733 generic.go:334] "Generic (PLEG): container finished" podID="171aa174-9338-4421-8393-9e23fbab7f1e" containerID="d78589d529f70a3ceb873f223ab2f95170d1ec069647c1cbdd258ed48da5d87d" exitCode=0 Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.334209 4733 generic.go:334] "Generic (PLEG): container finished" podID="171aa174-9338-4421-8393-9e23fbab7f1e" containerID="9980ec9b2b1a751a691d1f657a2176d49a7583906d741adbe3754ec4c73b152c" exitCode=0 Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.334216 4733 generic.go:334] "Generic (PLEG): container finished" podID="171aa174-9338-4421-8393-9e23fbab7f1e" containerID="456b5bd863b30c044246c6c8fe15ee7344ad053861724b5c42b88479578b9adb" exitCode=0 Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.334224 4733 generic.go:334] "Generic (PLEG): container finished" podID="171aa174-9338-4421-8393-9e23fbab7f1e" containerID="77216800c2b9bc04724591a5d5c5d4c9ddb9a75fcbc198c60800199a92db6f45" exitCode=0 Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.334232 4733 generic.go:334] "Generic (PLEG): container finished" podID="171aa174-9338-4421-8393-9e23fbab7f1e" containerID="532faf6ec4021a35746a236a1ded78eccc9d71728c149f73c4263068b6951490" exitCode=0 Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.334243 4733 generic.go:334] "Generic (PLEG): container finished" podID="171aa174-9338-4421-8393-9e23fbab7f1e" containerID="a697c5a28f2c415b6f133c1c3bdaff0915418e3fcf0c889af0a822e1bdcbcc88" exitCode=0 Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.334250 4733 generic.go:334] "Generic (PLEG): 
container finished" podID="171aa174-9338-4421-8393-9e23fbab7f1e" containerID="d985f342be7dff38ee8a2264a8dae534857e6cb0e7d0cf79b137d2ed6289bf80" exitCode=143 Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.334257 4733 generic.go:334] "Generic (PLEG): container finished" podID="171aa174-9338-4421-8393-9e23fbab7f1e" containerID="88a99335c4d7fca93428173f7e0e096e418e0599ab030dfda10d8da0a5dc17a5" exitCode=143 Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.334272 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.334270 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" event={"ID":"171aa174-9338-4421-8393-9e23fbab7f1e","Type":"ContainerDied","Data":"d78589d529f70a3ceb873f223ab2f95170d1ec069647c1cbdd258ed48da5d87d"} Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.334357 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" event={"ID":"171aa174-9338-4421-8393-9e23fbab7f1e","Type":"ContainerDied","Data":"9980ec9b2b1a751a691d1f657a2176d49a7583906d741adbe3754ec4c73b152c"} Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.334371 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" event={"ID":"171aa174-9338-4421-8393-9e23fbab7f1e","Type":"ContainerDied","Data":"456b5bd863b30c044246c6c8fe15ee7344ad053861724b5c42b88479578b9adb"} Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.334383 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" event={"ID":"171aa174-9338-4421-8393-9e23fbab7f1e","Type":"ContainerDied","Data":"77216800c2b9bc04724591a5d5c5d4c9ddb9a75fcbc198c60800199a92db6f45"} Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.334393 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" event={"ID":"171aa174-9338-4421-8393-9e23fbab7f1e","Type":"ContainerDied","Data":"532faf6ec4021a35746a236a1ded78eccc9d71728c149f73c4263068b6951490"} Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.334402 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" event={"ID":"171aa174-9338-4421-8393-9e23fbab7f1e","Type":"ContainerDied","Data":"a697c5a28f2c415b6f133c1c3bdaff0915418e3fcf0c889af0a822e1bdcbcc88"} Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.334414 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5d6353ff5837029f85cdae65e1200483030eeb8cb05c63bd255a459d79a91ef0"} Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.334429 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9980ec9b2b1a751a691d1f657a2176d49a7583906d741adbe3754ec4c73b152c"} Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.334436 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"456b5bd863b30c044246c6c8fe15ee7344ad053861724b5c42b88479578b9adb"} Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.334442 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"77216800c2b9bc04724591a5d5c5d4c9ddb9a75fcbc198c60800199a92db6f45"} Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.334458 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"532faf6ec4021a35746a236a1ded78eccc9d71728c149f73c4263068b6951490"} Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.334462 4733 scope.go:117] "RemoveContainer" containerID="d78589d529f70a3ceb873f223ab2f95170d1ec069647c1cbdd258ed48da5d87d" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 
05:52:52.334466 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a697c5a28f2c415b6f133c1c3bdaff0915418e3fcf0c889af0a822e1bdcbcc88"} Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.334577 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d985f342be7dff38ee8a2264a8dae534857e6cb0e7d0cf79b137d2ed6289bf80"} Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.334589 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"88a99335c4d7fca93428173f7e0e096e418e0599ab030dfda10d8da0a5dc17a5"} Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.334597 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236"} Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.334606 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" event={"ID":"171aa174-9338-4421-8393-9e23fbab7f1e","Type":"ContainerDied","Data":"d985f342be7dff38ee8a2264a8dae534857e6cb0e7d0cf79b137d2ed6289bf80"} Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.334618 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d78589d529f70a3ceb873f223ab2f95170d1ec069647c1cbdd258ed48da5d87d"} Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.334625 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5d6353ff5837029f85cdae65e1200483030eeb8cb05c63bd255a459d79a91ef0"} Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.334631 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"9980ec9b2b1a751a691d1f657a2176d49a7583906d741adbe3754ec4c73b152c"} Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.334640 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"456b5bd863b30c044246c6c8fe15ee7344ad053861724b5c42b88479578b9adb"} Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.334645 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"77216800c2b9bc04724591a5d5c5d4c9ddb9a75fcbc198c60800199a92db6f45"} Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.334650 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"532faf6ec4021a35746a236a1ded78eccc9d71728c149f73c4263068b6951490"} Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.334657 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a697c5a28f2c415b6f133c1c3bdaff0915418e3fcf0c889af0a822e1bdcbcc88"} Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.334662 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d985f342be7dff38ee8a2264a8dae534857e6cb0e7d0cf79b137d2ed6289bf80"} Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.334668 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"88a99335c4d7fca93428173f7e0e096e418e0599ab030dfda10d8da0a5dc17a5"} Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.334674 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236"} Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.334681 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" event={"ID":"171aa174-9338-4421-8393-9e23fbab7f1e","Type":"ContainerDied","Data":"88a99335c4d7fca93428173f7e0e096e418e0599ab030dfda10d8da0a5dc17a5"} Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.334688 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d78589d529f70a3ceb873f223ab2f95170d1ec069647c1cbdd258ed48da5d87d"} Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.334695 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5d6353ff5837029f85cdae65e1200483030eeb8cb05c63bd255a459d79a91ef0"} Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.334702 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9980ec9b2b1a751a691d1f657a2176d49a7583906d741adbe3754ec4c73b152c"} Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.334709 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"456b5bd863b30c044246c6c8fe15ee7344ad053861724b5c42b88479578b9adb"} Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.334715 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"77216800c2b9bc04724591a5d5c5d4c9ddb9a75fcbc198c60800199a92db6f45"} Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.334724 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"532faf6ec4021a35746a236a1ded78eccc9d71728c149f73c4263068b6951490"} Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.334731 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a697c5a28f2c415b6f133c1c3bdaff0915418e3fcf0c889af0a822e1bdcbcc88"} Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 
05:52:52.334736 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d985f342be7dff38ee8a2264a8dae534857e6cb0e7d0cf79b137d2ed6289bf80"} Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.334741 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"88a99335c4d7fca93428173f7e0e096e418e0599ab030dfda10d8da0a5dc17a5"} Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.334748 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236"} Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.334754 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2gb79" event={"ID":"171aa174-9338-4421-8393-9e23fbab7f1e","Type":"ContainerDied","Data":"46e99ae8bab74c84d954e6100aa560275149b69e8155a7fcfa37a9ca61d66241"} Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.334761 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d78589d529f70a3ceb873f223ab2f95170d1ec069647c1cbdd258ed48da5d87d"} Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.334769 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5d6353ff5837029f85cdae65e1200483030eeb8cb05c63bd255a459d79a91ef0"} Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.334773 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9980ec9b2b1a751a691d1f657a2176d49a7583906d741adbe3754ec4c73b152c"} Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.334779 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"456b5bd863b30c044246c6c8fe15ee7344ad053861724b5c42b88479578b9adb"} Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.334785 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"77216800c2b9bc04724591a5d5c5d4c9ddb9a75fcbc198c60800199a92db6f45"} Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.334790 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"532faf6ec4021a35746a236a1ded78eccc9d71728c149f73c4263068b6951490"} Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.334795 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a697c5a28f2c415b6f133c1c3bdaff0915418e3fcf0c889af0a822e1bdcbcc88"} Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.334801 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d985f342be7dff38ee8a2264a8dae534857e6cb0e7d0cf79b137d2ed6289bf80"} Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.334809 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"88a99335c4d7fca93428173f7e0e096e418e0599ab030dfda10d8da0a5dc17a5"} Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.334815 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236"} Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.336083 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-684r5_cc59542d-ee4a-414d-b096-86716cb56db5/kube-multus/2.log" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.337175 4733 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-684r5_cc59542d-ee4a-414d-b096-86716cb56db5/kube-multus/1.log" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.337212 4733 generic.go:334] "Generic (PLEG): container finished" podID="cc59542d-ee4a-414d-b096-86716cb56db5" containerID="3e3a4017a1965fad5e1ee690625749a1c72a2b0c524e4286a0f34a7ec6c233f6" exitCode=2 Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.337226 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-684r5" event={"ID":"cc59542d-ee4a-414d-b096-86716cb56db5","Type":"ContainerDied","Data":"3e3a4017a1965fad5e1ee690625749a1c72a2b0c524e4286a0f34a7ec6c233f6"} Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.337248 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"238d1b3c645ca54e851f02ddb12c90bfcd039e6973993a7693cc9520d5268496"} Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.337997 4733 scope.go:117] "RemoveContainer" containerID="3e3a4017a1965fad5e1ee690625749a1c72a2b0c524e4286a0f34a7ec6c233f6" Dec 06 05:52:52 crc kubenswrapper[4733]: E1206 05:52:52.338334 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-684r5_openshift-multus(cc59542d-ee4a-414d-b096-86716cb56db5)\"" pod="openshift-multus/multus-684r5" podUID="cc59542d-ee4a-414d-b096-86716cb56db5" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.352593 4733 scope.go:117] "RemoveContainer" containerID="5d6353ff5837029f85cdae65e1200483030eeb8cb05c63bd255a459d79a91ef0" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.365081 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/65ccebbf-20be-4b59-92ce-18a50f1d497c-host-kubelet\") pod \"ovnkube-node-bkmks\" (UID: 
\"65ccebbf-20be-4b59-92ce-18a50f1d497c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.365254 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/65ccebbf-20be-4b59-92ce-18a50f1d497c-host-cni-bin\") pod \"ovnkube-node-bkmks\" (UID: \"65ccebbf-20be-4b59-92ce-18a50f1d497c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.365581 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/65ccebbf-20be-4b59-92ce-18a50f1d497c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bkmks\" (UID: \"65ccebbf-20be-4b59-92ce-18a50f1d497c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.365609 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/65ccebbf-20be-4b59-92ce-18a50f1d497c-ovn-node-metrics-cert\") pod \"ovnkube-node-bkmks\" (UID: \"65ccebbf-20be-4b59-92ce-18a50f1d497c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.365651 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/65ccebbf-20be-4b59-92ce-18a50f1d497c-host-run-ovn-kubernetes\") pod \"ovnkube-node-bkmks\" (UID: \"65ccebbf-20be-4b59-92ce-18a50f1d497c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.365697 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/65ccebbf-20be-4b59-92ce-18a50f1d497c-host-slash\") pod \"ovnkube-node-bkmks\" (UID: 
\"65ccebbf-20be-4b59-92ce-18a50f1d497c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.365733 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/65ccebbf-20be-4b59-92ce-18a50f1d497c-host-cni-netd\") pod \"ovnkube-node-bkmks\" (UID: \"65ccebbf-20be-4b59-92ce-18a50f1d497c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.365753 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/65ccebbf-20be-4b59-92ce-18a50f1d497c-host-run-netns\") pod \"ovnkube-node-bkmks\" (UID: \"65ccebbf-20be-4b59-92ce-18a50f1d497c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.365774 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/65ccebbf-20be-4b59-92ce-18a50f1d497c-run-ovn\") pod \"ovnkube-node-bkmks\" (UID: \"65ccebbf-20be-4b59-92ce-18a50f1d497c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.365799 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ms259\" (UniqueName: \"kubernetes.io/projected/65ccebbf-20be-4b59-92ce-18a50f1d497c-kube-api-access-ms259\") pod \"ovnkube-node-bkmks\" (UID: \"65ccebbf-20be-4b59-92ce-18a50f1d497c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.365822 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/65ccebbf-20be-4b59-92ce-18a50f1d497c-run-openvswitch\") pod \"ovnkube-node-bkmks\" (UID: \"65ccebbf-20be-4b59-92ce-18a50f1d497c\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.365860 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/65ccebbf-20be-4b59-92ce-18a50f1d497c-etc-openvswitch\") pod \"ovnkube-node-bkmks\" (UID: \"65ccebbf-20be-4b59-92ce-18a50f1d497c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.365885 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/65ccebbf-20be-4b59-92ce-18a50f1d497c-log-socket\") pod \"ovnkube-node-bkmks\" (UID: \"65ccebbf-20be-4b59-92ce-18a50f1d497c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.365923 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/65ccebbf-20be-4b59-92ce-18a50f1d497c-ovnkube-config\") pod \"ovnkube-node-bkmks\" (UID: \"65ccebbf-20be-4b59-92ce-18a50f1d497c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.365958 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/65ccebbf-20be-4b59-92ce-18a50f1d497c-host-run-ovn-kubernetes\") pod \"ovnkube-node-bkmks\" (UID: \"65ccebbf-20be-4b59-92ce-18a50f1d497c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.365974 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/65ccebbf-20be-4b59-92ce-18a50f1d497c-ovnkube-script-lib\") pod \"ovnkube-node-bkmks\" (UID: \"65ccebbf-20be-4b59-92ce-18a50f1d497c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" Dec 06 05:52:52 
crc kubenswrapper[4733]: I1206 05:52:52.366563 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/65ccebbf-20be-4b59-92ce-18a50f1d497c-host-run-netns\") pod \"ovnkube-node-bkmks\" (UID: \"65ccebbf-20be-4b59-92ce-18a50f1d497c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.366583 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/65ccebbf-20be-4b59-92ce-18a50f1d497c-host-kubelet\") pod \"ovnkube-node-bkmks\" (UID: \"65ccebbf-20be-4b59-92ce-18a50f1d497c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.366614 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/65ccebbf-20be-4b59-92ce-18a50f1d497c-systemd-units\") pod \"ovnkube-node-bkmks\" (UID: \"65ccebbf-20be-4b59-92ce-18a50f1d497c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.367063 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/65ccebbf-20be-4b59-92ce-18a50f1d497c-run-openvswitch\") pod \"ovnkube-node-bkmks\" (UID: \"65ccebbf-20be-4b59-92ce-18a50f1d497c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.367122 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/65ccebbf-20be-4b59-92ce-18a50f1d497c-run-ovn\") pod \"ovnkube-node-bkmks\" (UID: \"65ccebbf-20be-4b59-92ce-18a50f1d497c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.367484 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/65ccebbf-20be-4b59-92ce-18a50f1d497c-ovnkube-script-lib\") pod \"ovnkube-node-bkmks\" (UID: \"65ccebbf-20be-4b59-92ce-18a50f1d497c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.367525 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/65ccebbf-20be-4b59-92ce-18a50f1d497c-host-slash\") pod \"ovnkube-node-bkmks\" (UID: \"65ccebbf-20be-4b59-92ce-18a50f1d497c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.367548 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/65ccebbf-20be-4b59-92ce-18a50f1d497c-host-cni-bin\") pod \"ovnkube-node-bkmks\" (UID: \"65ccebbf-20be-4b59-92ce-18a50f1d497c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.367570 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/65ccebbf-20be-4b59-92ce-18a50f1d497c-host-cni-netd\") pod \"ovnkube-node-bkmks\" (UID: \"65ccebbf-20be-4b59-92ce-18a50f1d497c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.367670 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/65ccebbf-20be-4b59-92ce-18a50f1d497c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bkmks\" (UID: \"65ccebbf-20be-4b59-92ce-18a50f1d497c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.367727 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/65ccebbf-20be-4b59-92ce-18a50f1d497c-systemd-units\") pod \"ovnkube-node-bkmks\" (UID: \"65ccebbf-20be-4b59-92ce-18a50f1d497c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.367758 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/65ccebbf-20be-4b59-92ce-18a50f1d497c-log-socket\") pod \"ovnkube-node-bkmks\" (UID: \"65ccebbf-20be-4b59-92ce-18a50f1d497c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.367777 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/65ccebbf-20be-4b59-92ce-18a50f1d497c-etc-openvswitch\") pod \"ovnkube-node-bkmks\" (UID: \"65ccebbf-20be-4b59-92ce-18a50f1d497c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.367756 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/65ccebbf-20be-4b59-92ce-18a50f1d497c-node-log\") pod \"ovnkube-node-bkmks\" (UID: \"65ccebbf-20be-4b59-92ce-18a50f1d497c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.368370 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/65ccebbf-20be-4b59-92ce-18a50f1d497c-ovnkube-config\") pod \"ovnkube-node-bkmks\" (UID: \"65ccebbf-20be-4b59-92ce-18a50f1d497c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.368414 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/65ccebbf-20be-4b59-92ce-18a50f1d497c-node-log\") pod \"ovnkube-node-bkmks\" (UID: 
\"65ccebbf-20be-4b59-92ce-18a50f1d497c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.368563 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/65ccebbf-20be-4b59-92ce-18a50f1d497c-env-overrides\") pod \"ovnkube-node-bkmks\" (UID: \"65ccebbf-20be-4b59-92ce-18a50f1d497c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.368595 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/65ccebbf-20be-4b59-92ce-18a50f1d497c-run-systemd\") pod \"ovnkube-node-bkmks\" (UID: \"65ccebbf-20be-4b59-92ce-18a50f1d497c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.368627 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/65ccebbf-20be-4b59-92ce-18a50f1d497c-var-lib-openvswitch\") pod \"ovnkube-node-bkmks\" (UID: \"65ccebbf-20be-4b59-92ce-18a50f1d497c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.368782 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/65ccebbf-20be-4b59-92ce-18a50f1d497c-var-lib-openvswitch\") pod \"ovnkube-node-bkmks\" (UID: \"65ccebbf-20be-4b59-92ce-18a50f1d497c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.369124 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/65ccebbf-20be-4b59-92ce-18a50f1d497c-env-overrides\") pod \"ovnkube-node-bkmks\" (UID: \"65ccebbf-20be-4b59-92ce-18a50f1d497c\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.369162 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/65ccebbf-20be-4b59-92ce-18a50f1d497c-run-systemd\") pod \"ovnkube-node-bkmks\" (UID: \"65ccebbf-20be-4b59-92ce-18a50f1d497c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.372836 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/65ccebbf-20be-4b59-92ce-18a50f1d497c-ovn-node-metrics-cert\") pod \"ovnkube-node-bkmks\" (UID: \"65ccebbf-20be-4b59-92ce-18a50f1d497c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.376031 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-2gb79"] Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.385340 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-2gb79"] Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.388135 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ms259\" (UniqueName: \"kubernetes.io/projected/65ccebbf-20be-4b59-92ce-18a50f1d497c-kube-api-access-ms259\") pod \"ovnkube-node-bkmks\" (UID: \"65ccebbf-20be-4b59-92ce-18a50f1d497c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.388992 4733 scope.go:117] "RemoveContainer" containerID="9980ec9b2b1a751a691d1f657a2176d49a7583906d741adbe3754ec4c73b152c" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.397186 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.400844 4733 scope.go:117] "RemoveContainer" containerID="456b5bd863b30c044246c6c8fe15ee7344ad053861724b5c42b88479578b9adb" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.414448 4733 scope.go:117] "RemoveContainer" containerID="77216800c2b9bc04724591a5d5c5d4c9ddb9a75fcbc198c60800199a92db6f45" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.428660 4733 scope.go:117] "RemoveContainer" containerID="532faf6ec4021a35746a236a1ded78eccc9d71728c149f73c4263068b6951490" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.444172 4733 scope.go:117] "RemoveContainer" containerID="a697c5a28f2c415b6f133c1c3bdaff0915418e3fcf0c889af0a822e1bdcbcc88" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.454868 4733 scope.go:117] "RemoveContainer" containerID="d985f342be7dff38ee8a2264a8dae534857e6cb0e7d0cf79b137d2ed6289bf80" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.470962 4733 scope.go:117] "RemoveContainer" containerID="88a99335c4d7fca93428173f7e0e096e418e0599ab030dfda10d8da0a5dc17a5" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.485608 4733 scope.go:117] "RemoveContainer" containerID="667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.496109 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="171aa174-9338-4421-8393-9e23fbab7f1e" path="/var/lib/kubelet/pods/171aa174-9338-4421-8393-9e23fbab7f1e/volumes" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.501865 4733 scope.go:117] "RemoveContainer" containerID="d78589d529f70a3ceb873f223ab2f95170d1ec069647c1cbdd258ed48da5d87d" Dec 06 05:52:52 crc kubenswrapper[4733]: E1206 05:52:52.502562 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d78589d529f70a3ceb873f223ab2f95170d1ec069647c1cbdd258ed48da5d87d\": container with ID starting with d78589d529f70a3ceb873f223ab2f95170d1ec069647c1cbdd258ed48da5d87d not found: ID does not exist" containerID="d78589d529f70a3ceb873f223ab2f95170d1ec069647c1cbdd258ed48da5d87d" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.502654 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d78589d529f70a3ceb873f223ab2f95170d1ec069647c1cbdd258ed48da5d87d"} err="failed to get container status \"d78589d529f70a3ceb873f223ab2f95170d1ec069647c1cbdd258ed48da5d87d\": rpc error: code = NotFound desc = could not find container \"d78589d529f70a3ceb873f223ab2f95170d1ec069647c1cbdd258ed48da5d87d\": container with ID starting with d78589d529f70a3ceb873f223ab2f95170d1ec069647c1cbdd258ed48da5d87d not found: ID does not exist" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.502717 4733 scope.go:117] "RemoveContainer" containerID="5d6353ff5837029f85cdae65e1200483030eeb8cb05c63bd255a459d79a91ef0" Dec 06 05:52:52 crc kubenswrapper[4733]: E1206 05:52:52.503966 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d6353ff5837029f85cdae65e1200483030eeb8cb05c63bd255a459d79a91ef0\": container with ID starting with 5d6353ff5837029f85cdae65e1200483030eeb8cb05c63bd255a459d79a91ef0 not found: ID does not exist" containerID="5d6353ff5837029f85cdae65e1200483030eeb8cb05c63bd255a459d79a91ef0" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.504038 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d6353ff5837029f85cdae65e1200483030eeb8cb05c63bd255a459d79a91ef0"} err="failed to get container status \"5d6353ff5837029f85cdae65e1200483030eeb8cb05c63bd255a459d79a91ef0\": rpc error: code = NotFound desc = could not find container \"5d6353ff5837029f85cdae65e1200483030eeb8cb05c63bd255a459d79a91ef0\": container with ID 
starting with 5d6353ff5837029f85cdae65e1200483030eeb8cb05c63bd255a459d79a91ef0 not found: ID does not exist" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.504077 4733 scope.go:117] "RemoveContainer" containerID="9980ec9b2b1a751a691d1f657a2176d49a7583906d741adbe3754ec4c73b152c" Dec 06 05:52:52 crc kubenswrapper[4733]: E1206 05:52:52.505945 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9980ec9b2b1a751a691d1f657a2176d49a7583906d741adbe3754ec4c73b152c\": container with ID starting with 9980ec9b2b1a751a691d1f657a2176d49a7583906d741adbe3754ec4c73b152c not found: ID does not exist" containerID="9980ec9b2b1a751a691d1f657a2176d49a7583906d741adbe3754ec4c73b152c" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.506026 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9980ec9b2b1a751a691d1f657a2176d49a7583906d741adbe3754ec4c73b152c"} err="failed to get container status \"9980ec9b2b1a751a691d1f657a2176d49a7583906d741adbe3754ec4c73b152c\": rpc error: code = NotFound desc = could not find container \"9980ec9b2b1a751a691d1f657a2176d49a7583906d741adbe3754ec4c73b152c\": container with ID starting with 9980ec9b2b1a751a691d1f657a2176d49a7583906d741adbe3754ec4c73b152c not found: ID does not exist" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.506072 4733 scope.go:117] "RemoveContainer" containerID="456b5bd863b30c044246c6c8fe15ee7344ad053861724b5c42b88479578b9adb" Dec 06 05:52:52 crc kubenswrapper[4733]: E1206 05:52:52.506497 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"456b5bd863b30c044246c6c8fe15ee7344ad053861724b5c42b88479578b9adb\": container with ID starting with 456b5bd863b30c044246c6c8fe15ee7344ad053861724b5c42b88479578b9adb not found: ID does not exist" containerID="456b5bd863b30c044246c6c8fe15ee7344ad053861724b5c42b88479578b9adb" Dec 06 
05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.506527 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"456b5bd863b30c044246c6c8fe15ee7344ad053861724b5c42b88479578b9adb"} err="failed to get container status \"456b5bd863b30c044246c6c8fe15ee7344ad053861724b5c42b88479578b9adb\": rpc error: code = NotFound desc = could not find container \"456b5bd863b30c044246c6c8fe15ee7344ad053861724b5c42b88479578b9adb\": container with ID starting with 456b5bd863b30c044246c6c8fe15ee7344ad053861724b5c42b88479578b9adb not found: ID does not exist" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.506544 4733 scope.go:117] "RemoveContainer" containerID="77216800c2b9bc04724591a5d5c5d4c9ddb9a75fcbc198c60800199a92db6f45" Dec 06 05:52:52 crc kubenswrapper[4733]: E1206 05:52:52.506868 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77216800c2b9bc04724591a5d5c5d4c9ddb9a75fcbc198c60800199a92db6f45\": container with ID starting with 77216800c2b9bc04724591a5d5c5d4c9ddb9a75fcbc198c60800199a92db6f45 not found: ID does not exist" containerID="77216800c2b9bc04724591a5d5c5d4c9ddb9a75fcbc198c60800199a92db6f45" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.506914 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77216800c2b9bc04724591a5d5c5d4c9ddb9a75fcbc198c60800199a92db6f45"} err="failed to get container status \"77216800c2b9bc04724591a5d5c5d4c9ddb9a75fcbc198c60800199a92db6f45\": rpc error: code = NotFound desc = could not find container \"77216800c2b9bc04724591a5d5c5d4c9ddb9a75fcbc198c60800199a92db6f45\": container with ID starting with 77216800c2b9bc04724591a5d5c5d4c9ddb9a75fcbc198c60800199a92db6f45 not found: ID does not exist" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.506938 4733 scope.go:117] "RemoveContainer" 
containerID="532faf6ec4021a35746a236a1ded78eccc9d71728c149f73c4263068b6951490" Dec 06 05:52:52 crc kubenswrapper[4733]: E1206 05:52:52.507258 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"532faf6ec4021a35746a236a1ded78eccc9d71728c149f73c4263068b6951490\": container with ID starting with 532faf6ec4021a35746a236a1ded78eccc9d71728c149f73c4263068b6951490 not found: ID does not exist" containerID="532faf6ec4021a35746a236a1ded78eccc9d71728c149f73c4263068b6951490" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.507334 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"532faf6ec4021a35746a236a1ded78eccc9d71728c149f73c4263068b6951490"} err="failed to get container status \"532faf6ec4021a35746a236a1ded78eccc9d71728c149f73c4263068b6951490\": rpc error: code = NotFound desc = could not find container \"532faf6ec4021a35746a236a1ded78eccc9d71728c149f73c4263068b6951490\": container with ID starting with 532faf6ec4021a35746a236a1ded78eccc9d71728c149f73c4263068b6951490 not found: ID does not exist" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.507354 4733 scope.go:117] "RemoveContainer" containerID="a697c5a28f2c415b6f133c1c3bdaff0915418e3fcf0c889af0a822e1bdcbcc88" Dec 06 05:52:52 crc kubenswrapper[4733]: E1206 05:52:52.507745 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a697c5a28f2c415b6f133c1c3bdaff0915418e3fcf0c889af0a822e1bdcbcc88\": container with ID starting with a697c5a28f2c415b6f133c1c3bdaff0915418e3fcf0c889af0a822e1bdcbcc88 not found: ID does not exist" containerID="a697c5a28f2c415b6f133c1c3bdaff0915418e3fcf0c889af0a822e1bdcbcc88" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.507810 4733 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a697c5a28f2c415b6f133c1c3bdaff0915418e3fcf0c889af0a822e1bdcbcc88"} err="failed to get container status \"a697c5a28f2c415b6f133c1c3bdaff0915418e3fcf0c889af0a822e1bdcbcc88\": rpc error: code = NotFound desc = could not find container \"a697c5a28f2c415b6f133c1c3bdaff0915418e3fcf0c889af0a822e1bdcbcc88\": container with ID starting with a697c5a28f2c415b6f133c1c3bdaff0915418e3fcf0c889af0a822e1bdcbcc88 not found: ID does not exist" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.507827 4733 scope.go:117] "RemoveContainer" containerID="d985f342be7dff38ee8a2264a8dae534857e6cb0e7d0cf79b137d2ed6289bf80" Dec 06 05:52:52 crc kubenswrapper[4733]: E1206 05:52:52.508201 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d985f342be7dff38ee8a2264a8dae534857e6cb0e7d0cf79b137d2ed6289bf80\": container with ID starting with d985f342be7dff38ee8a2264a8dae534857e6cb0e7d0cf79b137d2ed6289bf80 not found: ID does not exist" containerID="d985f342be7dff38ee8a2264a8dae534857e6cb0e7d0cf79b137d2ed6289bf80" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.508248 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d985f342be7dff38ee8a2264a8dae534857e6cb0e7d0cf79b137d2ed6289bf80"} err="failed to get container status \"d985f342be7dff38ee8a2264a8dae534857e6cb0e7d0cf79b137d2ed6289bf80\": rpc error: code = NotFound desc = could not find container \"d985f342be7dff38ee8a2264a8dae534857e6cb0e7d0cf79b137d2ed6289bf80\": container with ID starting with d985f342be7dff38ee8a2264a8dae534857e6cb0e7d0cf79b137d2ed6289bf80 not found: ID does not exist" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.508267 4733 scope.go:117] "RemoveContainer" containerID="88a99335c4d7fca93428173f7e0e096e418e0599ab030dfda10d8da0a5dc17a5" Dec 06 05:52:52 crc kubenswrapper[4733]: E1206 05:52:52.508618 4733 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"88a99335c4d7fca93428173f7e0e096e418e0599ab030dfda10d8da0a5dc17a5\": container with ID starting with 88a99335c4d7fca93428173f7e0e096e418e0599ab030dfda10d8da0a5dc17a5 not found: ID does not exist" containerID="88a99335c4d7fca93428173f7e0e096e418e0599ab030dfda10d8da0a5dc17a5" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.508722 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88a99335c4d7fca93428173f7e0e096e418e0599ab030dfda10d8da0a5dc17a5"} err="failed to get container status \"88a99335c4d7fca93428173f7e0e096e418e0599ab030dfda10d8da0a5dc17a5\": rpc error: code = NotFound desc = could not find container \"88a99335c4d7fca93428173f7e0e096e418e0599ab030dfda10d8da0a5dc17a5\": container with ID starting with 88a99335c4d7fca93428173f7e0e096e418e0599ab030dfda10d8da0a5dc17a5 not found: ID does not exist" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.508740 4733 scope.go:117] "RemoveContainer" containerID="667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236" Dec 06 05:52:52 crc kubenswrapper[4733]: E1206 05:52:52.509064 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236\": container with ID starting with 667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236 not found: ID does not exist" containerID="667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.509088 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236"} err="failed to get container status \"667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236\": rpc error: code = NotFound desc = could not find container 
\"667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236\": container with ID starting with 667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236 not found: ID does not exist" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.509122 4733 scope.go:117] "RemoveContainer" containerID="d78589d529f70a3ceb873f223ab2f95170d1ec069647c1cbdd258ed48da5d87d" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.509400 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d78589d529f70a3ceb873f223ab2f95170d1ec069647c1cbdd258ed48da5d87d"} err="failed to get container status \"d78589d529f70a3ceb873f223ab2f95170d1ec069647c1cbdd258ed48da5d87d\": rpc error: code = NotFound desc = could not find container \"d78589d529f70a3ceb873f223ab2f95170d1ec069647c1cbdd258ed48da5d87d\": container with ID starting with d78589d529f70a3ceb873f223ab2f95170d1ec069647c1cbdd258ed48da5d87d not found: ID does not exist" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.509444 4733 scope.go:117] "RemoveContainer" containerID="5d6353ff5837029f85cdae65e1200483030eeb8cb05c63bd255a459d79a91ef0" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.509727 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d6353ff5837029f85cdae65e1200483030eeb8cb05c63bd255a459d79a91ef0"} err="failed to get container status \"5d6353ff5837029f85cdae65e1200483030eeb8cb05c63bd255a459d79a91ef0\": rpc error: code = NotFound desc = could not find container \"5d6353ff5837029f85cdae65e1200483030eeb8cb05c63bd255a459d79a91ef0\": container with ID starting with 5d6353ff5837029f85cdae65e1200483030eeb8cb05c63bd255a459d79a91ef0 not found: ID does not exist" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.509766 4733 scope.go:117] "RemoveContainer" containerID="9980ec9b2b1a751a691d1f657a2176d49a7583906d741adbe3754ec4c73b152c" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.510025 4733 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9980ec9b2b1a751a691d1f657a2176d49a7583906d741adbe3754ec4c73b152c"} err="failed to get container status \"9980ec9b2b1a751a691d1f657a2176d49a7583906d741adbe3754ec4c73b152c\": rpc error: code = NotFound desc = could not find container \"9980ec9b2b1a751a691d1f657a2176d49a7583906d741adbe3754ec4c73b152c\": container with ID starting with 9980ec9b2b1a751a691d1f657a2176d49a7583906d741adbe3754ec4c73b152c not found: ID does not exist" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.510045 4733 scope.go:117] "RemoveContainer" containerID="456b5bd863b30c044246c6c8fe15ee7344ad053861724b5c42b88479578b9adb" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.510286 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"456b5bd863b30c044246c6c8fe15ee7344ad053861724b5c42b88479578b9adb"} err="failed to get container status \"456b5bd863b30c044246c6c8fe15ee7344ad053861724b5c42b88479578b9adb\": rpc error: code = NotFound desc = could not find container \"456b5bd863b30c044246c6c8fe15ee7344ad053861724b5c42b88479578b9adb\": container with ID starting with 456b5bd863b30c044246c6c8fe15ee7344ad053861724b5c42b88479578b9adb not found: ID does not exist" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.510334 4733 scope.go:117] "RemoveContainer" containerID="77216800c2b9bc04724591a5d5c5d4c9ddb9a75fcbc198c60800199a92db6f45" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.510604 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77216800c2b9bc04724591a5d5c5d4c9ddb9a75fcbc198c60800199a92db6f45"} err="failed to get container status \"77216800c2b9bc04724591a5d5c5d4c9ddb9a75fcbc198c60800199a92db6f45\": rpc error: code = NotFound desc = could not find container \"77216800c2b9bc04724591a5d5c5d4c9ddb9a75fcbc198c60800199a92db6f45\": container with ID starting with 
77216800c2b9bc04724591a5d5c5d4c9ddb9a75fcbc198c60800199a92db6f45 not found: ID does not exist" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.510626 4733 scope.go:117] "RemoveContainer" containerID="532faf6ec4021a35746a236a1ded78eccc9d71728c149f73c4263068b6951490" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.510897 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"532faf6ec4021a35746a236a1ded78eccc9d71728c149f73c4263068b6951490"} err="failed to get container status \"532faf6ec4021a35746a236a1ded78eccc9d71728c149f73c4263068b6951490\": rpc error: code = NotFound desc = could not find container \"532faf6ec4021a35746a236a1ded78eccc9d71728c149f73c4263068b6951490\": container with ID starting with 532faf6ec4021a35746a236a1ded78eccc9d71728c149f73c4263068b6951490 not found: ID does not exist" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.510921 4733 scope.go:117] "RemoveContainer" containerID="a697c5a28f2c415b6f133c1c3bdaff0915418e3fcf0c889af0a822e1bdcbcc88" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.511205 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a697c5a28f2c415b6f133c1c3bdaff0915418e3fcf0c889af0a822e1bdcbcc88"} err="failed to get container status \"a697c5a28f2c415b6f133c1c3bdaff0915418e3fcf0c889af0a822e1bdcbcc88\": rpc error: code = NotFound desc = could not find container \"a697c5a28f2c415b6f133c1c3bdaff0915418e3fcf0c889af0a822e1bdcbcc88\": container with ID starting with a697c5a28f2c415b6f133c1c3bdaff0915418e3fcf0c889af0a822e1bdcbcc88 not found: ID does not exist" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.511227 4733 scope.go:117] "RemoveContainer" containerID="d985f342be7dff38ee8a2264a8dae534857e6cb0e7d0cf79b137d2ed6289bf80" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.511576 4733 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d985f342be7dff38ee8a2264a8dae534857e6cb0e7d0cf79b137d2ed6289bf80"} err="failed to get container status \"d985f342be7dff38ee8a2264a8dae534857e6cb0e7d0cf79b137d2ed6289bf80\": rpc error: code = NotFound desc = could not find container \"d985f342be7dff38ee8a2264a8dae534857e6cb0e7d0cf79b137d2ed6289bf80\": container with ID starting with d985f342be7dff38ee8a2264a8dae534857e6cb0e7d0cf79b137d2ed6289bf80 not found: ID does not exist" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.511622 4733 scope.go:117] "RemoveContainer" containerID="88a99335c4d7fca93428173f7e0e096e418e0599ab030dfda10d8da0a5dc17a5" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.511926 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88a99335c4d7fca93428173f7e0e096e418e0599ab030dfda10d8da0a5dc17a5"} err="failed to get container status \"88a99335c4d7fca93428173f7e0e096e418e0599ab030dfda10d8da0a5dc17a5\": rpc error: code = NotFound desc = could not find container \"88a99335c4d7fca93428173f7e0e096e418e0599ab030dfda10d8da0a5dc17a5\": container with ID starting with 88a99335c4d7fca93428173f7e0e096e418e0599ab030dfda10d8da0a5dc17a5 not found: ID does not exist" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.511952 4733 scope.go:117] "RemoveContainer" containerID="667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.512246 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236"} err="failed to get container status \"667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236\": rpc error: code = NotFound desc = could not find container \"667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236\": container with ID starting with 667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236 not found: ID does not 
exist" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.512268 4733 scope.go:117] "RemoveContainer" containerID="d78589d529f70a3ceb873f223ab2f95170d1ec069647c1cbdd258ed48da5d87d" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.512633 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d78589d529f70a3ceb873f223ab2f95170d1ec069647c1cbdd258ed48da5d87d"} err="failed to get container status \"d78589d529f70a3ceb873f223ab2f95170d1ec069647c1cbdd258ed48da5d87d\": rpc error: code = NotFound desc = could not find container \"d78589d529f70a3ceb873f223ab2f95170d1ec069647c1cbdd258ed48da5d87d\": container with ID starting with d78589d529f70a3ceb873f223ab2f95170d1ec069647c1cbdd258ed48da5d87d not found: ID does not exist" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.512663 4733 scope.go:117] "RemoveContainer" containerID="5d6353ff5837029f85cdae65e1200483030eeb8cb05c63bd255a459d79a91ef0" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.512913 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d6353ff5837029f85cdae65e1200483030eeb8cb05c63bd255a459d79a91ef0"} err="failed to get container status \"5d6353ff5837029f85cdae65e1200483030eeb8cb05c63bd255a459d79a91ef0\": rpc error: code = NotFound desc = could not find container \"5d6353ff5837029f85cdae65e1200483030eeb8cb05c63bd255a459d79a91ef0\": container with ID starting with 5d6353ff5837029f85cdae65e1200483030eeb8cb05c63bd255a459d79a91ef0 not found: ID does not exist" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.512934 4733 scope.go:117] "RemoveContainer" containerID="9980ec9b2b1a751a691d1f657a2176d49a7583906d741adbe3754ec4c73b152c" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.513186 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9980ec9b2b1a751a691d1f657a2176d49a7583906d741adbe3754ec4c73b152c"} err="failed to get container status 
\"9980ec9b2b1a751a691d1f657a2176d49a7583906d741adbe3754ec4c73b152c\": rpc error: code = NotFound desc = could not find container \"9980ec9b2b1a751a691d1f657a2176d49a7583906d741adbe3754ec4c73b152c\": container with ID starting with 9980ec9b2b1a751a691d1f657a2176d49a7583906d741adbe3754ec4c73b152c not found: ID does not exist" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.513209 4733 scope.go:117] "RemoveContainer" containerID="456b5bd863b30c044246c6c8fe15ee7344ad053861724b5c42b88479578b9adb" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.513489 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"456b5bd863b30c044246c6c8fe15ee7344ad053861724b5c42b88479578b9adb"} err="failed to get container status \"456b5bd863b30c044246c6c8fe15ee7344ad053861724b5c42b88479578b9adb\": rpc error: code = NotFound desc = could not find container \"456b5bd863b30c044246c6c8fe15ee7344ad053861724b5c42b88479578b9adb\": container with ID starting with 456b5bd863b30c044246c6c8fe15ee7344ad053861724b5c42b88479578b9adb not found: ID does not exist" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.513511 4733 scope.go:117] "RemoveContainer" containerID="77216800c2b9bc04724591a5d5c5d4c9ddb9a75fcbc198c60800199a92db6f45" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.513747 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77216800c2b9bc04724591a5d5c5d4c9ddb9a75fcbc198c60800199a92db6f45"} err="failed to get container status \"77216800c2b9bc04724591a5d5c5d4c9ddb9a75fcbc198c60800199a92db6f45\": rpc error: code = NotFound desc = could not find container \"77216800c2b9bc04724591a5d5c5d4c9ddb9a75fcbc198c60800199a92db6f45\": container with ID starting with 77216800c2b9bc04724591a5d5c5d4c9ddb9a75fcbc198c60800199a92db6f45 not found: ID does not exist" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.513769 4733 scope.go:117] "RemoveContainer" 
containerID="532faf6ec4021a35746a236a1ded78eccc9d71728c149f73c4263068b6951490" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.513991 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"532faf6ec4021a35746a236a1ded78eccc9d71728c149f73c4263068b6951490"} err="failed to get container status \"532faf6ec4021a35746a236a1ded78eccc9d71728c149f73c4263068b6951490\": rpc error: code = NotFound desc = could not find container \"532faf6ec4021a35746a236a1ded78eccc9d71728c149f73c4263068b6951490\": container with ID starting with 532faf6ec4021a35746a236a1ded78eccc9d71728c149f73c4263068b6951490 not found: ID does not exist" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.514008 4733 scope.go:117] "RemoveContainer" containerID="a697c5a28f2c415b6f133c1c3bdaff0915418e3fcf0c889af0a822e1bdcbcc88" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.514277 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a697c5a28f2c415b6f133c1c3bdaff0915418e3fcf0c889af0a822e1bdcbcc88"} err="failed to get container status \"a697c5a28f2c415b6f133c1c3bdaff0915418e3fcf0c889af0a822e1bdcbcc88\": rpc error: code = NotFound desc = could not find container \"a697c5a28f2c415b6f133c1c3bdaff0915418e3fcf0c889af0a822e1bdcbcc88\": container with ID starting with a697c5a28f2c415b6f133c1c3bdaff0915418e3fcf0c889af0a822e1bdcbcc88 not found: ID does not exist" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.514296 4733 scope.go:117] "RemoveContainer" containerID="d985f342be7dff38ee8a2264a8dae534857e6cb0e7d0cf79b137d2ed6289bf80" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.514669 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d985f342be7dff38ee8a2264a8dae534857e6cb0e7d0cf79b137d2ed6289bf80"} err="failed to get container status \"d985f342be7dff38ee8a2264a8dae534857e6cb0e7d0cf79b137d2ed6289bf80\": rpc error: code = NotFound desc = could 
not find container \"d985f342be7dff38ee8a2264a8dae534857e6cb0e7d0cf79b137d2ed6289bf80\": container with ID starting with d985f342be7dff38ee8a2264a8dae534857e6cb0e7d0cf79b137d2ed6289bf80 not found: ID does not exist" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.514692 4733 scope.go:117] "RemoveContainer" containerID="88a99335c4d7fca93428173f7e0e096e418e0599ab030dfda10d8da0a5dc17a5" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.514907 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88a99335c4d7fca93428173f7e0e096e418e0599ab030dfda10d8da0a5dc17a5"} err="failed to get container status \"88a99335c4d7fca93428173f7e0e096e418e0599ab030dfda10d8da0a5dc17a5\": rpc error: code = NotFound desc = could not find container \"88a99335c4d7fca93428173f7e0e096e418e0599ab030dfda10d8da0a5dc17a5\": container with ID starting with 88a99335c4d7fca93428173f7e0e096e418e0599ab030dfda10d8da0a5dc17a5 not found: ID does not exist" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.514927 4733 scope.go:117] "RemoveContainer" containerID="667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.515239 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236"} err="failed to get container status \"667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236\": rpc error: code = NotFound desc = could not find container \"667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236\": container with ID starting with 667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236 not found: ID does not exist" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.515261 4733 scope.go:117] "RemoveContainer" containerID="d78589d529f70a3ceb873f223ab2f95170d1ec069647c1cbdd258ed48da5d87d" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 
05:52:52.515658 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d78589d529f70a3ceb873f223ab2f95170d1ec069647c1cbdd258ed48da5d87d"} err="failed to get container status \"d78589d529f70a3ceb873f223ab2f95170d1ec069647c1cbdd258ed48da5d87d\": rpc error: code = NotFound desc = could not find container \"d78589d529f70a3ceb873f223ab2f95170d1ec069647c1cbdd258ed48da5d87d\": container with ID starting with d78589d529f70a3ceb873f223ab2f95170d1ec069647c1cbdd258ed48da5d87d not found: ID does not exist" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.515680 4733 scope.go:117] "RemoveContainer" containerID="5d6353ff5837029f85cdae65e1200483030eeb8cb05c63bd255a459d79a91ef0" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.515984 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d6353ff5837029f85cdae65e1200483030eeb8cb05c63bd255a459d79a91ef0"} err="failed to get container status \"5d6353ff5837029f85cdae65e1200483030eeb8cb05c63bd255a459d79a91ef0\": rpc error: code = NotFound desc = could not find container \"5d6353ff5837029f85cdae65e1200483030eeb8cb05c63bd255a459d79a91ef0\": container with ID starting with 5d6353ff5837029f85cdae65e1200483030eeb8cb05c63bd255a459d79a91ef0 not found: ID does not exist" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.516003 4733 scope.go:117] "RemoveContainer" containerID="9980ec9b2b1a751a691d1f657a2176d49a7583906d741adbe3754ec4c73b152c" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.516262 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9980ec9b2b1a751a691d1f657a2176d49a7583906d741adbe3754ec4c73b152c"} err="failed to get container status \"9980ec9b2b1a751a691d1f657a2176d49a7583906d741adbe3754ec4c73b152c\": rpc error: code = NotFound desc = could not find container \"9980ec9b2b1a751a691d1f657a2176d49a7583906d741adbe3754ec4c73b152c\": container with ID starting with 
9980ec9b2b1a751a691d1f657a2176d49a7583906d741adbe3754ec4c73b152c not found: ID does not exist" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.516281 4733 scope.go:117] "RemoveContainer" containerID="456b5bd863b30c044246c6c8fe15ee7344ad053861724b5c42b88479578b9adb" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.516596 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"456b5bd863b30c044246c6c8fe15ee7344ad053861724b5c42b88479578b9adb"} err="failed to get container status \"456b5bd863b30c044246c6c8fe15ee7344ad053861724b5c42b88479578b9adb\": rpc error: code = NotFound desc = could not find container \"456b5bd863b30c044246c6c8fe15ee7344ad053861724b5c42b88479578b9adb\": container with ID starting with 456b5bd863b30c044246c6c8fe15ee7344ad053861724b5c42b88479578b9adb not found: ID does not exist" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.516616 4733 scope.go:117] "RemoveContainer" containerID="77216800c2b9bc04724591a5d5c5d4c9ddb9a75fcbc198c60800199a92db6f45" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.516871 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77216800c2b9bc04724591a5d5c5d4c9ddb9a75fcbc198c60800199a92db6f45"} err="failed to get container status \"77216800c2b9bc04724591a5d5c5d4c9ddb9a75fcbc198c60800199a92db6f45\": rpc error: code = NotFound desc = could not find container \"77216800c2b9bc04724591a5d5c5d4c9ddb9a75fcbc198c60800199a92db6f45\": container with ID starting with 77216800c2b9bc04724591a5d5c5d4c9ddb9a75fcbc198c60800199a92db6f45 not found: ID does not exist" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.516892 4733 scope.go:117] "RemoveContainer" containerID="532faf6ec4021a35746a236a1ded78eccc9d71728c149f73c4263068b6951490" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.517122 4733 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"532faf6ec4021a35746a236a1ded78eccc9d71728c149f73c4263068b6951490"} err="failed to get container status \"532faf6ec4021a35746a236a1ded78eccc9d71728c149f73c4263068b6951490\": rpc error: code = NotFound desc = could not find container \"532faf6ec4021a35746a236a1ded78eccc9d71728c149f73c4263068b6951490\": container with ID starting with 532faf6ec4021a35746a236a1ded78eccc9d71728c149f73c4263068b6951490 not found: ID does not exist" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.517142 4733 scope.go:117] "RemoveContainer" containerID="a697c5a28f2c415b6f133c1c3bdaff0915418e3fcf0c889af0a822e1bdcbcc88" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.517513 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a697c5a28f2c415b6f133c1c3bdaff0915418e3fcf0c889af0a822e1bdcbcc88"} err="failed to get container status \"a697c5a28f2c415b6f133c1c3bdaff0915418e3fcf0c889af0a822e1bdcbcc88\": rpc error: code = NotFound desc = could not find container \"a697c5a28f2c415b6f133c1c3bdaff0915418e3fcf0c889af0a822e1bdcbcc88\": container with ID starting with a697c5a28f2c415b6f133c1c3bdaff0915418e3fcf0c889af0a822e1bdcbcc88 not found: ID does not exist" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.517532 4733 scope.go:117] "RemoveContainer" containerID="d985f342be7dff38ee8a2264a8dae534857e6cb0e7d0cf79b137d2ed6289bf80" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.517946 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d985f342be7dff38ee8a2264a8dae534857e6cb0e7d0cf79b137d2ed6289bf80"} err="failed to get container status \"d985f342be7dff38ee8a2264a8dae534857e6cb0e7d0cf79b137d2ed6289bf80\": rpc error: code = NotFound desc = could not find container \"d985f342be7dff38ee8a2264a8dae534857e6cb0e7d0cf79b137d2ed6289bf80\": container with ID starting with d985f342be7dff38ee8a2264a8dae534857e6cb0e7d0cf79b137d2ed6289bf80 not found: ID does not 
exist" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.517968 4733 scope.go:117] "RemoveContainer" containerID="88a99335c4d7fca93428173f7e0e096e418e0599ab030dfda10d8da0a5dc17a5" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.518260 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88a99335c4d7fca93428173f7e0e096e418e0599ab030dfda10d8da0a5dc17a5"} err="failed to get container status \"88a99335c4d7fca93428173f7e0e096e418e0599ab030dfda10d8da0a5dc17a5\": rpc error: code = NotFound desc = could not find container \"88a99335c4d7fca93428173f7e0e096e418e0599ab030dfda10d8da0a5dc17a5\": container with ID starting with 88a99335c4d7fca93428173f7e0e096e418e0599ab030dfda10d8da0a5dc17a5 not found: ID does not exist" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.518282 4733 scope.go:117] "RemoveContainer" containerID="667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.519043 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236"} err="failed to get container status \"667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236\": rpc error: code = NotFound desc = could not find container \"667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236\": container with ID starting with 667a77abe2226e443847d05fc2475438095a30648777a424239d3f35c199b236 not found: ID does not exist" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.519072 4733 scope.go:117] "RemoveContainer" containerID="d78589d529f70a3ceb873f223ab2f95170d1ec069647c1cbdd258ed48da5d87d" Dec 06 05:52:52 crc kubenswrapper[4733]: I1206 05:52:52.519565 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d78589d529f70a3ceb873f223ab2f95170d1ec069647c1cbdd258ed48da5d87d"} err="failed to get container status 
\"d78589d529f70a3ceb873f223ab2f95170d1ec069647c1cbdd258ed48da5d87d\": rpc error: code = NotFound desc = could not find container \"d78589d529f70a3ceb873f223ab2f95170d1ec069647c1cbdd258ed48da5d87d\": container with ID starting with d78589d529f70a3ceb873f223ab2f95170d1ec069647c1cbdd258ed48da5d87d not found: ID does not exist" Dec 06 05:52:53 crc kubenswrapper[4733]: I1206 05:52:53.345200 4733 generic.go:334] "Generic (PLEG): container finished" podID="65ccebbf-20be-4b59-92ce-18a50f1d497c" containerID="be72e5fc613333afe5908cf8e09c98c4806946e4e5934e8c280aebfaa5dfc184" exitCode=0 Dec 06 05:52:53 crc kubenswrapper[4733]: I1206 05:52:53.345296 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" event={"ID":"65ccebbf-20be-4b59-92ce-18a50f1d497c","Type":"ContainerDied","Data":"be72e5fc613333afe5908cf8e09c98c4806946e4e5934e8c280aebfaa5dfc184"} Dec 06 05:52:53 crc kubenswrapper[4733]: I1206 05:52:53.346511 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" event={"ID":"65ccebbf-20be-4b59-92ce-18a50f1d497c","Type":"ContainerStarted","Data":"2b14a16f14854d4040fcd45507bd2e2290a1ee8e1b86b89d6198918be8e9a1cd"} Dec 06 05:52:54 crc kubenswrapper[4733]: I1206 05:52:54.359441 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" event={"ID":"65ccebbf-20be-4b59-92ce-18a50f1d497c","Type":"ContainerStarted","Data":"5ad80212b084b88779da02cbf388921993707eee5b20761e68b032050992b0f7"} Dec 06 05:52:54 crc kubenswrapper[4733]: I1206 05:52:54.359776 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" event={"ID":"65ccebbf-20be-4b59-92ce-18a50f1d497c","Type":"ContainerStarted","Data":"1aa6bad5bb5dd9344e714a3f42e57e4c79f63bf64c03bca0f77a42b1f05cc627"} Dec 06 05:52:54 crc kubenswrapper[4733]: I1206 05:52:54.359787 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" event={"ID":"65ccebbf-20be-4b59-92ce-18a50f1d497c","Type":"ContainerStarted","Data":"0655d774bd8efbe0e45d38cb26e11fdf154044dba7bf2dd13ed749d1b1f3d3f0"} Dec 06 05:52:54 crc kubenswrapper[4733]: I1206 05:52:54.359800 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" event={"ID":"65ccebbf-20be-4b59-92ce-18a50f1d497c","Type":"ContainerStarted","Data":"66b63c0b7dbc249b31d677892cdc23d146d9cae5665ae3822444c6f582a612d3"} Dec 06 05:52:54 crc kubenswrapper[4733]: I1206 05:52:54.359810 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" event={"ID":"65ccebbf-20be-4b59-92ce-18a50f1d497c","Type":"ContainerStarted","Data":"b73e55a150887400bb94de68af15270eb32fd4c956c04918c9d1478abcc27a3c"} Dec 06 05:52:54 crc kubenswrapper[4733]: I1206 05:52:54.359819 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" event={"ID":"65ccebbf-20be-4b59-92ce-18a50f1d497c","Type":"ContainerStarted","Data":"4a80e98effd8fb9b5ce9b7aa05efa3cdff9457a5344b8db7e779d5d20301e250"} Dec 06 05:52:56 crc kubenswrapper[4733]: I1206 05:52:56.375638 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" event={"ID":"65ccebbf-20be-4b59-92ce-18a50f1d497c","Type":"ContainerStarted","Data":"18a7e91a12ff2570fa9edf8642fb2831ca882c9b63b259610b6996dc275f0201"} Dec 06 05:52:58 crc kubenswrapper[4733]: I1206 05:52:58.391335 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" event={"ID":"65ccebbf-20be-4b59-92ce-18a50f1d497c","Type":"ContainerStarted","Data":"f83463007226f7cdb0d98d4e6e1f4335a5308b45f1e44e58a58b0b3fbb6a8e4a"} Dec 06 05:52:58 crc kubenswrapper[4733]: I1206 05:52:58.392023 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" Dec 06 05:52:58 
crc kubenswrapper[4733]: I1206 05:52:58.392055 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" Dec 06 05:52:58 crc kubenswrapper[4733]: I1206 05:52:58.431379 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" Dec 06 05:52:58 crc kubenswrapper[4733]: I1206 05:52:58.432984 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" podStartSLOduration=6.432966224 podStartE2EDuration="6.432966224s" podCreationTimestamp="2025-12-06 05:52:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:52:58.424556898 +0000 UTC m=+562.289768009" watchObservedRunningTime="2025-12-06 05:52:58.432966224 +0000 UTC m=+562.298177336" Dec 06 05:52:59 crc kubenswrapper[4733]: I1206 05:52:59.396988 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" Dec 06 05:52:59 crc kubenswrapper[4733]: I1206 05:52:59.422973 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" Dec 06 05:53:03 crc kubenswrapper[4733]: I1206 05:53:03.485369 4733 scope.go:117] "RemoveContainer" containerID="3e3a4017a1965fad5e1ee690625749a1c72a2b0c524e4286a0f34a7ec6c233f6" Dec 06 05:53:03 crc kubenswrapper[4733]: E1206 05:53:03.485996 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-684r5_openshift-multus(cc59542d-ee4a-414d-b096-86716cb56db5)\"" pod="openshift-multus/multus-684r5" podUID="cc59542d-ee4a-414d-b096-86716cb56db5" Dec 06 05:53:12 crc kubenswrapper[4733]: I1206 05:53:12.989373 4733 patch_prober.go:28] interesting 
pod/machine-config-daemon-g7qjx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 05:53:12 crc kubenswrapper[4733]: I1206 05:53:12.989942 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 05:53:17 crc kubenswrapper[4733]: I1206 05:53:17.452172 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwp2wx"] Dec 06 05:53:17 crc kubenswrapper[4733]: I1206 05:53:17.453447 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwp2wx" Dec 06 05:53:17 crc kubenswrapper[4733]: I1206 05:53:17.455546 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 06 05:53:17 crc kubenswrapper[4733]: I1206 05:53:17.464004 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwp2wx"] Dec 06 05:53:17 crc kubenswrapper[4733]: I1206 05:53:17.485152 4733 scope.go:117] "RemoveContainer" containerID="3e3a4017a1965fad5e1ee690625749a1c72a2b0c524e4286a0f34a7ec6c233f6" Dec 06 05:53:17 crc kubenswrapper[4733]: I1206 05:53:17.545431 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1537e842-7b52-4886-81ea-989848fc3407-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwp2wx\" (UID: 
\"1537e842-7b52-4886-81ea-989848fc3407\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwp2wx" Dec 06 05:53:17 crc kubenswrapper[4733]: I1206 05:53:17.545723 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1537e842-7b52-4886-81ea-989848fc3407-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwp2wx\" (UID: \"1537e842-7b52-4886-81ea-989848fc3407\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwp2wx" Dec 06 05:53:17 crc kubenswrapper[4733]: I1206 05:53:17.545913 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g68wg\" (UniqueName: \"kubernetes.io/projected/1537e842-7b52-4886-81ea-989848fc3407-kube-api-access-g68wg\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwp2wx\" (UID: \"1537e842-7b52-4886-81ea-989848fc3407\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwp2wx" Dec 06 05:53:17 crc kubenswrapper[4733]: I1206 05:53:17.647639 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g68wg\" (UniqueName: \"kubernetes.io/projected/1537e842-7b52-4886-81ea-989848fc3407-kube-api-access-g68wg\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwp2wx\" (UID: \"1537e842-7b52-4886-81ea-989848fc3407\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwp2wx" Dec 06 05:53:17 crc kubenswrapper[4733]: I1206 05:53:17.647788 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1537e842-7b52-4886-81ea-989848fc3407-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwp2wx\" (UID: \"1537e842-7b52-4886-81ea-989848fc3407\") " 
pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwp2wx" Dec 06 05:53:17 crc kubenswrapper[4733]: I1206 05:53:17.647881 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1537e842-7b52-4886-81ea-989848fc3407-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwp2wx\" (UID: \"1537e842-7b52-4886-81ea-989848fc3407\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwp2wx" Dec 06 05:53:17 crc kubenswrapper[4733]: I1206 05:53:17.648363 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1537e842-7b52-4886-81ea-989848fc3407-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwp2wx\" (UID: \"1537e842-7b52-4886-81ea-989848fc3407\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwp2wx" Dec 06 05:53:17 crc kubenswrapper[4733]: I1206 05:53:17.648481 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1537e842-7b52-4886-81ea-989848fc3407-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwp2wx\" (UID: \"1537e842-7b52-4886-81ea-989848fc3407\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwp2wx" Dec 06 05:53:17 crc kubenswrapper[4733]: I1206 05:53:17.678237 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g68wg\" (UniqueName: \"kubernetes.io/projected/1537e842-7b52-4886-81ea-989848fc3407-kube-api-access-g68wg\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwp2wx\" (UID: \"1537e842-7b52-4886-81ea-989848fc3407\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwp2wx" Dec 06 05:53:17 crc kubenswrapper[4733]: I1206 05:53:17.765596 4733 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwp2wx" Dec 06 05:53:17 crc kubenswrapper[4733]: E1206 05:53:17.789826 4733 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwp2wx_openshift-marketplace_1537e842-7b52-4886-81ea-989848fc3407_0(3dbaa200abb39c5f764f157a17968de783949791e141e7b8533562ece85bb50b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 06 05:53:17 crc kubenswrapper[4733]: E1206 05:53:17.789917 4733 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwp2wx_openshift-marketplace_1537e842-7b52-4886-81ea-989848fc3407_0(3dbaa200abb39c5f764f157a17968de783949791e141e7b8533562ece85bb50b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwp2wx" Dec 06 05:53:17 crc kubenswrapper[4733]: E1206 05:53:17.789943 4733 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwp2wx_openshift-marketplace_1537e842-7b52-4886-81ea-989848fc3407_0(3dbaa200abb39c5f764f157a17968de783949791e141e7b8533562ece85bb50b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwp2wx" Dec 06 05:53:17 crc kubenswrapper[4733]: E1206 05:53:17.790007 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwp2wx_openshift-marketplace(1537e842-7b52-4886-81ea-989848fc3407)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwp2wx_openshift-marketplace(1537e842-7b52-4886-81ea-989848fc3407)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwp2wx_openshift-marketplace_1537e842-7b52-4886-81ea-989848fc3407_0(3dbaa200abb39c5f764f157a17968de783949791e141e7b8533562ece85bb50b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwp2wx" podUID="1537e842-7b52-4886-81ea-989848fc3407" Dec 06 05:53:18 crc kubenswrapper[4733]: I1206 05:53:18.494440 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-684r5_cc59542d-ee4a-414d-b096-86716cb56db5/kube-multus/2.log" Dec 06 05:53:18 crc kubenswrapper[4733]: I1206 05:53:18.494859 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-684r5_cc59542d-ee4a-414d-b096-86716cb56db5/kube-multus/1.log" Dec 06 05:53:18 crc kubenswrapper[4733]: I1206 05:53:18.494927 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-684r5" event={"ID":"cc59542d-ee4a-414d-b096-86716cb56db5","Type":"ContainerStarted","Data":"c7fb8b8439c37a392335a652ce6bc136b90c4ddfe0ca6a6ce0ba1532963176a0"} Dec 06 05:53:18 crc kubenswrapper[4733]: I1206 05:53:18.494967 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwp2wx" Dec 06 05:53:18 crc kubenswrapper[4733]: I1206 05:53:18.495548 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwp2wx" Dec 06 05:53:18 crc kubenswrapper[4733]: E1206 05:53:18.518960 4733 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwp2wx_openshift-marketplace_1537e842-7b52-4886-81ea-989848fc3407_0(c14d8667d954e0fdd0353e8aa8081d736a2a2b1b795b6da882691591667eca2d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 06 05:53:18 crc kubenswrapper[4733]: E1206 05:53:18.519013 4733 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwp2wx_openshift-marketplace_1537e842-7b52-4886-81ea-989848fc3407_0(c14d8667d954e0fdd0353e8aa8081d736a2a2b1b795b6da882691591667eca2d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwp2wx" Dec 06 05:53:18 crc kubenswrapper[4733]: E1206 05:53:18.519034 4733 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwp2wx_openshift-marketplace_1537e842-7b52-4886-81ea-989848fc3407_0(c14d8667d954e0fdd0353e8aa8081d736a2a2b1b795b6da882691591667eca2d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwp2wx" Dec 06 05:53:18 crc kubenswrapper[4733]: E1206 05:53:18.519087 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwp2wx_openshift-marketplace(1537e842-7b52-4886-81ea-989848fc3407)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwp2wx_openshift-marketplace(1537e842-7b52-4886-81ea-989848fc3407)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwp2wx_openshift-marketplace_1537e842-7b52-4886-81ea-989848fc3407_0(c14d8667d954e0fdd0353e8aa8081d736a2a2b1b795b6da882691591667eca2d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwp2wx" podUID="1537e842-7b52-4886-81ea-989848fc3407" Dec 06 05:53:22 crc kubenswrapper[4733]: I1206 05:53:22.417523 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bkmks" Dec 06 05:53:29 crc kubenswrapper[4733]: I1206 05:53:29.484528 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwp2wx" Dec 06 05:53:29 crc kubenswrapper[4733]: I1206 05:53:29.485331 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwp2wx" Dec 06 05:53:29 crc kubenswrapper[4733]: I1206 05:53:29.857431 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwp2wx"] Dec 06 05:53:30 crc kubenswrapper[4733]: I1206 05:53:30.561877 4733 generic.go:334] "Generic (PLEG): container finished" podID="1537e842-7b52-4886-81ea-989848fc3407" containerID="95eba5a3d7660eeaf269f7444ec61033b2747e1510f5654928529584f87a1835" exitCode=0 Dec 06 05:53:30 crc kubenswrapper[4733]: I1206 05:53:30.561997 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwp2wx" event={"ID":"1537e842-7b52-4886-81ea-989848fc3407","Type":"ContainerDied","Data":"95eba5a3d7660eeaf269f7444ec61033b2747e1510f5654928529584f87a1835"} Dec 06 05:53:30 crc kubenswrapper[4733]: I1206 05:53:30.562362 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwp2wx" event={"ID":"1537e842-7b52-4886-81ea-989848fc3407","Type":"ContainerStarted","Data":"1ba207e856aaf572e5bd51ba6771ad132f308c1b2e37f63fd4e1298e6c6efd83"} Dec 06 05:53:32 crc kubenswrapper[4733]: I1206 05:53:32.575972 4733 generic.go:334] "Generic (PLEG): container finished" podID="1537e842-7b52-4886-81ea-989848fc3407" containerID="c8d022a66dd599b2f9cbae81af9e0602054686a3700057e9ec49a414dce6bd53" exitCode=0 Dec 06 05:53:32 crc kubenswrapper[4733]: I1206 05:53:32.576056 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwp2wx" event={"ID":"1537e842-7b52-4886-81ea-989848fc3407","Type":"ContainerDied","Data":"c8d022a66dd599b2f9cbae81af9e0602054686a3700057e9ec49a414dce6bd53"} Dec 06 05:53:33 crc kubenswrapper[4733]: I1206 05:53:33.584990 4733 
generic.go:334] "Generic (PLEG): container finished" podID="1537e842-7b52-4886-81ea-989848fc3407" containerID="f0dbfbb664502d6d1bed45509f3428fd7669839cb817935955bb7683700ff0de" exitCode=0 Dec 06 05:53:33 crc kubenswrapper[4733]: I1206 05:53:33.585099 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwp2wx" event={"ID":"1537e842-7b52-4886-81ea-989848fc3407","Type":"ContainerDied","Data":"f0dbfbb664502d6d1bed45509f3428fd7669839cb817935955bb7683700ff0de"} Dec 06 05:53:34 crc kubenswrapper[4733]: I1206 05:53:34.782772 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwp2wx" Dec 06 05:53:34 crc kubenswrapper[4733]: I1206 05:53:34.931960 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1537e842-7b52-4886-81ea-989848fc3407-util\") pod \"1537e842-7b52-4886-81ea-989848fc3407\" (UID: \"1537e842-7b52-4886-81ea-989848fc3407\") " Dec 06 05:53:34 crc kubenswrapper[4733]: I1206 05:53:34.932010 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g68wg\" (UniqueName: \"kubernetes.io/projected/1537e842-7b52-4886-81ea-989848fc3407-kube-api-access-g68wg\") pod \"1537e842-7b52-4886-81ea-989848fc3407\" (UID: \"1537e842-7b52-4886-81ea-989848fc3407\") " Dec 06 05:53:34 crc kubenswrapper[4733]: I1206 05:53:34.932047 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1537e842-7b52-4886-81ea-989848fc3407-bundle\") pod \"1537e842-7b52-4886-81ea-989848fc3407\" (UID: \"1537e842-7b52-4886-81ea-989848fc3407\") " Dec 06 05:53:34 crc kubenswrapper[4733]: I1206 05:53:34.933237 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/1537e842-7b52-4886-81ea-989848fc3407-bundle" (OuterVolumeSpecName: "bundle") pod "1537e842-7b52-4886-81ea-989848fc3407" (UID: "1537e842-7b52-4886-81ea-989848fc3407"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 05:53:34 crc kubenswrapper[4733]: I1206 05:53:34.937978 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1537e842-7b52-4886-81ea-989848fc3407-kube-api-access-g68wg" (OuterVolumeSpecName: "kube-api-access-g68wg") pod "1537e842-7b52-4886-81ea-989848fc3407" (UID: "1537e842-7b52-4886-81ea-989848fc3407"). InnerVolumeSpecName "kube-api-access-g68wg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:53:34 crc kubenswrapper[4733]: I1206 05:53:34.943266 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1537e842-7b52-4886-81ea-989848fc3407-util" (OuterVolumeSpecName: "util") pod "1537e842-7b52-4886-81ea-989848fc3407" (UID: "1537e842-7b52-4886-81ea-989848fc3407"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 05:53:35 crc kubenswrapper[4733]: I1206 05:53:35.033133 4733 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1537e842-7b52-4886-81ea-989848fc3407-util\") on node \"crc\" DevicePath \"\"" Dec 06 05:53:35 crc kubenswrapper[4733]: I1206 05:53:35.033166 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g68wg\" (UniqueName: \"kubernetes.io/projected/1537e842-7b52-4886-81ea-989848fc3407-kube-api-access-g68wg\") on node \"crc\" DevicePath \"\"" Dec 06 05:53:35 crc kubenswrapper[4733]: I1206 05:53:35.033233 4733 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1537e842-7b52-4886-81ea-989848fc3407-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 05:53:35 crc kubenswrapper[4733]: I1206 05:53:35.597850 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwp2wx" event={"ID":"1537e842-7b52-4886-81ea-989848fc3407","Type":"ContainerDied","Data":"1ba207e856aaf572e5bd51ba6771ad132f308c1b2e37f63fd4e1298e6c6efd83"} Dec 06 05:53:35 crc kubenswrapper[4733]: I1206 05:53:35.597897 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ba207e856aaf572e5bd51ba6771ad132f308c1b2e37f63fd4e1298e6c6efd83" Dec 06 05:53:35 crc kubenswrapper[4733]: I1206 05:53:35.597968 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwp2wx" Dec 06 05:53:36 crc kubenswrapper[4733]: I1206 05:53:36.766495 4733 scope.go:117] "RemoveContainer" containerID="238d1b3c645ca54e851f02ddb12c90bfcd039e6973993a7693cc9520d5268496" Dec 06 05:53:37 crc kubenswrapper[4733]: I1206 05:53:37.610354 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-684r5_cc59542d-ee4a-414d-b096-86716cb56db5/kube-multus/2.log" Dec 06 05:53:39 crc kubenswrapper[4733]: I1206 05:53:39.004272 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-98ngj"] Dec 06 05:53:39 crc kubenswrapper[4733]: E1206 05:53:39.004503 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1537e842-7b52-4886-81ea-989848fc3407" containerName="util" Dec 06 05:53:39 crc kubenswrapper[4733]: I1206 05:53:39.004516 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="1537e842-7b52-4886-81ea-989848fc3407" containerName="util" Dec 06 05:53:39 crc kubenswrapper[4733]: E1206 05:53:39.004533 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1537e842-7b52-4886-81ea-989848fc3407" containerName="extract" Dec 06 05:53:39 crc kubenswrapper[4733]: I1206 05:53:39.004538 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="1537e842-7b52-4886-81ea-989848fc3407" containerName="extract" Dec 06 05:53:39 crc kubenswrapper[4733]: E1206 05:53:39.004549 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1537e842-7b52-4886-81ea-989848fc3407" containerName="pull" Dec 06 05:53:39 crc kubenswrapper[4733]: I1206 05:53:39.004554 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="1537e842-7b52-4886-81ea-989848fc3407" containerName="pull" Dec 06 05:53:39 crc kubenswrapper[4733]: I1206 05:53:39.004642 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="1537e842-7b52-4886-81ea-989848fc3407" 
containerName="extract" Dec 06 05:53:39 crc kubenswrapper[4733]: I1206 05:53:39.005026 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-98ngj" Dec 06 05:53:39 crc kubenswrapper[4733]: I1206 05:53:39.006932 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Dec 06 05:53:39 crc kubenswrapper[4733]: I1206 05:53:39.007514 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-7nhsv" Dec 06 05:53:39 crc kubenswrapper[4733]: I1206 05:53:39.008561 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Dec 06 05:53:39 crc kubenswrapper[4733]: I1206 05:53:39.021535 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-98ngj"] Dec 06 05:53:39 crc kubenswrapper[4733]: I1206 05:53:39.178739 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tf5tn\" (UniqueName: \"kubernetes.io/projected/e54c5e3d-c07f-408d-85a8-83eee0ccfc79-kube-api-access-tf5tn\") pod \"nmstate-operator-5b5b58f5c8-98ngj\" (UID: \"e54c5e3d-c07f-408d-85a8-83eee0ccfc79\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-98ngj" Dec 06 05:53:39 crc kubenswrapper[4733]: I1206 05:53:39.280009 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tf5tn\" (UniqueName: \"kubernetes.io/projected/e54c5e3d-c07f-408d-85a8-83eee0ccfc79-kube-api-access-tf5tn\") pod \"nmstate-operator-5b5b58f5c8-98ngj\" (UID: \"e54c5e3d-c07f-408d-85a8-83eee0ccfc79\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-98ngj" Dec 06 05:53:39 crc kubenswrapper[4733]: I1206 05:53:39.297544 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tf5tn\" (UniqueName: 
\"kubernetes.io/projected/e54c5e3d-c07f-408d-85a8-83eee0ccfc79-kube-api-access-tf5tn\") pod \"nmstate-operator-5b5b58f5c8-98ngj\" (UID: \"e54c5e3d-c07f-408d-85a8-83eee0ccfc79\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-98ngj" Dec 06 05:53:39 crc kubenswrapper[4733]: I1206 05:53:39.316660 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-98ngj" Dec 06 05:53:39 crc kubenswrapper[4733]: I1206 05:53:39.693484 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-98ngj"] Dec 06 05:53:39 crc kubenswrapper[4733]: W1206 05:53:39.701801 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode54c5e3d_c07f_408d_85a8_83eee0ccfc79.slice/crio-9ff64b915269ce8a5d53eae9b2c2eb6231f6797716716c762cc365961ccdad1f WatchSource:0}: Error finding container 9ff64b915269ce8a5d53eae9b2c2eb6231f6797716716c762cc365961ccdad1f: Status 404 returned error can't find the container with id 9ff64b915269ce8a5d53eae9b2c2eb6231f6797716716c762cc365961ccdad1f Dec 06 05:53:40 crc kubenswrapper[4733]: I1206 05:53:40.631320 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-98ngj" event={"ID":"e54c5e3d-c07f-408d-85a8-83eee0ccfc79","Type":"ContainerStarted","Data":"9ff64b915269ce8a5d53eae9b2c2eb6231f6797716716c762cc365961ccdad1f"} Dec 06 05:53:42 crc kubenswrapper[4733]: I1206 05:53:42.645893 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-98ngj" event={"ID":"e54c5e3d-c07f-408d-85a8-83eee0ccfc79","Type":"ContainerStarted","Data":"5c224744b47fefdfee026a1f29a9a447d903ae76cc97af3df124bffd812449be"} Dec 06 05:53:42 crc kubenswrapper[4733]: I1206 05:53:42.662400 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-98ngj" podStartSLOduration=2.558672363 podStartE2EDuration="4.662386479s" podCreationTimestamp="2025-12-06 05:53:38 +0000 UTC" firstStartedPulling="2025-12-06 05:53:39.703938161 +0000 UTC m=+603.569149272" lastFinishedPulling="2025-12-06 05:53:41.807652287 +0000 UTC m=+605.672863388" observedRunningTime="2025-12-06 05:53:42.65910477 +0000 UTC m=+606.524315881" watchObservedRunningTime="2025-12-06 05:53:42.662386479 +0000 UTC m=+606.527597590" Dec 06 05:53:42 crc kubenswrapper[4733]: I1206 05:53:42.989065 4733 patch_prober.go:28] interesting pod/machine-config-daemon-g7qjx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 05:53:42 crc kubenswrapper[4733]: I1206 05:53:42.989137 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 05:53:47 crc kubenswrapper[4733]: I1206 05:53:47.874055 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-qbfxz"] Dec 06 05:53:47 crc kubenswrapper[4733]: I1206 05:53:47.875148 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-qbfxz" Dec 06 05:53:47 crc kubenswrapper[4733]: I1206 05:53:47.876796 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-wvqxq" Dec 06 05:53:47 crc kubenswrapper[4733]: I1206 05:53:47.887081 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-qbfxz"] Dec 06 05:53:47 crc kubenswrapper[4733]: I1206 05:53:47.892988 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-gw2n8"] Dec 06 05:53:47 crc kubenswrapper[4733]: I1206 05:53:47.894924 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-gw2n8" Dec 06 05:53:47 crc kubenswrapper[4733]: I1206 05:53:47.896460 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-lwn4x"] Dec 06 05:53:47 crc kubenswrapper[4733]: I1206 05:53:47.897089 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Dec 06 05:53:47 crc kubenswrapper[4733]: I1206 05:53:47.897387 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-lwn4x" Dec 06 05:53:47 crc kubenswrapper[4733]: I1206 05:53:47.918701 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-gw2n8"] Dec 06 05:53:47 crc kubenswrapper[4733]: I1206 05:53:47.971145 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-8rcsc"] Dec 06 05:53:47 crc kubenswrapper[4733]: I1206 05:53:47.971912 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-8rcsc" Dec 06 05:53:47 crc kubenswrapper[4733]: I1206 05:53:47.974677 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Dec 06 05:53:47 crc kubenswrapper[4733]: I1206 05:53:47.974801 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-h9lm5" Dec 06 05:53:47 crc kubenswrapper[4733]: I1206 05:53:47.975297 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Dec 06 05:53:47 crc kubenswrapper[4733]: I1206 05:53:47.988469 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-8rcsc"] Dec 06 05:53:47 crc kubenswrapper[4733]: I1206 05:53:47.995017 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkq7n\" (UniqueName: \"kubernetes.io/projected/4b3f3780-1188-4d8c-b369-f01efb0060ae-kube-api-access-wkq7n\") pod \"nmstate-metrics-7f946cbc9-qbfxz\" (UID: \"4b3f3780-1188-4d8c-b369-f01efb0060ae\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-qbfxz" Dec 06 05:53:48 crc kubenswrapper[4733]: I1206 05:53:48.095968 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmsxd\" (UniqueName: \"kubernetes.io/projected/a88cd06a-4aaf-4fcd-984e-9839be379e86-kube-api-access-kmsxd\") pod \"nmstate-handler-lwn4x\" (UID: \"a88cd06a-4aaf-4fcd-984e-9839be379e86\") " pod="openshift-nmstate/nmstate-handler-lwn4x" Dec 06 05:53:48 crc kubenswrapper[4733]: I1206 05:53:48.096028 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/9574c53f-a694-4fe7-a2f3-c2292bf727c1-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-8rcsc\" (UID: 
\"9574c53f-a694-4fe7-a2f3-c2292bf727c1\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-8rcsc" Dec 06 05:53:48 crc kubenswrapper[4733]: I1206 05:53:48.096075 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkq7n\" (UniqueName: \"kubernetes.io/projected/4b3f3780-1188-4d8c-b369-f01efb0060ae-kube-api-access-wkq7n\") pod \"nmstate-metrics-7f946cbc9-qbfxz\" (UID: \"4b3f3780-1188-4d8c-b369-f01efb0060ae\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-qbfxz" Dec 06 05:53:48 crc kubenswrapper[4733]: I1206 05:53:48.096362 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/a88cd06a-4aaf-4fcd-984e-9839be379e86-nmstate-lock\") pod \"nmstate-handler-lwn4x\" (UID: \"a88cd06a-4aaf-4fcd-984e-9839be379e86\") " pod="openshift-nmstate/nmstate-handler-lwn4x" Dec 06 05:53:48 crc kubenswrapper[4733]: I1206 05:53:48.096392 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/9574c53f-a694-4fe7-a2f3-c2292bf727c1-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-8rcsc\" (UID: \"9574c53f-a694-4fe7-a2f3-c2292bf727c1\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-8rcsc" Dec 06 05:53:48 crc kubenswrapper[4733]: I1206 05:53:48.096420 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gsr2\" (UniqueName: \"kubernetes.io/projected/9574c53f-a694-4fe7-a2f3-c2292bf727c1-kube-api-access-2gsr2\") pod \"nmstate-console-plugin-7fbb5f6569-8rcsc\" (UID: \"9574c53f-a694-4fe7-a2f3-c2292bf727c1\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-8rcsc" Dec 06 05:53:48 crc kubenswrapper[4733]: I1206 05:53:48.096456 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: 
\"kubernetes.io/secret/20f6b3c1-8c56-4603-8efc-c5aa7e3420cb-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-gw2n8\" (UID: \"20f6b3c1-8c56-4603-8efc-c5aa7e3420cb\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-gw2n8" Dec 06 05:53:48 crc kubenswrapper[4733]: I1206 05:53:48.096481 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9g6cm\" (UniqueName: \"kubernetes.io/projected/20f6b3c1-8c56-4603-8efc-c5aa7e3420cb-kube-api-access-9g6cm\") pod \"nmstate-webhook-5f6d4c5ccb-gw2n8\" (UID: \"20f6b3c1-8c56-4603-8efc-c5aa7e3420cb\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-gw2n8" Dec 06 05:53:48 crc kubenswrapper[4733]: I1206 05:53:48.096511 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/a88cd06a-4aaf-4fcd-984e-9839be379e86-dbus-socket\") pod \"nmstate-handler-lwn4x\" (UID: \"a88cd06a-4aaf-4fcd-984e-9839be379e86\") " pod="openshift-nmstate/nmstate-handler-lwn4x" Dec 06 05:53:48 crc kubenswrapper[4733]: I1206 05:53:48.096535 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/a88cd06a-4aaf-4fcd-984e-9839be379e86-ovs-socket\") pod \"nmstate-handler-lwn4x\" (UID: \"a88cd06a-4aaf-4fcd-984e-9839be379e86\") " pod="openshift-nmstate/nmstate-handler-lwn4x" Dec 06 05:53:48 crc kubenswrapper[4733]: I1206 05:53:48.117136 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkq7n\" (UniqueName: \"kubernetes.io/projected/4b3f3780-1188-4d8c-b369-f01efb0060ae-kube-api-access-wkq7n\") pod \"nmstate-metrics-7f946cbc9-qbfxz\" (UID: \"4b3f3780-1188-4d8c-b369-f01efb0060ae\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-qbfxz" Dec 06 05:53:48 crc kubenswrapper[4733]: I1206 05:53:48.170574 4733 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-console/console-64f7cd9bf9-5f2g2"] Dec 06 05:53:48 crc kubenswrapper[4733]: I1206 05:53:48.171297 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-64f7cd9bf9-5f2g2" Dec 06 05:53:48 crc kubenswrapper[4733]: I1206 05:53:48.182898 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-64f7cd9bf9-5f2g2"] Dec 06 05:53:48 crc kubenswrapper[4733]: I1206 05:53:48.187616 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-qbfxz" Dec 06 05:53:48 crc kubenswrapper[4733]: I1206 05:53:48.200559 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/09c3616e-19c4-452c-bf98-beceb8b8ed42-console-oauth-config\") pod \"console-64f7cd9bf9-5f2g2\" (UID: \"09c3616e-19c4-452c-bf98-beceb8b8ed42\") " pod="openshift-console/console-64f7cd9bf9-5f2g2" Dec 06 05:53:48 crc kubenswrapper[4733]: I1206 05:53:48.200613 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/09c3616e-19c4-452c-bf98-beceb8b8ed42-console-config\") pod \"console-64f7cd9bf9-5f2g2\" (UID: \"09c3616e-19c4-452c-bf98-beceb8b8ed42\") " pod="openshift-console/console-64f7cd9bf9-5f2g2" Dec 06 05:53:48 crc kubenswrapper[4733]: I1206 05:53:48.200645 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/9574c53f-a694-4fe7-a2f3-c2292bf727c1-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-8rcsc\" (UID: \"9574c53f-a694-4fe7-a2f3-c2292bf727c1\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-8rcsc" Dec 06 05:53:48 crc kubenswrapper[4733]: I1206 05:53:48.200667 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09c3616e-19c4-452c-bf98-beceb8b8ed42-trusted-ca-bundle\") pod \"console-64f7cd9bf9-5f2g2\" (UID: \"09c3616e-19c4-452c-bf98-beceb8b8ed42\") " pod="openshift-console/console-64f7cd9bf9-5f2g2" Dec 06 05:53:48 crc kubenswrapper[4733]: I1206 05:53:48.200827 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/a88cd06a-4aaf-4fcd-984e-9839be379e86-nmstate-lock\") pod \"nmstate-handler-lwn4x\" (UID: \"a88cd06a-4aaf-4fcd-984e-9839be379e86\") " pod="openshift-nmstate/nmstate-handler-lwn4x" Dec 06 05:53:48 crc kubenswrapper[4733]: I1206 05:53:48.200901 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/9574c53f-a694-4fe7-a2f3-c2292bf727c1-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-8rcsc\" (UID: \"9574c53f-a694-4fe7-a2f3-c2292bf727c1\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-8rcsc" Dec 06 05:53:48 crc kubenswrapper[4733]: I1206 05:53:48.200962 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gsr2\" (UniqueName: \"kubernetes.io/projected/9574c53f-a694-4fe7-a2f3-c2292bf727c1-kube-api-access-2gsr2\") pod \"nmstate-console-plugin-7fbb5f6569-8rcsc\" (UID: \"9574c53f-a694-4fe7-a2f3-c2292bf727c1\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-8rcsc" Dec 06 05:53:48 crc kubenswrapper[4733]: I1206 05:53:48.201053 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/20f6b3c1-8c56-4603-8efc-c5aa7e3420cb-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-gw2n8\" (UID: \"20f6b3c1-8c56-4603-8efc-c5aa7e3420cb\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-gw2n8" Dec 06 05:53:48 crc kubenswrapper[4733]: I1206 05:53:48.201123 4733 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-9g6cm\" (UniqueName: \"kubernetes.io/projected/20f6b3c1-8c56-4603-8efc-c5aa7e3420cb-kube-api-access-9g6cm\") pod \"nmstate-webhook-5f6d4c5ccb-gw2n8\" (UID: \"20f6b3c1-8c56-4603-8efc-c5aa7e3420cb\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-gw2n8" Dec 06 05:53:48 crc kubenswrapper[4733]: I1206 05:53:48.201197 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/09c3616e-19c4-452c-bf98-beceb8b8ed42-oauth-serving-cert\") pod \"console-64f7cd9bf9-5f2g2\" (UID: \"09c3616e-19c4-452c-bf98-beceb8b8ed42\") " pod="openshift-console/console-64f7cd9bf9-5f2g2" Dec 06 05:53:48 crc kubenswrapper[4733]: I1206 05:53:48.201261 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/09c3616e-19c4-452c-bf98-beceb8b8ed42-console-serving-cert\") pod \"console-64f7cd9bf9-5f2g2\" (UID: \"09c3616e-19c4-452c-bf98-beceb8b8ed42\") " pod="openshift-console/console-64f7cd9bf9-5f2g2" Dec 06 05:53:48 crc kubenswrapper[4733]: I1206 05:53:48.201359 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jdtz\" (UniqueName: \"kubernetes.io/projected/09c3616e-19c4-452c-bf98-beceb8b8ed42-kube-api-access-8jdtz\") pod \"console-64f7cd9bf9-5f2g2\" (UID: \"09c3616e-19c4-452c-bf98-beceb8b8ed42\") " pod="openshift-console/console-64f7cd9bf9-5f2g2" Dec 06 05:53:48 crc kubenswrapper[4733]: I1206 05:53:48.201133 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/a88cd06a-4aaf-4fcd-984e-9839be379e86-nmstate-lock\") pod \"nmstate-handler-lwn4x\" (UID: \"a88cd06a-4aaf-4fcd-984e-9839be379e86\") " pod="openshift-nmstate/nmstate-handler-lwn4x" Dec 06 05:53:48 crc kubenswrapper[4733]: I1206 
05:53:48.201574 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/a88cd06a-4aaf-4fcd-984e-9839be379e86-dbus-socket\") pod \"nmstate-handler-lwn4x\" (UID: \"a88cd06a-4aaf-4fcd-984e-9839be379e86\") " pod="openshift-nmstate/nmstate-handler-lwn4x" Dec 06 05:53:48 crc kubenswrapper[4733]: I1206 05:53:48.201801 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/a88cd06a-4aaf-4fcd-984e-9839be379e86-dbus-socket\") pod \"nmstate-handler-lwn4x\" (UID: \"a88cd06a-4aaf-4fcd-984e-9839be379e86\") " pod="openshift-nmstate/nmstate-handler-lwn4x" Dec 06 05:53:48 crc kubenswrapper[4733]: I1206 05:53:48.201804 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/a88cd06a-4aaf-4fcd-984e-9839be379e86-ovs-socket\") pod \"nmstate-handler-lwn4x\" (UID: \"a88cd06a-4aaf-4fcd-984e-9839be379e86\") " pod="openshift-nmstate/nmstate-handler-lwn4x" Dec 06 05:53:48 crc kubenswrapper[4733]: I1206 05:53:48.201849 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/a88cd06a-4aaf-4fcd-984e-9839be379e86-ovs-socket\") pod \"nmstate-handler-lwn4x\" (UID: \"a88cd06a-4aaf-4fcd-984e-9839be379e86\") " pod="openshift-nmstate/nmstate-handler-lwn4x" Dec 06 05:53:48 crc kubenswrapper[4733]: I1206 05:53:48.201917 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmsxd\" (UniqueName: \"kubernetes.io/projected/a88cd06a-4aaf-4fcd-984e-9839be379e86-kube-api-access-kmsxd\") pod \"nmstate-handler-lwn4x\" (UID: \"a88cd06a-4aaf-4fcd-984e-9839be379e86\") " pod="openshift-nmstate/nmstate-handler-lwn4x" Dec 06 05:53:48 crc kubenswrapper[4733]: I1206 05:53:48.201959 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/09c3616e-19c4-452c-bf98-beceb8b8ed42-service-ca\") pod \"console-64f7cd9bf9-5f2g2\" (UID: \"09c3616e-19c4-452c-bf98-beceb8b8ed42\") " pod="openshift-console/console-64f7cd9bf9-5f2g2" Dec 06 05:53:48 crc kubenswrapper[4733]: I1206 05:53:48.206889 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/20f6b3c1-8c56-4603-8efc-c5aa7e3420cb-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-gw2n8\" (UID: \"20f6b3c1-8c56-4603-8efc-c5aa7e3420cb\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-gw2n8" Dec 06 05:53:48 crc kubenswrapper[4733]: I1206 05:53:48.207412 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/9574c53f-a694-4fe7-a2f3-c2292bf727c1-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-8rcsc\" (UID: \"9574c53f-a694-4fe7-a2f3-c2292bf727c1\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-8rcsc" Dec 06 05:53:48 crc kubenswrapper[4733]: I1206 05:53:48.212993 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/9574c53f-a694-4fe7-a2f3-c2292bf727c1-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-8rcsc\" (UID: \"9574c53f-a694-4fe7-a2f3-c2292bf727c1\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-8rcsc" Dec 06 05:53:48 crc kubenswrapper[4733]: I1206 05:53:48.221757 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gsr2\" (UniqueName: \"kubernetes.io/projected/9574c53f-a694-4fe7-a2f3-c2292bf727c1-kube-api-access-2gsr2\") pod \"nmstate-console-plugin-7fbb5f6569-8rcsc\" (UID: \"9574c53f-a694-4fe7-a2f3-c2292bf727c1\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-8rcsc" Dec 06 05:53:48 crc kubenswrapper[4733]: I1206 05:53:48.230823 4733 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-kmsxd\" (UniqueName: \"kubernetes.io/projected/a88cd06a-4aaf-4fcd-984e-9839be379e86-kube-api-access-kmsxd\") pod \"nmstate-handler-lwn4x\" (UID: \"a88cd06a-4aaf-4fcd-984e-9839be379e86\") " pod="openshift-nmstate/nmstate-handler-lwn4x" Dec 06 05:53:48 crc kubenswrapper[4733]: I1206 05:53:48.234088 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9g6cm\" (UniqueName: \"kubernetes.io/projected/20f6b3c1-8c56-4603-8efc-c5aa7e3420cb-kube-api-access-9g6cm\") pod \"nmstate-webhook-5f6d4c5ccb-gw2n8\" (UID: \"20f6b3c1-8c56-4603-8efc-c5aa7e3420cb\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-gw2n8" Dec 06 05:53:48 crc kubenswrapper[4733]: I1206 05:53:48.286467 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-8rcsc" Dec 06 05:53:48 crc kubenswrapper[4733]: I1206 05:53:48.306927 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/09c3616e-19c4-452c-bf98-beceb8b8ed42-console-oauth-config\") pod \"console-64f7cd9bf9-5f2g2\" (UID: \"09c3616e-19c4-452c-bf98-beceb8b8ed42\") " pod="openshift-console/console-64f7cd9bf9-5f2g2" Dec 06 05:53:48 crc kubenswrapper[4733]: I1206 05:53:48.307597 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/09c3616e-19c4-452c-bf98-beceb8b8ed42-console-config\") pod \"console-64f7cd9bf9-5f2g2\" (UID: \"09c3616e-19c4-452c-bf98-beceb8b8ed42\") " pod="openshift-console/console-64f7cd9bf9-5f2g2" Dec 06 05:53:48 crc kubenswrapper[4733]: I1206 05:53:48.307629 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09c3616e-19c4-452c-bf98-beceb8b8ed42-trusted-ca-bundle\") pod \"console-64f7cd9bf9-5f2g2\" (UID: 
\"09c3616e-19c4-452c-bf98-beceb8b8ed42\") " pod="openshift-console/console-64f7cd9bf9-5f2g2" Dec 06 05:53:48 crc kubenswrapper[4733]: I1206 05:53:48.307685 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/09c3616e-19c4-452c-bf98-beceb8b8ed42-oauth-serving-cert\") pod \"console-64f7cd9bf9-5f2g2\" (UID: \"09c3616e-19c4-452c-bf98-beceb8b8ed42\") " pod="openshift-console/console-64f7cd9bf9-5f2g2" Dec 06 05:53:48 crc kubenswrapper[4733]: I1206 05:53:48.307704 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/09c3616e-19c4-452c-bf98-beceb8b8ed42-console-serving-cert\") pod \"console-64f7cd9bf9-5f2g2\" (UID: \"09c3616e-19c4-452c-bf98-beceb8b8ed42\") " pod="openshift-console/console-64f7cd9bf9-5f2g2" Dec 06 05:53:48 crc kubenswrapper[4733]: I1206 05:53:48.307722 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jdtz\" (UniqueName: \"kubernetes.io/projected/09c3616e-19c4-452c-bf98-beceb8b8ed42-kube-api-access-8jdtz\") pod \"console-64f7cd9bf9-5f2g2\" (UID: \"09c3616e-19c4-452c-bf98-beceb8b8ed42\") " pod="openshift-console/console-64f7cd9bf9-5f2g2" Dec 06 05:53:48 crc kubenswrapper[4733]: I1206 05:53:48.307788 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/09c3616e-19c4-452c-bf98-beceb8b8ed42-service-ca\") pod \"console-64f7cd9bf9-5f2g2\" (UID: \"09c3616e-19c4-452c-bf98-beceb8b8ed42\") " pod="openshift-console/console-64f7cd9bf9-5f2g2" Dec 06 05:53:48 crc kubenswrapper[4733]: I1206 05:53:48.308583 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/09c3616e-19c4-452c-bf98-beceb8b8ed42-service-ca\") pod \"console-64f7cd9bf9-5f2g2\" (UID: \"09c3616e-19c4-452c-bf98-beceb8b8ed42\") " 
pod="openshift-console/console-64f7cd9bf9-5f2g2" Dec 06 05:53:48 crc kubenswrapper[4733]: I1206 05:53:48.308813 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/09c3616e-19c4-452c-bf98-beceb8b8ed42-oauth-serving-cert\") pod \"console-64f7cd9bf9-5f2g2\" (UID: \"09c3616e-19c4-452c-bf98-beceb8b8ed42\") " pod="openshift-console/console-64f7cd9bf9-5f2g2" Dec 06 05:53:48 crc kubenswrapper[4733]: I1206 05:53:48.310002 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/09c3616e-19c4-452c-bf98-beceb8b8ed42-console-config\") pod \"console-64f7cd9bf9-5f2g2\" (UID: \"09c3616e-19c4-452c-bf98-beceb8b8ed42\") " pod="openshift-console/console-64f7cd9bf9-5f2g2" Dec 06 05:53:48 crc kubenswrapper[4733]: I1206 05:53:48.310428 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09c3616e-19c4-452c-bf98-beceb8b8ed42-trusted-ca-bundle\") pod \"console-64f7cd9bf9-5f2g2\" (UID: \"09c3616e-19c4-452c-bf98-beceb8b8ed42\") " pod="openshift-console/console-64f7cd9bf9-5f2g2" Dec 06 05:53:48 crc kubenswrapper[4733]: I1206 05:53:48.311267 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/09c3616e-19c4-452c-bf98-beceb8b8ed42-console-oauth-config\") pod \"console-64f7cd9bf9-5f2g2\" (UID: \"09c3616e-19c4-452c-bf98-beceb8b8ed42\") " pod="openshift-console/console-64f7cd9bf9-5f2g2" Dec 06 05:53:48 crc kubenswrapper[4733]: I1206 05:53:48.317247 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/09c3616e-19c4-452c-bf98-beceb8b8ed42-console-serving-cert\") pod \"console-64f7cd9bf9-5f2g2\" (UID: \"09c3616e-19c4-452c-bf98-beceb8b8ed42\") " pod="openshift-console/console-64f7cd9bf9-5f2g2" Dec 06 
05:53:48 crc kubenswrapper[4733]: I1206 05:53:48.327499 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jdtz\" (UniqueName: \"kubernetes.io/projected/09c3616e-19c4-452c-bf98-beceb8b8ed42-kube-api-access-8jdtz\") pod \"console-64f7cd9bf9-5f2g2\" (UID: \"09c3616e-19c4-452c-bf98-beceb8b8ed42\") " pod="openshift-console/console-64f7cd9bf9-5f2g2" Dec 06 05:53:48 crc kubenswrapper[4733]: I1206 05:53:48.344033 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-qbfxz"] Dec 06 05:53:48 crc kubenswrapper[4733]: I1206 05:53:48.440175 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-8rcsc"] Dec 06 05:53:48 crc kubenswrapper[4733]: W1206 05:53:48.441219 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9574c53f_a694_4fe7_a2f3_c2292bf727c1.slice/crio-eeda78ddcf9a779bdfcb88c86ebc9ad391c7586da208fa211d2e4ff7b8260c65 WatchSource:0}: Error finding container eeda78ddcf9a779bdfcb88c86ebc9ad391c7586da208fa211d2e4ff7b8260c65: Status 404 returned error can't find the container with id eeda78ddcf9a779bdfcb88c86ebc9ad391c7586da208fa211d2e4ff7b8260c65 Dec 06 05:53:48 crc kubenswrapper[4733]: I1206 05:53:48.483243 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-64f7cd9bf9-5f2g2" Dec 06 05:53:48 crc kubenswrapper[4733]: I1206 05:53:48.509241 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-gw2n8" Dec 06 05:53:48 crc kubenswrapper[4733]: I1206 05:53:48.514999 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-lwn4x" Dec 06 05:53:48 crc kubenswrapper[4733]: W1206 05:53:48.547852 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda88cd06a_4aaf_4fcd_984e_9839be379e86.slice/crio-7d41851b3dfbf5944d965ba4bcfa0ba4163199c9c9af9acbe03b9b2678a89fb0 WatchSource:0}: Error finding container 7d41851b3dfbf5944d965ba4bcfa0ba4163199c9c9af9acbe03b9b2678a89fb0: Status 404 returned error can't find the container with id 7d41851b3dfbf5944d965ba4bcfa0ba4163199c9c9af9acbe03b9b2678a89fb0 Dec 06 05:53:48 crc kubenswrapper[4733]: I1206 05:53:48.669753 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-64f7cd9bf9-5f2g2"] Dec 06 05:53:48 crc kubenswrapper[4733]: I1206 05:53:48.678878 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-8rcsc" event={"ID":"9574c53f-a694-4fe7-a2f3-c2292bf727c1","Type":"ContainerStarted","Data":"eeda78ddcf9a779bdfcb88c86ebc9ad391c7586da208fa211d2e4ff7b8260c65"} Dec 06 05:53:48 crc kubenswrapper[4733]: I1206 05:53:48.680436 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-qbfxz" event={"ID":"4b3f3780-1188-4d8c-b369-f01efb0060ae","Type":"ContainerStarted","Data":"f306f5dec504448a2359c2166bf42ddd825b476fda29f01822302ffc28fe85ec"} Dec 06 05:53:48 crc kubenswrapper[4733]: I1206 05:53:48.681658 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-lwn4x" event={"ID":"a88cd06a-4aaf-4fcd-984e-9839be379e86","Type":"ContainerStarted","Data":"7d41851b3dfbf5944d965ba4bcfa0ba4163199c9c9af9acbe03b9b2678a89fb0"} Dec 06 05:53:48 crc kubenswrapper[4733]: I1206 05:53:48.695701 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-gw2n8"] Dec 06 05:53:48 crc kubenswrapper[4733]: W1206 05:53:48.701011 4733 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20f6b3c1_8c56_4603_8efc_c5aa7e3420cb.slice/crio-d5630aae664915efba55c29af48500c3174a6712327be33a175965b51067da14 WatchSource:0}: Error finding container d5630aae664915efba55c29af48500c3174a6712327be33a175965b51067da14: Status 404 returned error can't find the container with id d5630aae664915efba55c29af48500c3174a6712327be33a175965b51067da14 Dec 06 05:53:49 crc kubenswrapper[4733]: I1206 05:53:49.690641 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-gw2n8" event={"ID":"20f6b3c1-8c56-4603-8efc-c5aa7e3420cb","Type":"ContainerStarted","Data":"d5630aae664915efba55c29af48500c3174a6712327be33a175965b51067da14"} Dec 06 05:53:49 crc kubenswrapper[4733]: I1206 05:53:49.692701 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64f7cd9bf9-5f2g2" event={"ID":"09c3616e-19c4-452c-bf98-beceb8b8ed42","Type":"ContainerStarted","Data":"d8f8f178689c402c939b022d552e4047a0b570d405904070b9b4ff8ee2079b64"} Dec 06 05:53:49 crc kubenswrapper[4733]: I1206 05:53:49.692808 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64f7cd9bf9-5f2g2" event={"ID":"09c3616e-19c4-452c-bf98-beceb8b8ed42","Type":"ContainerStarted","Data":"4980cf55070973b3ff128602398a8a9322197300eadbda51b8781a3a900098d3"} Dec 06 05:53:49 crc kubenswrapper[4733]: I1206 05:53:49.713505 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-64f7cd9bf9-5f2g2" podStartSLOduration=1.7134845730000001 podStartE2EDuration="1.713484573s" podCreationTimestamp="2025-12-06 05:53:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:53:49.708812969 +0000 UTC m=+613.574024070" watchObservedRunningTime="2025-12-06 05:53:49.713484573 +0000 UTC 
m=+613.578695684" Dec 06 05:53:51 crc kubenswrapper[4733]: I1206 05:53:51.706604 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-8rcsc" event={"ID":"9574c53f-a694-4fe7-a2f3-c2292bf727c1","Type":"ContainerStarted","Data":"01fc118a216d52942872ad891be297dd9245152b9dd22387c67ce2c33cbe845f"} Dec 06 05:53:51 crc kubenswrapper[4733]: I1206 05:53:51.708899 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-lwn4x" event={"ID":"a88cd06a-4aaf-4fcd-984e-9839be379e86","Type":"ContainerStarted","Data":"0a3a20664acf2d0bebaf72651e8031217594ed0fc015bccc7c47ab24eea27903"} Dec 06 05:53:51 crc kubenswrapper[4733]: I1206 05:53:51.709046 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-lwn4x" Dec 06 05:53:51 crc kubenswrapper[4733]: I1206 05:53:51.710803 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-gw2n8" event={"ID":"20f6b3c1-8c56-4603-8efc-c5aa7e3420cb","Type":"ContainerStarted","Data":"fb15044a69fe476fcbdf139d87f8f94525b5d61088d9e99b08d5d47cda96a065"} Dec 06 05:53:51 crc kubenswrapper[4733]: I1206 05:53:51.710961 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-gw2n8" Dec 06 05:53:51 crc kubenswrapper[4733]: I1206 05:53:51.712155 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-qbfxz" event={"ID":"4b3f3780-1188-4d8c-b369-f01efb0060ae","Type":"ContainerStarted","Data":"90179b0843ed55e6796828025aea451d6d4f25e75d9afaffc63fd0eb1eb78c68"} Dec 06 05:53:51 crc kubenswrapper[4733]: I1206 05:53:51.724245 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-8rcsc" podStartSLOduration=1.9591026679999999 podStartE2EDuration="4.724229284s" podCreationTimestamp="2025-12-06 
05:53:47 +0000 UTC" firstStartedPulling="2025-12-06 05:53:48.443833362 +0000 UTC m=+612.309044473" lastFinishedPulling="2025-12-06 05:53:51.208959977 +0000 UTC m=+615.074171089" observedRunningTime="2025-12-06 05:53:51.722480574 +0000 UTC m=+615.587691686" watchObservedRunningTime="2025-12-06 05:53:51.724229284 +0000 UTC m=+615.589440395" Dec 06 05:53:51 crc kubenswrapper[4733]: I1206 05:53:51.747350 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-gw2n8" podStartSLOduration=2.218805219 podStartE2EDuration="4.747314367s" podCreationTimestamp="2025-12-06 05:53:47 +0000 UTC" firstStartedPulling="2025-12-06 05:53:48.703018871 +0000 UTC m=+612.568229982" lastFinishedPulling="2025-12-06 05:53:51.231528018 +0000 UTC m=+615.096739130" observedRunningTime="2025-12-06 05:53:51.741629008 +0000 UTC m=+615.606840119" watchObservedRunningTime="2025-12-06 05:53:51.747314367 +0000 UTC m=+615.612525478" Dec 06 05:53:51 crc kubenswrapper[4733]: I1206 05:53:51.754854 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-lwn4x" podStartSLOduration=2.076358929 podStartE2EDuration="4.754835658s" podCreationTimestamp="2025-12-06 05:53:47 +0000 UTC" firstStartedPulling="2025-12-06 05:53:48.551010622 +0000 UTC m=+612.416221734" lastFinishedPulling="2025-12-06 05:53:51.229487352 +0000 UTC m=+615.094698463" observedRunningTime="2025-12-06 05:53:51.753439703 +0000 UTC m=+615.618650815" watchObservedRunningTime="2025-12-06 05:53:51.754835658 +0000 UTC m=+615.620046769" Dec 06 05:53:53 crc kubenswrapper[4733]: I1206 05:53:53.724200 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-qbfxz" event={"ID":"4b3f3780-1188-4d8c-b369-f01efb0060ae","Type":"ContainerStarted","Data":"20d326f0ec1624ec2b737089d5e71a3144311e74683f07c9efc7bc771acbd080"} Dec 06 05:53:53 crc kubenswrapper[4733]: I1206 05:53:53.737174 4733 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-qbfxz" podStartSLOduration=1.4911544939999999 podStartE2EDuration="6.737158982s" podCreationTimestamp="2025-12-06 05:53:47 +0000 UTC" firstStartedPulling="2025-12-06 05:53:48.348769578 +0000 UTC m=+612.213980689" lastFinishedPulling="2025-12-06 05:53:53.594774066 +0000 UTC m=+617.459985177" observedRunningTime="2025-12-06 05:53:53.734593078 +0000 UTC m=+617.599804178" watchObservedRunningTime="2025-12-06 05:53:53.737158982 +0000 UTC m=+617.602370093" Dec 06 05:53:58 crc kubenswrapper[4733]: I1206 05:53:58.483514 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-64f7cd9bf9-5f2g2" Dec 06 05:53:58 crc kubenswrapper[4733]: I1206 05:53:58.491648 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-64f7cd9bf9-5f2g2" Dec 06 05:53:58 crc kubenswrapper[4733]: I1206 05:53:58.491769 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-64f7cd9bf9-5f2g2" Dec 06 05:53:58 crc kubenswrapper[4733]: I1206 05:53:58.540232 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-lwn4x" Dec 06 05:53:58 crc kubenswrapper[4733]: I1206 05:53:58.756415 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-64f7cd9bf9-5f2g2" Dec 06 05:53:58 crc kubenswrapper[4733]: I1206 05:53:58.793318 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-tbqkj"] Dec 06 05:54:08 crc kubenswrapper[4733]: I1206 05:54:08.514040 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-gw2n8" Dec 06 05:54:12 crc kubenswrapper[4733]: I1206 05:54:12.989377 4733 patch_prober.go:28] interesting pod/machine-config-daemon-g7qjx 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 05:54:12 crc kubenswrapper[4733]: I1206 05:54:12.989850 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 05:54:12 crc kubenswrapper[4733]: I1206 05:54:12.989894 4733 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" Dec 06 05:54:12 crc kubenswrapper[4733]: I1206 05:54:12.990366 4733 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1947fc402b33dfad60aaf16335ae0cdb84ceaf24cd429e84ae81d03765f6da10"} pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 05:54:12 crc kubenswrapper[4733]: I1206 05:54:12.990416 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerName="machine-config-daemon" containerID="cri-o://1947fc402b33dfad60aaf16335ae0cdb84ceaf24cd429e84ae81d03765f6da10" gracePeriod=600 Dec 06 05:54:13 crc kubenswrapper[4733]: I1206 05:54:13.849630 4733 generic.go:334] "Generic (PLEG): container finished" podID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerID="1947fc402b33dfad60aaf16335ae0cdb84ceaf24cd429e84ae81d03765f6da10" exitCode=0 Dec 06 05:54:13 crc kubenswrapper[4733]: I1206 05:54:13.850187 4733 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" event={"ID":"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448","Type":"ContainerDied","Data":"1947fc402b33dfad60aaf16335ae0cdb84ceaf24cd429e84ae81d03765f6da10"} Dec 06 05:54:13 crc kubenswrapper[4733]: I1206 05:54:13.850221 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" event={"ID":"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448","Type":"ContainerStarted","Data":"b3765b8a99d4ffd713a8095a13f219f1dd20e90b6c9d92ac7d89fa928662bfb0"} Dec 06 05:54:13 crc kubenswrapper[4733]: I1206 05:54:13.850241 4733 scope.go:117] "RemoveContainer" containerID="50470b50bca695b7d51dc24f892cb10e96f186fcba10fdad5ebd2bd169d01d77" Dec 06 05:54:18 crc kubenswrapper[4733]: I1206 05:54:18.677787 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wbs6g"] Dec 06 05:54:18 crc kubenswrapper[4733]: I1206 05:54:18.679274 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wbs6g" Dec 06 05:54:18 crc kubenswrapper[4733]: I1206 05:54:18.680917 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 06 05:54:18 crc kubenswrapper[4733]: I1206 05:54:18.686858 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wbs6g"] Dec 06 05:54:18 crc kubenswrapper[4733]: I1206 05:54:18.766264 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cbcb89dd-b5ba-4b72-9a34-24048c6b7275-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wbs6g\" (UID: \"cbcb89dd-b5ba-4b72-9a34-24048c6b7275\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wbs6g" Dec 06 05:54:18 crc kubenswrapper[4733]: I1206 05:54:18.766347 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cbcb89dd-b5ba-4b72-9a34-24048c6b7275-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wbs6g\" (UID: \"cbcb89dd-b5ba-4b72-9a34-24048c6b7275\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wbs6g" Dec 06 05:54:18 crc kubenswrapper[4733]: I1206 05:54:18.766582 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cx6rh\" (UniqueName: \"kubernetes.io/projected/cbcb89dd-b5ba-4b72-9a34-24048c6b7275-kube-api-access-cx6rh\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wbs6g\" (UID: \"cbcb89dd-b5ba-4b72-9a34-24048c6b7275\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wbs6g" Dec 06 05:54:18 crc kubenswrapper[4733]: 
I1206 05:54:18.867621 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cbcb89dd-b5ba-4b72-9a34-24048c6b7275-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wbs6g\" (UID: \"cbcb89dd-b5ba-4b72-9a34-24048c6b7275\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wbs6g" Dec 06 05:54:18 crc kubenswrapper[4733]: I1206 05:54:18.867687 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cbcb89dd-b5ba-4b72-9a34-24048c6b7275-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wbs6g\" (UID: \"cbcb89dd-b5ba-4b72-9a34-24048c6b7275\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wbs6g" Dec 06 05:54:18 crc kubenswrapper[4733]: I1206 05:54:18.867766 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cx6rh\" (UniqueName: \"kubernetes.io/projected/cbcb89dd-b5ba-4b72-9a34-24048c6b7275-kube-api-access-cx6rh\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wbs6g\" (UID: \"cbcb89dd-b5ba-4b72-9a34-24048c6b7275\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wbs6g" Dec 06 05:54:18 crc kubenswrapper[4733]: I1206 05:54:18.868380 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cbcb89dd-b5ba-4b72-9a34-24048c6b7275-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wbs6g\" (UID: \"cbcb89dd-b5ba-4b72-9a34-24048c6b7275\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wbs6g" Dec 06 05:54:18 crc kubenswrapper[4733]: I1206 05:54:18.868441 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/cbcb89dd-b5ba-4b72-9a34-24048c6b7275-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wbs6g\" (UID: \"cbcb89dd-b5ba-4b72-9a34-24048c6b7275\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wbs6g" Dec 06 05:54:18 crc kubenswrapper[4733]: I1206 05:54:18.889761 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cx6rh\" (UniqueName: \"kubernetes.io/projected/cbcb89dd-b5ba-4b72-9a34-24048c6b7275-kube-api-access-cx6rh\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wbs6g\" (UID: \"cbcb89dd-b5ba-4b72-9a34-24048c6b7275\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wbs6g" Dec 06 05:54:18 crc kubenswrapper[4733]: I1206 05:54:18.993166 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wbs6g" Dec 06 05:54:19 crc kubenswrapper[4733]: I1206 05:54:19.380421 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wbs6g"] Dec 06 05:54:19 crc kubenswrapper[4733]: I1206 05:54:19.896998 4733 generic.go:334] "Generic (PLEG): container finished" podID="cbcb89dd-b5ba-4b72-9a34-24048c6b7275" containerID="874e28d16d4ec9a025bcedaf9df8d7e3d85129f165644c849bc4cc873751fbc0" exitCode=0 Dec 06 05:54:19 crc kubenswrapper[4733]: I1206 05:54:19.897120 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wbs6g" event={"ID":"cbcb89dd-b5ba-4b72-9a34-24048c6b7275","Type":"ContainerDied","Data":"874e28d16d4ec9a025bcedaf9df8d7e3d85129f165644c849bc4cc873751fbc0"} Dec 06 05:54:19 crc kubenswrapper[4733]: I1206 05:54:19.897467 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wbs6g" event={"ID":"cbcb89dd-b5ba-4b72-9a34-24048c6b7275","Type":"ContainerStarted","Data":"79d906760d7e3bd440ea3b116c195ebc0b1775ee8f4a4d8b00378106f7141474"} Dec 06 05:54:21 crc kubenswrapper[4733]: I1206 05:54:21.912051 4733 generic.go:334] "Generic (PLEG): container finished" podID="cbcb89dd-b5ba-4b72-9a34-24048c6b7275" containerID="38cfaf5e83e9d058123b7a2eefc8ef428ba5a99a8d7a2382985418daf241e74a" exitCode=0 Dec 06 05:54:21 crc kubenswrapper[4733]: I1206 05:54:21.912146 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wbs6g" event={"ID":"cbcb89dd-b5ba-4b72-9a34-24048c6b7275","Type":"ContainerDied","Data":"38cfaf5e83e9d058123b7a2eefc8ef428ba5a99a8d7a2382985418daf241e74a"} Dec 06 05:54:22 crc kubenswrapper[4733]: I1206 05:54:22.928699 4733 generic.go:334] "Generic (PLEG): container finished" podID="cbcb89dd-b5ba-4b72-9a34-24048c6b7275" containerID="6f576639065bd8f6dc1582a8c78a5b1ef5ba0281835cd0ed6aa7ec3c4e002bd5" exitCode=0 Dec 06 05:54:22 crc kubenswrapper[4733]: I1206 05:54:22.928743 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wbs6g" event={"ID":"cbcb89dd-b5ba-4b72-9a34-24048c6b7275","Type":"ContainerDied","Data":"6f576639065bd8f6dc1582a8c78a5b1ef5ba0281835cd0ed6aa7ec3c4e002bd5"} Dec 06 05:54:23 crc kubenswrapper[4733]: I1206 05:54:23.820114 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-tbqkj" podUID="17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832" containerName="console" containerID="cri-o://3aeb61aef6c38a5adfd5e8a912e1b5c08eecd8198c12bb2fc69154b083c9fba8" gracePeriod=15 Dec 06 05:54:24 crc kubenswrapper[4733]: I1206 05:54:24.141565 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wbs6g" Dec 06 05:54:24 crc kubenswrapper[4733]: I1206 05:54:24.171798 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-tbqkj_17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832/console/0.log" Dec 06 05:54:24 crc kubenswrapper[4733]: I1206 05:54:24.171877 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-tbqkj" Dec 06 05:54:24 crc kubenswrapper[4733]: I1206 05:54:24.341463 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cbcb89dd-b5ba-4b72-9a34-24048c6b7275-util\") pod \"cbcb89dd-b5ba-4b72-9a34-24048c6b7275\" (UID: \"cbcb89dd-b5ba-4b72-9a34-24048c6b7275\") " Dec 06 05:54:24 crc kubenswrapper[4733]: I1206 05:54:24.341517 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832-oauth-serving-cert\") pod \"17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832\" (UID: \"17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832\") " Dec 06 05:54:24 crc kubenswrapper[4733]: I1206 05:54:24.341560 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832-console-oauth-config\") pod \"17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832\" (UID: \"17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832\") " Dec 06 05:54:24 crc kubenswrapper[4733]: I1206 05:54:24.341596 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832-service-ca\") pod \"17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832\" (UID: \"17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832\") " Dec 06 05:54:24 crc kubenswrapper[4733]: I1206 05:54:24.341625 
4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cbcb89dd-b5ba-4b72-9a34-24048c6b7275-bundle\") pod \"cbcb89dd-b5ba-4b72-9a34-24048c6b7275\" (UID: \"cbcb89dd-b5ba-4b72-9a34-24048c6b7275\") " Dec 06 05:54:24 crc kubenswrapper[4733]: I1206 05:54:24.341662 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sd2nf\" (UniqueName: \"kubernetes.io/projected/17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832-kube-api-access-sd2nf\") pod \"17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832\" (UID: \"17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832\") " Dec 06 05:54:24 crc kubenswrapper[4733]: I1206 05:54:24.341756 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cx6rh\" (UniqueName: \"kubernetes.io/projected/cbcb89dd-b5ba-4b72-9a34-24048c6b7275-kube-api-access-cx6rh\") pod \"cbcb89dd-b5ba-4b72-9a34-24048c6b7275\" (UID: \"cbcb89dd-b5ba-4b72-9a34-24048c6b7275\") " Dec 06 05:54:24 crc kubenswrapper[4733]: I1206 05:54:24.341775 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832-trusted-ca-bundle\") pod \"17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832\" (UID: \"17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832\") " Dec 06 05:54:24 crc kubenswrapper[4733]: I1206 05:54:24.341814 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832-console-config\") pod \"17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832\" (UID: \"17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832\") " Dec 06 05:54:24 crc kubenswrapper[4733]: I1206 05:54:24.341871 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832-console-serving-cert\") pod 
\"17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832\" (UID: \"17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832\") " Dec 06 05:54:24 crc kubenswrapper[4733]: I1206 05:54:24.342900 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832-service-ca" (OuterVolumeSpecName: "service-ca") pod "17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832" (UID: "17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:54:24 crc kubenswrapper[4733]: I1206 05:54:24.343110 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832" (UID: "17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:54:24 crc kubenswrapper[4733]: I1206 05:54:24.343183 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832-console-config" (OuterVolumeSpecName: "console-config") pod "17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832" (UID: "17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:54:24 crc kubenswrapper[4733]: I1206 05:54:24.343289 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832" (UID: "17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:54:24 crc kubenswrapper[4733]: I1206 05:54:24.343687 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbcb89dd-b5ba-4b72-9a34-24048c6b7275-bundle" (OuterVolumeSpecName: "bundle") pod "cbcb89dd-b5ba-4b72-9a34-24048c6b7275" (UID: "cbcb89dd-b5ba-4b72-9a34-24048c6b7275"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 05:54:24 crc kubenswrapper[4733]: I1206 05:54:24.348588 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbcb89dd-b5ba-4b72-9a34-24048c6b7275-kube-api-access-cx6rh" (OuterVolumeSpecName: "kube-api-access-cx6rh") pod "cbcb89dd-b5ba-4b72-9a34-24048c6b7275" (UID: "cbcb89dd-b5ba-4b72-9a34-24048c6b7275"). InnerVolumeSpecName "kube-api-access-cx6rh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:54:24 crc kubenswrapper[4733]: I1206 05:54:24.348955 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832-kube-api-access-sd2nf" (OuterVolumeSpecName: "kube-api-access-sd2nf") pod "17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832" (UID: "17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832"). InnerVolumeSpecName "kube-api-access-sd2nf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:54:24 crc kubenswrapper[4733]: I1206 05:54:24.348968 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832" (UID: "17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:54:24 crc kubenswrapper[4733]: I1206 05:54:24.349051 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832" (UID: "17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:54:24 crc kubenswrapper[4733]: I1206 05:54:24.352297 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbcb89dd-b5ba-4b72-9a34-24048c6b7275-util" (OuterVolumeSpecName: "util") pod "cbcb89dd-b5ba-4b72-9a34-24048c6b7275" (UID: "cbcb89dd-b5ba-4b72-9a34-24048c6b7275"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 05:54:24 crc kubenswrapper[4733]: I1206 05:54:24.442917 4733 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 05:54:24 crc kubenswrapper[4733]: I1206 05:54:24.442943 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cx6rh\" (UniqueName: \"kubernetes.io/projected/cbcb89dd-b5ba-4b72-9a34-24048c6b7275-kube-api-access-cx6rh\") on node \"crc\" DevicePath \"\"" Dec 06 05:54:24 crc kubenswrapper[4733]: I1206 05:54:24.442956 4733 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832-console-config\") on node \"crc\" DevicePath \"\"" Dec 06 05:54:24 crc kubenswrapper[4733]: I1206 05:54:24.442967 4733 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832-console-serving-cert\") on node 
\"crc\" DevicePath \"\"" Dec 06 05:54:24 crc kubenswrapper[4733]: I1206 05:54:24.442978 4733 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cbcb89dd-b5ba-4b72-9a34-24048c6b7275-util\") on node \"crc\" DevicePath \"\"" Dec 06 05:54:24 crc kubenswrapper[4733]: I1206 05:54:24.442986 4733 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 05:54:24 crc kubenswrapper[4733]: I1206 05:54:24.442994 4733 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 06 05:54:24 crc kubenswrapper[4733]: I1206 05:54:24.443003 4733 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832-service-ca\") on node \"crc\" DevicePath \"\"" Dec 06 05:54:24 crc kubenswrapper[4733]: I1206 05:54:24.443011 4733 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cbcb89dd-b5ba-4b72-9a34-24048c6b7275-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 05:54:24 crc kubenswrapper[4733]: I1206 05:54:24.443019 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sd2nf\" (UniqueName: \"kubernetes.io/projected/17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832-kube-api-access-sd2nf\") on node \"crc\" DevicePath \"\"" Dec 06 05:54:24 crc kubenswrapper[4733]: I1206 05:54:24.943189 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-tbqkj_17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832/console/0.log" Dec 06 05:54:24 crc kubenswrapper[4733]: I1206 05:54:24.943270 4733 generic.go:334] "Generic (PLEG): container finished" 
podID="17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832" containerID="3aeb61aef6c38a5adfd5e8a912e1b5c08eecd8198c12bb2fc69154b083c9fba8" exitCode=2 Dec 06 05:54:24 crc kubenswrapper[4733]: I1206 05:54:24.943380 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-tbqkj" Dec 06 05:54:24 crc kubenswrapper[4733]: I1206 05:54:24.943381 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-tbqkj" event={"ID":"17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832","Type":"ContainerDied","Data":"3aeb61aef6c38a5adfd5e8a912e1b5c08eecd8198c12bb2fc69154b083c9fba8"} Dec 06 05:54:24 crc kubenswrapper[4733]: I1206 05:54:24.943530 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-tbqkj" event={"ID":"17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832","Type":"ContainerDied","Data":"5bdfd9c5bfd805594e5003a7d4a847af823720ca7c0543d466a08d1270d93b33"} Dec 06 05:54:24 crc kubenswrapper[4733]: I1206 05:54:24.943569 4733 scope.go:117] "RemoveContainer" containerID="3aeb61aef6c38a5adfd5e8a912e1b5c08eecd8198c12bb2fc69154b083c9fba8" Dec 06 05:54:24 crc kubenswrapper[4733]: I1206 05:54:24.946151 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wbs6g" event={"ID":"cbcb89dd-b5ba-4b72-9a34-24048c6b7275","Type":"ContainerDied","Data":"79d906760d7e3bd440ea3b116c195ebc0b1775ee8f4a4d8b00378106f7141474"} Dec 06 05:54:24 crc kubenswrapper[4733]: I1206 05:54:24.946217 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79d906760d7e3bd440ea3b116c195ebc0b1775ee8f4a4d8b00378106f7141474" Dec 06 05:54:24 crc kubenswrapper[4733]: I1206 05:54:24.946243 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wbs6g" Dec 06 05:54:24 crc kubenswrapper[4733]: I1206 05:54:24.963283 4733 scope.go:117] "RemoveContainer" containerID="3aeb61aef6c38a5adfd5e8a912e1b5c08eecd8198c12bb2fc69154b083c9fba8" Dec 06 05:54:24 crc kubenswrapper[4733]: E1206 05:54:24.963788 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3aeb61aef6c38a5adfd5e8a912e1b5c08eecd8198c12bb2fc69154b083c9fba8\": container with ID starting with 3aeb61aef6c38a5adfd5e8a912e1b5c08eecd8198c12bb2fc69154b083c9fba8 not found: ID does not exist" containerID="3aeb61aef6c38a5adfd5e8a912e1b5c08eecd8198c12bb2fc69154b083c9fba8" Dec 06 05:54:24 crc kubenswrapper[4733]: I1206 05:54:24.963828 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-tbqkj"] Dec 06 05:54:24 crc kubenswrapper[4733]: I1206 05:54:24.963843 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3aeb61aef6c38a5adfd5e8a912e1b5c08eecd8198c12bb2fc69154b083c9fba8"} err="failed to get container status \"3aeb61aef6c38a5adfd5e8a912e1b5c08eecd8198c12bb2fc69154b083c9fba8\": rpc error: code = NotFound desc = could not find container \"3aeb61aef6c38a5adfd5e8a912e1b5c08eecd8198c12bb2fc69154b083c9fba8\": container with ID starting with 3aeb61aef6c38a5adfd5e8a912e1b5c08eecd8198c12bb2fc69154b083c9fba8 not found: ID does not exist" Dec 06 05:54:24 crc kubenswrapper[4733]: I1206 05:54:24.970970 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-tbqkj"] Dec 06 05:54:26 crc kubenswrapper[4733]: I1206 05:54:26.493318 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832" path="/var/lib/kubelet/pods/17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832/volumes" Dec 06 05:54:34 crc kubenswrapper[4733]: I1206 
05:54:34.363745 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-58dbb79b86-frmc6"] Dec 06 05:54:34 crc kubenswrapper[4733]: E1206 05:54:34.364534 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832" containerName="console" Dec 06 05:54:34 crc kubenswrapper[4733]: I1206 05:54:34.364549 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832" containerName="console" Dec 06 05:54:34 crc kubenswrapper[4733]: E1206 05:54:34.364564 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbcb89dd-b5ba-4b72-9a34-24048c6b7275" containerName="pull" Dec 06 05:54:34 crc kubenswrapper[4733]: I1206 05:54:34.364571 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbcb89dd-b5ba-4b72-9a34-24048c6b7275" containerName="pull" Dec 06 05:54:34 crc kubenswrapper[4733]: E1206 05:54:34.364581 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbcb89dd-b5ba-4b72-9a34-24048c6b7275" containerName="util" Dec 06 05:54:34 crc kubenswrapper[4733]: I1206 05:54:34.364586 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbcb89dd-b5ba-4b72-9a34-24048c6b7275" containerName="util" Dec 06 05:54:34 crc kubenswrapper[4733]: E1206 05:54:34.364612 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbcb89dd-b5ba-4b72-9a34-24048c6b7275" containerName="extract" Dec 06 05:54:34 crc kubenswrapper[4733]: I1206 05:54:34.364618 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbcb89dd-b5ba-4b72-9a34-24048c6b7275" containerName="extract" Dec 06 05:54:34 crc kubenswrapper[4733]: I1206 05:54:34.364715 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="17ed3dff-9a5f-4816-9d3f-9eaf1d7f5832" containerName="console" Dec 06 05:54:34 crc kubenswrapper[4733]: I1206 05:54:34.364726 4733 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="cbcb89dd-b5ba-4b72-9a34-24048c6b7275" containerName="extract" Dec 06 05:54:34 crc kubenswrapper[4733]: I1206 05:54:34.365152 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-58dbb79b86-frmc6" Dec 06 05:54:34 crc kubenswrapper[4733]: I1206 05:54:34.367559 4733 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Dec 06 05:54:34 crc kubenswrapper[4733]: I1206 05:54:34.367978 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Dec 06 05:54:34 crc kubenswrapper[4733]: I1206 05:54:34.368075 4733 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-6h429" Dec 06 05:54:34 crc kubenswrapper[4733]: I1206 05:54:34.368111 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Dec 06 05:54:34 crc kubenswrapper[4733]: I1206 05:54:34.368690 4733 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Dec 06 05:54:34 crc kubenswrapper[4733]: I1206 05:54:34.374099 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/652ac185-5c87-4354-9f1d-0c103702a926-apiservice-cert\") pod \"metallb-operator-controller-manager-58dbb79b86-frmc6\" (UID: \"652ac185-5c87-4354-9f1d-0c103702a926\") " pod="metallb-system/metallb-operator-controller-manager-58dbb79b86-frmc6" Dec 06 05:54:34 crc kubenswrapper[4733]: I1206 05:54:34.374144 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjg89\" (UniqueName: \"kubernetes.io/projected/652ac185-5c87-4354-9f1d-0c103702a926-kube-api-access-mjg89\") pod 
\"metallb-operator-controller-manager-58dbb79b86-frmc6\" (UID: \"652ac185-5c87-4354-9f1d-0c103702a926\") " pod="metallb-system/metallb-operator-controller-manager-58dbb79b86-frmc6" Dec 06 05:54:34 crc kubenswrapper[4733]: I1206 05:54:34.374175 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/652ac185-5c87-4354-9f1d-0c103702a926-webhook-cert\") pod \"metallb-operator-controller-manager-58dbb79b86-frmc6\" (UID: \"652ac185-5c87-4354-9f1d-0c103702a926\") " pod="metallb-system/metallb-operator-controller-manager-58dbb79b86-frmc6" Dec 06 05:54:34 crc kubenswrapper[4733]: I1206 05:54:34.380028 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-58dbb79b86-frmc6"] Dec 06 05:54:34 crc kubenswrapper[4733]: I1206 05:54:34.475579 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/652ac185-5c87-4354-9f1d-0c103702a926-apiservice-cert\") pod \"metallb-operator-controller-manager-58dbb79b86-frmc6\" (UID: \"652ac185-5c87-4354-9f1d-0c103702a926\") " pod="metallb-system/metallb-operator-controller-manager-58dbb79b86-frmc6" Dec 06 05:54:34 crc kubenswrapper[4733]: I1206 05:54:34.475635 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjg89\" (UniqueName: \"kubernetes.io/projected/652ac185-5c87-4354-9f1d-0c103702a926-kube-api-access-mjg89\") pod \"metallb-operator-controller-manager-58dbb79b86-frmc6\" (UID: \"652ac185-5c87-4354-9f1d-0c103702a926\") " pod="metallb-system/metallb-operator-controller-manager-58dbb79b86-frmc6" Dec 06 05:54:34 crc kubenswrapper[4733]: I1206 05:54:34.475675 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/652ac185-5c87-4354-9f1d-0c103702a926-webhook-cert\") pod 
\"metallb-operator-controller-manager-58dbb79b86-frmc6\" (UID: \"652ac185-5c87-4354-9f1d-0c103702a926\") " pod="metallb-system/metallb-operator-controller-manager-58dbb79b86-frmc6" Dec 06 05:54:34 crc kubenswrapper[4733]: I1206 05:54:34.483094 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/652ac185-5c87-4354-9f1d-0c103702a926-apiservice-cert\") pod \"metallb-operator-controller-manager-58dbb79b86-frmc6\" (UID: \"652ac185-5c87-4354-9f1d-0c103702a926\") " pod="metallb-system/metallb-operator-controller-manager-58dbb79b86-frmc6" Dec 06 05:54:34 crc kubenswrapper[4733]: I1206 05:54:34.483116 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/652ac185-5c87-4354-9f1d-0c103702a926-webhook-cert\") pod \"metallb-operator-controller-manager-58dbb79b86-frmc6\" (UID: \"652ac185-5c87-4354-9f1d-0c103702a926\") " pod="metallb-system/metallb-operator-controller-manager-58dbb79b86-frmc6" Dec 06 05:54:34 crc kubenswrapper[4733]: I1206 05:54:34.492011 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjg89\" (UniqueName: \"kubernetes.io/projected/652ac185-5c87-4354-9f1d-0c103702a926-kube-api-access-mjg89\") pod \"metallb-operator-controller-manager-58dbb79b86-frmc6\" (UID: \"652ac185-5c87-4354-9f1d-0c103702a926\") " pod="metallb-system/metallb-operator-controller-manager-58dbb79b86-frmc6" Dec 06 05:54:34 crc kubenswrapper[4733]: I1206 05:54:34.587096 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-69cb7d5cf9-6jpn6"] Dec 06 05:54:34 crc kubenswrapper[4733]: I1206 05:54:34.587838 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-69cb7d5cf9-6jpn6" Dec 06 05:54:34 crc kubenswrapper[4733]: I1206 05:54:34.589665 4733 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 06 05:54:34 crc kubenswrapper[4733]: I1206 05:54:34.589668 4733 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Dec 06 05:54:34 crc kubenswrapper[4733]: I1206 05:54:34.589987 4733 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-29pzm" Dec 06 05:54:34 crc kubenswrapper[4733]: I1206 05:54:34.601194 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-69cb7d5cf9-6jpn6"] Dec 06 05:54:34 crc kubenswrapper[4733]: I1206 05:54:34.678946 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ad526f45-2ff7-4236-8ed5-161860544782-apiservice-cert\") pod \"metallb-operator-webhook-server-69cb7d5cf9-6jpn6\" (UID: \"ad526f45-2ff7-4236-8ed5-161860544782\") " pod="metallb-system/metallb-operator-webhook-server-69cb7d5cf9-6jpn6" Dec 06 05:54:34 crc kubenswrapper[4733]: I1206 05:54:34.678991 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ad526f45-2ff7-4236-8ed5-161860544782-webhook-cert\") pod \"metallb-operator-webhook-server-69cb7d5cf9-6jpn6\" (UID: \"ad526f45-2ff7-4236-8ed5-161860544782\") " pod="metallb-system/metallb-operator-webhook-server-69cb7d5cf9-6jpn6" Dec 06 05:54:34 crc kubenswrapper[4733]: I1206 05:54:34.679031 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6k9n\" (UniqueName: 
\"kubernetes.io/projected/ad526f45-2ff7-4236-8ed5-161860544782-kube-api-access-c6k9n\") pod \"metallb-operator-webhook-server-69cb7d5cf9-6jpn6\" (UID: \"ad526f45-2ff7-4236-8ed5-161860544782\") " pod="metallb-system/metallb-operator-webhook-server-69cb7d5cf9-6jpn6" Dec 06 05:54:34 crc kubenswrapper[4733]: I1206 05:54:34.680570 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-58dbb79b86-frmc6" Dec 06 05:54:34 crc kubenswrapper[4733]: I1206 05:54:34.780806 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ad526f45-2ff7-4236-8ed5-161860544782-apiservice-cert\") pod \"metallb-operator-webhook-server-69cb7d5cf9-6jpn6\" (UID: \"ad526f45-2ff7-4236-8ed5-161860544782\") " pod="metallb-system/metallb-operator-webhook-server-69cb7d5cf9-6jpn6" Dec 06 05:54:34 crc kubenswrapper[4733]: I1206 05:54:34.780972 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ad526f45-2ff7-4236-8ed5-161860544782-webhook-cert\") pod \"metallb-operator-webhook-server-69cb7d5cf9-6jpn6\" (UID: \"ad526f45-2ff7-4236-8ed5-161860544782\") " pod="metallb-system/metallb-operator-webhook-server-69cb7d5cf9-6jpn6" Dec 06 05:54:34 crc kubenswrapper[4733]: I1206 05:54:34.781008 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6k9n\" (UniqueName: \"kubernetes.io/projected/ad526f45-2ff7-4236-8ed5-161860544782-kube-api-access-c6k9n\") pod \"metallb-operator-webhook-server-69cb7d5cf9-6jpn6\" (UID: \"ad526f45-2ff7-4236-8ed5-161860544782\") " pod="metallb-system/metallb-operator-webhook-server-69cb7d5cf9-6jpn6" Dec 06 05:54:34 crc kubenswrapper[4733]: I1206 05:54:34.793526 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/ad526f45-2ff7-4236-8ed5-161860544782-webhook-cert\") pod \"metallb-operator-webhook-server-69cb7d5cf9-6jpn6\" (UID: \"ad526f45-2ff7-4236-8ed5-161860544782\") " pod="metallb-system/metallb-operator-webhook-server-69cb7d5cf9-6jpn6" Dec 06 05:54:34 crc kubenswrapper[4733]: I1206 05:54:34.794489 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6k9n\" (UniqueName: \"kubernetes.io/projected/ad526f45-2ff7-4236-8ed5-161860544782-kube-api-access-c6k9n\") pod \"metallb-operator-webhook-server-69cb7d5cf9-6jpn6\" (UID: \"ad526f45-2ff7-4236-8ed5-161860544782\") " pod="metallb-system/metallb-operator-webhook-server-69cb7d5cf9-6jpn6" Dec 06 05:54:34 crc kubenswrapper[4733]: I1206 05:54:34.798443 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ad526f45-2ff7-4236-8ed5-161860544782-apiservice-cert\") pod \"metallb-operator-webhook-server-69cb7d5cf9-6jpn6\" (UID: \"ad526f45-2ff7-4236-8ed5-161860544782\") " pod="metallb-system/metallb-operator-webhook-server-69cb7d5cf9-6jpn6" Dec 06 05:54:34 crc kubenswrapper[4733]: I1206 05:54:34.900337 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-69cb7d5cf9-6jpn6" Dec 06 05:54:35 crc kubenswrapper[4733]: I1206 05:54:35.063962 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-58dbb79b86-frmc6"] Dec 06 05:54:35 crc kubenswrapper[4733]: W1206 05:54:35.074580 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod652ac185_5c87_4354_9f1d_0c103702a926.slice/crio-f62defcbd200a198eec13af80828894793d37f05569e98f8e55603480832a292 WatchSource:0}: Error finding container f62defcbd200a198eec13af80828894793d37f05569e98f8e55603480832a292: Status 404 returned error can't find the container with id f62defcbd200a198eec13af80828894793d37f05569e98f8e55603480832a292 Dec 06 05:54:35 crc kubenswrapper[4733]: I1206 05:54:35.122614 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-69cb7d5cf9-6jpn6"] Dec 06 05:54:35 crc kubenswrapper[4733]: W1206 05:54:35.125340 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad526f45_2ff7_4236_8ed5_161860544782.slice/crio-b175b0ccef47d49d00fe04d490c9efcbbbbec645a9226893c093f8d86bf18745 WatchSource:0}: Error finding container b175b0ccef47d49d00fe04d490c9efcbbbbec645a9226893c093f8d86bf18745: Status 404 returned error can't find the container with id b175b0ccef47d49d00fe04d490c9efcbbbbec645a9226893c093f8d86bf18745 Dec 06 05:54:36 crc kubenswrapper[4733]: I1206 05:54:36.009894 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-58dbb79b86-frmc6" event={"ID":"652ac185-5c87-4354-9f1d-0c103702a926","Type":"ContainerStarted","Data":"f62defcbd200a198eec13af80828894793d37f05569e98f8e55603480832a292"} Dec 06 05:54:36 crc kubenswrapper[4733]: I1206 05:54:36.011894 4733 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-69cb7d5cf9-6jpn6" event={"ID":"ad526f45-2ff7-4236-8ed5-161860544782","Type":"ContainerStarted","Data":"b175b0ccef47d49d00fe04d490c9efcbbbbec645a9226893c093f8d86bf18745"} Dec 06 05:54:38 crc kubenswrapper[4733]: I1206 05:54:38.029052 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-58dbb79b86-frmc6" event={"ID":"652ac185-5c87-4354-9f1d-0c103702a926","Type":"ContainerStarted","Data":"4c3635c53dc4056a1d0c00edcb5ebe81e2590baddeec68327411c8d6c3e98c7d"} Dec 06 05:54:38 crc kubenswrapper[4733]: I1206 05:54:38.029517 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-58dbb79b86-frmc6" Dec 06 05:54:38 crc kubenswrapper[4733]: I1206 05:54:38.060614 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-58dbb79b86-frmc6" podStartSLOduration=1.591287162 podStartE2EDuration="4.060594069s" podCreationTimestamp="2025-12-06 05:54:34 +0000 UTC" firstStartedPulling="2025-12-06 05:54:35.076863232 +0000 UTC m=+658.942074343" lastFinishedPulling="2025-12-06 05:54:37.54617014 +0000 UTC m=+661.411381250" observedRunningTime="2025-12-06 05:54:38.059889074 +0000 UTC m=+661.925100184" watchObservedRunningTime="2025-12-06 05:54:38.060594069 +0000 UTC m=+661.925805180" Dec 06 05:54:39 crc kubenswrapper[4733]: I1206 05:54:39.037424 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-69cb7d5cf9-6jpn6" event={"ID":"ad526f45-2ff7-4236-8ed5-161860544782","Type":"ContainerStarted","Data":"ae5c5f3be9d279c9f95b29dde1f417844d7d07e6c5f42940be3369fbaea31642"} Dec 06 05:54:39 crc kubenswrapper[4733]: I1206 05:54:39.056397 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-69cb7d5cf9-6jpn6" 
podStartSLOduration=1.361249624 podStartE2EDuration="5.05638267s" podCreationTimestamp="2025-12-06 05:54:34 +0000 UTC" firstStartedPulling="2025-12-06 05:54:35.12791784 +0000 UTC m=+658.993128952" lastFinishedPulling="2025-12-06 05:54:38.823050887 +0000 UTC m=+662.688261998" observedRunningTime="2025-12-06 05:54:39.054986594 +0000 UTC m=+662.920197696" watchObservedRunningTime="2025-12-06 05:54:39.05638267 +0000 UTC m=+662.921593781" Dec 06 05:54:40 crc kubenswrapper[4733]: I1206 05:54:40.044019 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-69cb7d5cf9-6jpn6" Dec 06 05:54:54 crc kubenswrapper[4733]: I1206 05:54:54.911350 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-69cb7d5cf9-6jpn6" Dec 06 05:55:14 crc kubenswrapper[4733]: I1206 05:55:14.683662 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-58dbb79b86-frmc6" Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.231441 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-8l7xm"] Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.234206 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-8l7xm" Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.234733 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-mqg7r"] Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.235572 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-mqg7r" Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.236109 4733 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-7wdw6" Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.236266 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.236545 4733 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.236590 4733 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.263230 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-mqg7r"] Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.300218 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-l9xh2"] Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.301138 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-l9xh2" Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.303562 4733 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.304192 4733 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.305281 4733 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-5qsr2" Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.306987 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.315819 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-vkbjg"] Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.316716 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-vkbjg" Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.317795 4733 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.333983 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-vkbjg"] Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.430243 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzbmn\" (UniqueName: \"kubernetes.io/projected/cd3104a0-fb18-4c26-9049-19967b2d5060-kube-api-access-nzbmn\") pod \"frr-k8s-webhook-server-7fcb986d4-mqg7r\" (UID: \"cd3104a0-fb18-4c26-9049-19967b2d5060\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-mqg7r" Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.430329 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bdd47a64-10b1-4dce-bec3-88d302bf60e7-metrics-certs\") pod \"controller-f8648f98b-vkbjg\" (UID: \"bdd47a64-10b1-4dce-bec3-88d302bf60e7\") " pod="metallb-system/controller-f8648f98b-vkbjg" Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.430360 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd3104a0-fb18-4c26-9049-19967b2d5060-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-mqg7r\" (UID: \"cd3104a0-fb18-4c26-9049-19967b2d5060\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-mqg7r" Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.430519 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbkxr\" (UniqueName: \"kubernetes.io/projected/bdd47a64-10b1-4dce-bec3-88d302bf60e7-kube-api-access-zbkxr\") pod \"controller-f8648f98b-vkbjg\" 
(UID: \"bdd47a64-10b1-4dce-bec3-88d302bf60e7\") " pod="metallb-system/controller-f8648f98b-vkbjg" Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.430570 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/b6c5bf84-f86b-4a51-bd80-0a23163dd42b-frr-sockets\") pod \"frr-k8s-8l7xm\" (UID: \"b6c5bf84-f86b-4a51-bd80-0a23163dd42b\") " pod="metallb-system/frr-k8s-8l7xm" Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.430592 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6361843f-1584-4663-840b-7891442d913f-metrics-certs\") pod \"speaker-l9xh2\" (UID: \"6361843f-1584-4663-840b-7891442d913f\") " pod="metallb-system/speaker-l9xh2" Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.430687 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b6c5bf84-f86b-4a51-bd80-0a23163dd42b-metrics-certs\") pod \"frr-k8s-8l7xm\" (UID: \"b6c5bf84-f86b-4a51-bd80-0a23163dd42b\") " pod="metallb-system/frr-k8s-8l7xm" Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.430752 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/b6c5bf84-f86b-4a51-bd80-0a23163dd42b-frr-conf\") pod \"frr-k8s-8l7xm\" (UID: \"b6c5bf84-f86b-4a51-bd80-0a23163dd42b\") " pod="metallb-system/frr-k8s-8l7xm" Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.430804 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5z47s\" (UniqueName: \"kubernetes.io/projected/6361843f-1584-4663-840b-7891442d913f-kube-api-access-5z47s\") pod \"speaker-l9xh2\" (UID: \"6361843f-1584-4663-840b-7891442d913f\") " pod="metallb-system/speaker-l9xh2" 
Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.430858 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bdd47a64-10b1-4dce-bec3-88d302bf60e7-cert\") pod \"controller-f8648f98b-vkbjg\" (UID: \"bdd47a64-10b1-4dce-bec3-88d302bf60e7\") " pod="metallb-system/controller-f8648f98b-vkbjg" Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.430924 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/b6c5bf84-f86b-4a51-bd80-0a23163dd42b-frr-startup\") pod \"frr-k8s-8l7xm\" (UID: \"b6c5bf84-f86b-4a51-bd80-0a23163dd42b\") " pod="metallb-system/frr-k8s-8l7xm" Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.430971 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/6361843f-1584-4663-840b-7891442d913f-metallb-excludel2\") pod \"speaker-l9xh2\" (UID: \"6361843f-1584-4663-840b-7891442d913f\") " pod="metallb-system/speaker-l9xh2" Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.431001 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6361843f-1584-4663-840b-7891442d913f-memberlist\") pod \"speaker-l9xh2\" (UID: \"6361843f-1584-4663-840b-7891442d913f\") " pod="metallb-system/speaker-l9xh2" Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.431047 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8n97\" (UniqueName: \"kubernetes.io/projected/b6c5bf84-f86b-4a51-bd80-0a23163dd42b-kube-api-access-d8n97\") pod \"frr-k8s-8l7xm\" (UID: \"b6c5bf84-f86b-4a51-bd80-0a23163dd42b\") " pod="metallb-system/frr-k8s-8l7xm" Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.431182 4733 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/b6c5bf84-f86b-4a51-bd80-0a23163dd42b-metrics\") pod \"frr-k8s-8l7xm\" (UID: \"b6c5bf84-f86b-4a51-bd80-0a23163dd42b\") " pod="metallb-system/frr-k8s-8l7xm" Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.431315 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/b6c5bf84-f86b-4a51-bd80-0a23163dd42b-reloader\") pod \"frr-k8s-8l7xm\" (UID: \"b6c5bf84-f86b-4a51-bd80-0a23163dd42b\") " pod="metallb-system/frr-k8s-8l7xm" Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.532507 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd3104a0-fb18-4c26-9049-19967b2d5060-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-mqg7r\" (UID: \"cd3104a0-fb18-4c26-9049-19967b2d5060\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-mqg7r" Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.532546 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bdd47a64-10b1-4dce-bec3-88d302bf60e7-metrics-certs\") pod \"controller-f8648f98b-vkbjg\" (UID: \"bdd47a64-10b1-4dce-bec3-88d302bf60e7\") " pod="metallb-system/controller-f8648f98b-vkbjg" Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.532606 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbkxr\" (UniqueName: \"kubernetes.io/projected/bdd47a64-10b1-4dce-bec3-88d302bf60e7-kube-api-access-zbkxr\") pod \"controller-f8648f98b-vkbjg\" (UID: \"bdd47a64-10b1-4dce-bec3-88d302bf60e7\") " pod="metallb-system/controller-f8648f98b-vkbjg" Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.532627 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/b6c5bf84-f86b-4a51-bd80-0a23163dd42b-frr-sockets\") pod \"frr-k8s-8l7xm\" (UID: \"b6c5bf84-f86b-4a51-bd80-0a23163dd42b\") " pod="metallb-system/frr-k8s-8l7xm" Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.532644 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6361843f-1584-4663-840b-7891442d913f-metrics-certs\") pod \"speaker-l9xh2\" (UID: \"6361843f-1584-4663-840b-7891442d913f\") " pod="metallb-system/speaker-l9xh2" Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.532674 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b6c5bf84-f86b-4a51-bd80-0a23163dd42b-metrics-certs\") pod \"frr-k8s-8l7xm\" (UID: \"b6c5bf84-f86b-4a51-bd80-0a23163dd42b\") " pod="metallb-system/frr-k8s-8l7xm" Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.532696 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/b6c5bf84-f86b-4a51-bd80-0a23163dd42b-frr-conf\") pod \"frr-k8s-8l7xm\" (UID: \"b6c5bf84-f86b-4a51-bd80-0a23163dd42b\") " pod="metallb-system/frr-k8s-8l7xm" Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.532715 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5z47s\" (UniqueName: \"kubernetes.io/projected/6361843f-1584-4663-840b-7891442d913f-kube-api-access-5z47s\") pod \"speaker-l9xh2\" (UID: \"6361843f-1584-4663-840b-7891442d913f\") " pod="metallb-system/speaker-l9xh2" Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.532743 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bdd47a64-10b1-4dce-bec3-88d302bf60e7-cert\") pod \"controller-f8648f98b-vkbjg\" (UID: \"bdd47a64-10b1-4dce-bec3-88d302bf60e7\") " 
pod="metallb-system/controller-f8648f98b-vkbjg" Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.532781 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/b6c5bf84-f86b-4a51-bd80-0a23163dd42b-frr-startup\") pod \"frr-k8s-8l7xm\" (UID: \"b6c5bf84-f86b-4a51-bd80-0a23163dd42b\") " pod="metallb-system/frr-k8s-8l7xm" Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.532798 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/6361843f-1584-4663-840b-7891442d913f-metallb-excludel2\") pod \"speaker-l9xh2\" (UID: \"6361843f-1584-4663-840b-7891442d913f\") " pod="metallb-system/speaker-l9xh2" Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.532816 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6361843f-1584-4663-840b-7891442d913f-memberlist\") pod \"speaker-l9xh2\" (UID: \"6361843f-1584-4663-840b-7891442d913f\") " pod="metallb-system/speaker-l9xh2" Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.532834 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8n97\" (UniqueName: \"kubernetes.io/projected/b6c5bf84-f86b-4a51-bd80-0a23163dd42b-kube-api-access-d8n97\") pod \"frr-k8s-8l7xm\" (UID: \"b6c5bf84-f86b-4a51-bd80-0a23163dd42b\") " pod="metallb-system/frr-k8s-8l7xm" Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.532852 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/b6c5bf84-f86b-4a51-bd80-0a23163dd42b-metrics\") pod \"frr-k8s-8l7xm\" (UID: \"b6c5bf84-f86b-4a51-bd80-0a23163dd42b\") " pod="metallb-system/frr-k8s-8l7xm" Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.532874 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/b6c5bf84-f86b-4a51-bd80-0a23163dd42b-reloader\") pod \"frr-k8s-8l7xm\" (UID: \"b6c5bf84-f86b-4a51-bd80-0a23163dd42b\") " pod="metallb-system/frr-k8s-8l7xm" Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.532892 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzbmn\" (UniqueName: \"kubernetes.io/projected/cd3104a0-fb18-4c26-9049-19967b2d5060-kube-api-access-nzbmn\") pod \"frr-k8s-webhook-server-7fcb986d4-mqg7r\" (UID: \"cd3104a0-fb18-4c26-9049-19967b2d5060\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-mqg7r" Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.533056 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/b6c5bf84-f86b-4a51-bd80-0a23163dd42b-frr-sockets\") pod \"frr-k8s-8l7xm\" (UID: \"b6c5bf84-f86b-4a51-bd80-0a23163dd42b\") " pod="metallb-system/frr-k8s-8l7xm" Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.533436 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/b6c5bf84-f86b-4a51-bd80-0a23163dd42b-frr-conf\") pod \"frr-k8s-8l7xm\" (UID: \"b6c5bf84-f86b-4a51-bd80-0a23163dd42b\") " pod="metallb-system/frr-k8s-8l7xm" Dec 06 05:55:15 crc kubenswrapper[4733]: E1206 05:55:15.533605 4733 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 06 05:55:15 crc kubenswrapper[4733]: E1206 05:55:15.533658 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6361843f-1584-4663-840b-7891442d913f-memberlist podName:6361843f-1584-4663-840b-7891442d913f nodeName:}" failed. No retries permitted until 2025-12-06 05:55:16.033642005 +0000 UTC m=+699.898853117 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/6361843f-1584-4663-840b-7891442d913f-memberlist") pod "speaker-l9xh2" (UID: "6361843f-1584-4663-840b-7891442d913f") : secret "metallb-memberlist" not found Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.533853 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/b6c5bf84-f86b-4a51-bd80-0a23163dd42b-metrics\") pod \"frr-k8s-8l7xm\" (UID: \"b6c5bf84-f86b-4a51-bd80-0a23163dd42b\") " pod="metallb-system/frr-k8s-8l7xm" Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.534020 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/b6c5bf84-f86b-4a51-bd80-0a23163dd42b-reloader\") pod \"frr-k8s-8l7xm\" (UID: \"b6c5bf84-f86b-4a51-bd80-0a23163dd42b\") " pod="metallb-system/frr-k8s-8l7xm" Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.534181 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/6361843f-1584-4663-840b-7891442d913f-metallb-excludel2\") pod \"speaker-l9xh2\" (UID: \"6361843f-1584-4663-840b-7891442d913f\") " pod="metallb-system/speaker-l9xh2" Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.534454 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/b6c5bf84-f86b-4a51-bd80-0a23163dd42b-frr-startup\") pod \"frr-k8s-8l7xm\" (UID: \"b6c5bf84-f86b-4a51-bd80-0a23163dd42b\") " pod="metallb-system/frr-k8s-8l7xm" Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.536699 4733 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.538146 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/bdd47a64-10b1-4dce-bec3-88d302bf60e7-metrics-certs\") pod \"controller-f8648f98b-vkbjg\" (UID: \"bdd47a64-10b1-4dce-bec3-88d302bf60e7\") " pod="metallb-system/controller-f8648f98b-vkbjg" Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.538248 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6361843f-1584-4663-840b-7891442d913f-metrics-certs\") pod \"speaker-l9xh2\" (UID: \"6361843f-1584-4663-840b-7891442d913f\") " pod="metallb-system/speaker-l9xh2" Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.538627 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b6c5bf84-f86b-4a51-bd80-0a23163dd42b-metrics-certs\") pod \"frr-k8s-8l7xm\" (UID: \"b6c5bf84-f86b-4a51-bd80-0a23163dd42b\") " pod="metallb-system/frr-k8s-8l7xm" Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.538835 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd3104a0-fb18-4c26-9049-19967b2d5060-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-mqg7r\" (UID: \"cd3104a0-fb18-4c26-9049-19967b2d5060\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-mqg7r" Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.545873 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5z47s\" (UniqueName: \"kubernetes.io/projected/6361843f-1584-4663-840b-7891442d913f-kube-api-access-5z47s\") pod \"speaker-l9xh2\" (UID: \"6361843f-1584-4663-840b-7891442d913f\") " pod="metallb-system/speaker-l9xh2" Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.546091 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzbmn\" (UniqueName: \"kubernetes.io/projected/cd3104a0-fb18-4c26-9049-19967b2d5060-kube-api-access-nzbmn\") pod \"frr-k8s-webhook-server-7fcb986d4-mqg7r\" (UID: 
\"cd3104a0-fb18-4c26-9049-19967b2d5060\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-mqg7r" Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.546484 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bdd47a64-10b1-4dce-bec3-88d302bf60e7-cert\") pod \"controller-f8648f98b-vkbjg\" (UID: \"bdd47a64-10b1-4dce-bec3-88d302bf60e7\") " pod="metallb-system/controller-f8648f98b-vkbjg" Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.548388 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbkxr\" (UniqueName: \"kubernetes.io/projected/bdd47a64-10b1-4dce-bec3-88d302bf60e7-kube-api-access-zbkxr\") pod \"controller-f8648f98b-vkbjg\" (UID: \"bdd47a64-10b1-4dce-bec3-88d302bf60e7\") " pod="metallb-system/controller-f8648f98b-vkbjg" Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.548598 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8n97\" (UniqueName: \"kubernetes.io/projected/b6c5bf84-f86b-4a51-bd80-0a23163dd42b-kube-api-access-d8n97\") pod \"frr-k8s-8l7xm\" (UID: \"b6c5bf84-f86b-4a51-bd80-0a23163dd42b\") " pod="metallb-system/frr-k8s-8l7xm" Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.551629 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-8l7xm" Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.558491 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-mqg7r" Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.627740 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-vkbjg" Dec 06 05:55:15 crc kubenswrapper[4733]: I1206 05:55:15.961684 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-mqg7r"] Dec 06 05:55:15 crc kubenswrapper[4733]: W1206 05:55:15.964803 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd3104a0_fb18_4c26_9049_19967b2d5060.slice/crio-34e0f5cfdb7b5a937050032bfb395c2892704b1fc0319d605a8af55429dac1fd WatchSource:0}: Error finding container 34e0f5cfdb7b5a937050032bfb395c2892704b1fc0319d605a8af55429dac1fd: Status 404 returned error can't find the container with id 34e0f5cfdb7b5a937050032bfb395c2892704b1fc0319d605a8af55429dac1fd Dec 06 05:55:16 crc kubenswrapper[4733]: I1206 05:55:16.017168 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-vkbjg"] Dec 06 05:55:16 crc kubenswrapper[4733]: W1206 05:55:16.017524 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbdd47a64_10b1_4dce_bec3_88d302bf60e7.slice/crio-6ac71a8cdf86403da812ea818c5eab6fe7f4e2e50c81b60add531ef57297793d WatchSource:0}: Error finding container 6ac71a8cdf86403da812ea818c5eab6fe7f4e2e50c81b60add531ef57297793d: Status 404 returned error can't find the container with id 6ac71a8cdf86403da812ea818c5eab6fe7f4e2e50c81b60add531ef57297793d Dec 06 05:55:16 crc kubenswrapper[4733]: I1206 05:55:16.044010 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6361843f-1584-4663-840b-7891442d913f-memberlist\") pod \"speaker-l9xh2\" (UID: \"6361843f-1584-4663-840b-7891442d913f\") " pod="metallb-system/speaker-l9xh2" Dec 06 05:55:16 crc kubenswrapper[4733]: E1206 05:55:16.044177 4733 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret 
"metallb-memberlist" not found Dec 06 05:55:16 crc kubenswrapper[4733]: E1206 05:55:16.044249 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6361843f-1584-4663-840b-7891442d913f-memberlist podName:6361843f-1584-4663-840b-7891442d913f nodeName:}" failed. No retries permitted until 2025-12-06 05:55:17.044233238 +0000 UTC m=+700.909444348 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/6361843f-1584-4663-840b-7891442d913f-memberlist") pod "speaker-l9xh2" (UID: "6361843f-1584-4663-840b-7891442d913f") : secret "metallb-memberlist" not found Dec 06 05:55:16 crc kubenswrapper[4733]: I1206 05:55:16.264885 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8l7xm" event={"ID":"b6c5bf84-f86b-4a51-bd80-0a23163dd42b","Type":"ContainerStarted","Data":"9bc9e9538d9302ec013f3ee483679dd2a970f89e0061b781e543875a121c2698"} Dec 06 05:55:16 crc kubenswrapper[4733]: I1206 05:55:16.267421 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-vkbjg" event={"ID":"bdd47a64-10b1-4dce-bec3-88d302bf60e7","Type":"ContainerStarted","Data":"f9943c4ece098053377968aa54090b4866259ad3d348f3765948773efe5dd687"} Dec 06 05:55:16 crc kubenswrapper[4733]: I1206 05:55:16.267473 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-vkbjg" event={"ID":"bdd47a64-10b1-4dce-bec3-88d302bf60e7","Type":"ContainerStarted","Data":"6cc577c29b2c140ffc7a5820afd7050a51da06408c2df6f40c3d66089380104a"} Dec 06 05:55:16 crc kubenswrapper[4733]: I1206 05:55:16.267486 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-vkbjg" event={"ID":"bdd47a64-10b1-4dce-bec3-88d302bf60e7","Type":"ContainerStarted","Data":"6ac71a8cdf86403da812ea818c5eab6fe7f4e2e50c81b60add531ef57297793d"} Dec 06 05:55:16 crc kubenswrapper[4733]: I1206 05:55:16.267575 4733 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-vkbjg" Dec 06 05:55:16 crc kubenswrapper[4733]: I1206 05:55:16.268673 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-mqg7r" event={"ID":"cd3104a0-fb18-4c26-9049-19967b2d5060","Type":"ContainerStarted","Data":"34e0f5cfdb7b5a937050032bfb395c2892704b1fc0319d605a8af55429dac1fd"} Dec 06 05:55:16 crc kubenswrapper[4733]: I1206 05:55:16.282694 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-vkbjg" podStartSLOduration=1.282673456 podStartE2EDuration="1.282673456s" podCreationTimestamp="2025-12-06 05:55:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:55:16.282422184 +0000 UTC m=+700.147633295" watchObservedRunningTime="2025-12-06 05:55:16.282673456 +0000 UTC m=+700.147884567" Dec 06 05:55:17 crc kubenswrapper[4733]: I1206 05:55:17.059194 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6361843f-1584-4663-840b-7891442d913f-memberlist\") pod \"speaker-l9xh2\" (UID: \"6361843f-1584-4663-840b-7891442d913f\") " pod="metallb-system/speaker-l9xh2" Dec 06 05:55:17 crc kubenswrapper[4733]: I1206 05:55:17.066128 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6361843f-1584-4663-840b-7891442d913f-memberlist\") pod \"speaker-l9xh2\" (UID: \"6361843f-1584-4663-840b-7891442d913f\") " pod="metallb-system/speaker-l9xh2" Dec 06 05:55:17 crc kubenswrapper[4733]: I1206 05:55:17.114793 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-l9xh2" Dec 06 05:55:17 crc kubenswrapper[4733]: W1206 05:55:17.136051 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6361843f_1584_4663_840b_7891442d913f.slice/crio-da77a6fc7ad5677059848429af7cb5a6e0c56883990d8da2fc708499c962d8e3 WatchSource:0}: Error finding container da77a6fc7ad5677059848429af7cb5a6e0c56883990d8da2fc708499c962d8e3: Status 404 returned error can't find the container with id da77a6fc7ad5677059848429af7cb5a6e0c56883990d8da2fc708499c962d8e3 Dec 06 05:55:17 crc kubenswrapper[4733]: I1206 05:55:17.278509 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-l9xh2" event={"ID":"6361843f-1584-4663-840b-7891442d913f","Type":"ContainerStarted","Data":"da77a6fc7ad5677059848429af7cb5a6e0c56883990d8da2fc708499c962d8e3"} Dec 06 05:55:18 crc kubenswrapper[4733]: I1206 05:55:18.291436 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-l9xh2" event={"ID":"6361843f-1584-4663-840b-7891442d913f","Type":"ContainerStarted","Data":"a08dc7c73e2dfab0a3ed96bb4748c8e42634902fd24b21d4b5176617150db73e"} Dec 06 05:55:18 crc kubenswrapper[4733]: I1206 05:55:18.291783 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-l9xh2" event={"ID":"6361843f-1584-4663-840b-7891442d913f","Type":"ContainerStarted","Data":"24bfa6a2528e01616506dd8629c87b7826d7a7b41ddd9225493ad60a29b842f2"} Dec 06 05:55:18 crc kubenswrapper[4733]: I1206 05:55:18.292866 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-l9xh2" Dec 06 05:55:18 crc kubenswrapper[4733]: I1206 05:55:18.310800 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-l9xh2" podStartSLOduration=3.310788739 podStartE2EDuration="3.310788739s" podCreationTimestamp="2025-12-06 05:55:15 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:55:18.30685411 +0000 UTC m=+702.172065221" watchObservedRunningTime="2025-12-06 05:55:18.310788739 +0000 UTC m=+702.175999850" Dec 06 05:55:22 crc kubenswrapper[4733]: I1206 05:55:22.327142 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-mqg7r" event={"ID":"cd3104a0-fb18-4c26-9049-19967b2d5060","Type":"ContainerStarted","Data":"0d2e99849b3a645fa9a67659b9c0d7c99a58f7d34c221dec8db6919158b46211"} Dec 06 05:55:22 crc kubenswrapper[4733]: I1206 05:55:22.327805 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-mqg7r" Dec 06 05:55:22 crc kubenswrapper[4733]: I1206 05:55:22.329063 4733 generic.go:334] "Generic (PLEG): container finished" podID="b6c5bf84-f86b-4a51-bd80-0a23163dd42b" containerID="d96af61b6408895927c8481e2b1790b5e103271e1f1d15282c3293c9c0ec7513" exitCode=0 Dec 06 05:55:22 crc kubenswrapper[4733]: I1206 05:55:22.329106 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8l7xm" event={"ID":"b6c5bf84-f86b-4a51-bd80-0a23163dd42b","Type":"ContainerDied","Data":"d96af61b6408895927c8481e2b1790b5e103271e1f1d15282c3293c9c0ec7513"} Dec 06 05:55:22 crc kubenswrapper[4733]: I1206 05:55:22.373033 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-mqg7r" podStartSLOduration=1.370746169 podStartE2EDuration="7.373018305s" podCreationTimestamp="2025-12-06 05:55:15 +0000 UTC" firstStartedPulling="2025-12-06 05:55:15.967482276 +0000 UTC m=+699.832693388" lastFinishedPulling="2025-12-06 05:55:21.969754413 +0000 UTC m=+705.834965524" observedRunningTime="2025-12-06 05:55:22.345848181 +0000 UTC m=+706.211059293" watchObservedRunningTime="2025-12-06 05:55:22.373018305 +0000 UTC m=+706.238229416" Dec 06 05:55:23 
crc kubenswrapper[4733]: I1206 05:55:23.336904 4733 generic.go:334] "Generic (PLEG): container finished" podID="b6c5bf84-f86b-4a51-bd80-0a23163dd42b" containerID="9c5611c9a59fbee24669057e31da0e6cf5aac4f7ed288d84e8d5c70f3c06391d" exitCode=0 Dec 06 05:55:23 crc kubenswrapper[4733]: I1206 05:55:23.336993 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8l7xm" event={"ID":"b6c5bf84-f86b-4a51-bd80-0a23163dd42b","Type":"ContainerDied","Data":"9c5611c9a59fbee24669057e31da0e6cf5aac4f7ed288d84e8d5c70f3c06391d"} Dec 06 05:55:24 crc kubenswrapper[4733]: I1206 05:55:24.344648 4733 generic.go:334] "Generic (PLEG): container finished" podID="b6c5bf84-f86b-4a51-bd80-0a23163dd42b" containerID="f5951f2bf002ba86be7424a96819288048f811ae476f26a64d550f5f42df043d" exitCode=0 Dec 06 05:55:24 crc kubenswrapper[4733]: I1206 05:55:24.344710 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8l7xm" event={"ID":"b6c5bf84-f86b-4a51-bd80-0a23163dd42b","Type":"ContainerDied","Data":"f5951f2bf002ba86be7424a96819288048f811ae476f26a64d550f5f42df043d"} Dec 06 05:55:25 crc kubenswrapper[4733]: I1206 05:55:25.357279 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8l7xm" event={"ID":"b6c5bf84-f86b-4a51-bd80-0a23163dd42b","Type":"ContainerStarted","Data":"f8ac21dd38f4e85ec6629eac864beaf22363b67f2fc772c0bf7e1f0df8c5eed6"} Dec 06 05:55:25 crc kubenswrapper[4733]: I1206 05:55:25.357771 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8l7xm" event={"ID":"b6c5bf84-f86b-4a51-bd80-0a23163dd42b","Type":"ContainerStarted","Data":"d1c9997a5dfb3a81512e2372a97648ccc1aebc3842500f6067ef61ce51bbea8c"} Dec 06 05:55:25 crc kubenswrapper[4733]: I1206 05:55:25.357796 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-8l7xm" Dec 06 05:55:25 crc kubenswrapper[4733]: I1206 05:55:25.357811 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/frr-k8s-8l7xm" event={"ID":"b6c5bf84-f86b-4a51-bd80-0a23163dd42b","Type":"ContainerStarted","Data":"3c6591a7bc5731b253372b3eb2232131a89073164cfa0d853827863f6758a9fd"} Dec 06 05:55:25 crc kubenswrapper[4733]: I1206 05:55:25.357822 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8l7xm" event={"ID":"b6c5bf84-f86b-4a51-bd80-0a23163dd42b","Type":"ContainerStarted","Data":"e315fa32d3426291117a812898daab0a7f4d13f45be8521db0e6028b478a25d9"} Dec 06 05:55:25 crc kubenswrapper[4733]: I1206 05:55:25.357836 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8l7xm" event={"ID":"b6c5bf84-f86b-4a51-bd80-0a23163dd42b","Type":"ContainerStarted","Data":"92fd21cefddf00ce915dd2059641fe40f405fef67b4bd9a287df30a05e9d0208"} Dec 06 05:55:25 crc kubenswrapper[4733]: I1206 05:55:25.357846 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8l7xm" event={"ID":"b6c5bf84-f86b-4a51-bd80-0a23163dd42b","Type":"ContainerStarted","Data":"1ba5ace1e8cdd04cfc48d9c9af0a2b105824ec0131a534db857d8296520718fa"} Dec 06 05:55:25 crc kubenswrapper[4733]: I1206 05:55:25.383652 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-8l7xm" podStartSLOduration=4.112097077 podStartE2EDuration="10.38363431s" podCreationTimestamp="2025-12-06 05:55:15 +0000 UTC" firstStartedPulling="2025-12-06 05:55:15.693070682 +0000 UTC m=+699.558281793" lastFinishedPulling="2025-12-06 05:55:21.964607915 +0000 UTC m=+705.829819026" observedRunningTime="2025-12-06 05:55:25.377517158 +0000 UTC m=+709.242728269" watchObservedRunningTime="2025-12-06 05:55:25.38363431 +0000 UTC m=+709.248845421" Dec 06 05:55:25 crc kubenswrapper[4733]: I1206 05:55:25.552758 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-8l7xm" Dec 06 05:55:25 crc kubenswrapper[4733]: I1206 05:55:25.584413 4733 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="metallb-system/frr-k8s-8l7xm" Dec 06 05:55:27 crc kubenswrapper[4733]: I1206 05:55:27.119629 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-l9xh2" Dec 06 05:55:29 crc kubenswrapper[4733]: I1206 05:55:29.290608 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-njtkx"] Dec 06 05:55:29 crc kubenswrapper[4733]: I1206 05:55:29.291224 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-njtkx" Dec 06 05:55:29 crc kubenswrapper[4733]: I1206 05:55:29.293161 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Dec 06 05:55:29 crc kubenswrapper[4733]: I1206 05:55:29.293557 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Dec 06 05:55:29 crc kubenswrapper[4733]: I1206 05:55:29.295940 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-8fpcq" Dec 06 05:55:29 crc kubenswrapper[4733]: I1206 05:55:29.297833 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-njtkx"] Dec 06 05:55:29 crc kubenswrapper[4733]: I1206 05:55:29.330170 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrlm5\" (UniqueName: \"kubernetes.io/projected/be1c1ea0-baea-4baf-a226-b1bdd25deb9c-kube-api-access-qrlm5\") pod \"openstack-operator-index-njtkx\" (UID: \"be1c1ea0-baea-4baf-a226-b1bdd25deb9c\") " pod="openstack-operators/openstack-operator-index-njtkx" Dec 06 05:55:29 crc kubenswrapper[4733]: I1206 05:55:29.431869 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrlm5\" (UniqueName: 
\"kubernetes.io/projected/be1c1ea0-baea-4baf-a226-b1bdd25deb9c-kube-api-access-qrlm5\") pod \"openstack-operator-index-njtkx\" (UID: \"be1c1ea0-baea-4baf-a226-b1bdd25deb9c\") " pod="openstack-operators/openstack-operator-index-njtkx" Dec 06 05:55:29 crc kubenswrapper[4733]: I1206 05:55:29.449133 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrlm5\" (UniqueName: \"kubernetes.io/projected/be1c1ea0-baea-4baf-a226-b1bdd25deb9c-kube-api-access-qrlm5\") pod \"openstack-operator-index-njtkx\" (UID: \"be1c1ea0-baea-4baf-a226-b1bdd25deb9c\") " pod="openstack-operators/openstack-operator-index-njtkx" Dec 06 05:55:29 crc kubenswrapper[4733]: I1206 05:55:29.604694 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-njtkx" Dec 06 05:55:30 crc kubenswrapper[4733]: I1206 05:55:30.050948 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-njtkx"] Dec 06 05:55:30 crc kubenswrapper[4733]: W1206 05:55:30.055926 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe1c1ea0_baea_4baf_a226_b1bdd25deb9c.slice/crio-73fef7a452aaf33cc87d453ce643184a4fc0e28ed204d1a340e2b410326238b9 WatchSource:0}: Error finding container 73fef7a452aaf33cc87d453ce643184a4fc0e28ed204d1a340e2b410326238b9: Status 404 returned error can't find the container with id 73fef7a452aaf33cc87d453ce643184a4fc0e28ed204d1a340e2b410326238b9 Dec 06 05:55:30 crc kubenswrapper[4733]: I1206 05:55:30.402581 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-njtkx" event={"ID":"be1c1ea0-baea-4baf-a226-b1bdd25deb9c","Type":"ContainerStarted","Data":"73fef7a452aaf33cc87d453ce643184a4fc0e28ed204d1a340e2b410326238b9"} Dec 06 05:55:32 crc kubenswrapper[4733]: I1206 05:55:32.416597 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-operator-index-njtkx" event={"ID":"be1c1ea0-baea-4baf-a226-b1bdd25deb9c","Type":"ContainerStarted","Data":"2b1b748d716df6fff04fe38a0cc0bc1222ee7fd4e2443e5392877f1e8b7b810f"} Dec 06 05:55:32 crc kubenswrapper[4733]: I1206 05:55:32.434008 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-njtkx" podStartSLOduration=2.278866023 podStartE2EDuration="3.433986605s" podCreationTimestamp="2025-12-06 05:55:29 +0000 UTC" firstStartedPulling="2025-12-06 05:55:30.05796021 +0000 UTC m=+713.923171322" lastFinishedPulling="2025-12-06 05:55:31.213080783 +0000 UTC m=+715.078291904" observedRunningTime="2025-12-06 05:55:32.430087953 +0000 UTC m=+716.295299065" watchObservedRunningTime="2025-12-06 05:55:32.433986605 +0000 UTC m=+716.299197717" Dec 06 05:55:32 crc kubenswrapper[4733]: I1206 05:55:32.673284 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-njtkx"] Dec 06 05:55:33 crc kubenswrapper[4733]: I1206 05:55:33.276922 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-r7q28"] Dec 06 05:55:33 crc kubenswrapper[4733]: I1206 05:55:33.277753 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-r7q28" Dec 06 05:55:33 crc kubenswrapper[4733]: I1206 05:55:33.284630 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-r7q28"] Dec 06 05:55:33 crc kubenswrapper[4733]: I1206 05:55:33.388574 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtxgv\" (UniqueName: \"kubernetes.io/projected/ed9fa40d-152c-4e12-8ac4-ccf89c50ade2-kube-api-access-qtxgv\") pod \"openstack-operator-index-r7q28\" (UID: \"ed9fa40d-152c-4e12-8ac4-ccf89c50ade2\") " pod="openstack-operators/openstack-operator-index-r7q28" Dec 06 05:55:33 crc kubenswrapper[4733]: I1206 05:55:33.490556 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtxgv\" (UniqueName: \"kubernetes.io/projected/ed9fa40d-152c-4e12-8ac4-ccf89c50ade2-kube-api-access-qtxgv\") pod \"openstack-operator-index-r7q28\" (UID: \"ed9fa40d-152c-4e12-8ac4-ccf89c50ade2\") " pod="openstack-operators/openstack-operator-index-r7q28" Dec 06 05:55:33 crc kubenswrapper[4733]: I1206 05:55:33.507679 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtxgv\" (UniqueName: \"kubernetes.io/projected/ed9fa40d-152c-4e12-8ac4-ccf89c50ade2-kube-api-access-qtxgv\") pod \"openstack-operator-index-r7q28\" (UID: \"ed9fa40d-152c-4e12-8ac4-ccf89c50ade2\") " pod="openstack-operators/openstack-operator-index-r7q28" Dec 06 05:55:33 crc kubenswrapper[4733]: I1206 05:55:33.591170 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-r7q28" Dec 06 05:55:33 crc kubenswrapper[4733]: I1206 05:55:33.952369 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-r7q28"] Dec 06 05:55:34 crc kubenswrapper[4733]: I1206 05:55:34.426806 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-r7q28" event={"ID":"ed9fa40d-152c-4e12-8ac4-ccf89c50ade2","Type":"ContainerStarted","Data":"c6ec93456698c2b71f192479c4b3140a8dbf21a45c5ff234b7b03b80b2362fbb"} Dec 06 05:55:34 crc kubenswrapper[4733]: I1206 05:55:34.426928 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-njtkx" podUID="be1c1ea0-baea-4baf-a226-b1bdd25deb9c" containerName="registry-server" containerID="cri-o://2b1b748d716df6fff04fe38a0cc0bc1222ee7fd4e2443e5392877f1e8b7b810f" gracePeriod=2 Dec 06 05:55:34 crc kubenswrapper[4733]: I1206 05:55:34.737760 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-njtkx" Dec 06 05:55:34 crc kubenswrapper[4733]: I1206 05:55:34.908804 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrlm5\" (UniqueName: \"kubernetes.io/projected/be1c1ea0-baea-4baf-a226-b1bdd25deb9c-kube-api-access-qrlm5\") pod \"be1c1ea0-baea-4baf-a226-b1bdd25deb9c\" (UID: \"be1c1ea0-baea-4baf-a226-b1bdd25deb9c\") " Dec 06 05:55:34 crc kubenswrapper[4733]: I1206 05:55:34.915000 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be1c1ea0-baea-4baf-a226-b1bdd25deb9c-kube-api-access-qrlm5" (OuterVolumeSpecName: "kube-api-access-qrlm5") pod "be1c1ea0-baea-4baf-a226-b1bdd25deb9c" (UID: "be1c1ea0-baea-4baf-a226-b1bdd25deb9c"). InnerVolumeSpecName "kube-api-access-qrlm5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:55:35 crc kubenswrapper[4733]: I1206 05:55:35.010540 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrlm5\" (UniqueName: \"kubernetes.io/projected/be1c1ea0-baea-4baf-a226-b1bdd25deb9c-kube-api-access-qrlm5\") on node \"crc\" DevicePath \"\"" Dec 06 05:55:35 crc kubenswrapper[4733]: I1206 05:55:35.436022 4733 generic.go:334] "Generic (PLEG): container finished" podID="be1c1ea0-baea-4baf-a226-b1bdd25deb9c" containerID="2b1b748d716df6fff04fe38a0cc0bc1222ee7fd4e2443e5392877f1e8b7b810f" exitCode=0 Dec 06 05:55:35 crc kubenswrapper[4733]: I1206 05:55:35.436105 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-njtkx" event={"ID":"be1c1ea0-baea-4baf-a226-b1bdd25deb9c","Type":"ContainerDied","Data":"2b1b748d716df6fff04fe38a0cc0bc1222ee7fd4e2443e5392877f1e8b7b810f"} Dec 06 05:55:35 crc kubenswrapper[4733]: I1206 05:55:35.436561 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-njtkx" event={"ID":"be1c1ea0-baea-4baf-a226-b1bdd25deb9c","Type":"ContainerDied","Data":"73fef7a452aaf33cc87d453ce643184a4fc0e28ed204d1a340e2b410326238b9"} Dec 06 05:55:35 crc kubenswrapper[4733]: I1206 05:55:35.436593 4733 scope.go:117] "RemoveContainer" containerID="2b1b748d716df6fff04fe38a0cc0bc1222ee7fd4e2443e5392877f1e8b7b810f" Dec 06 05:55:35 crc kubenswrapper[4733]: I1206 05:55:35.436661 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-njtkx" Dec 06 05:55:35 crc kubenswrapper[4733]: I1206 05:55:35.437817 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-r7q28" event={"ID":"ed9fa40d-152c-4e12-8ac4-ccf89c50ade2","Type":"ContainerStarted","Data":"5162285a26687b55897288f5830a0ca6accbb2a76862c15dc972bb1af7b36d49"} Dec 06 05:55:35 crc kubenswrapper[4733]: I1206 05:55:35.456383 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-r7q28" podStartSLOduration=1.900298041 podStartE2EDuration="2.45636366s" podCreationTimestamp="2025-12-06 05:55:33 +0000 UTC" firstStartedPulling="2025-12-06 05:55:33.962214289 +0000 UTC m=+717.827425400" lastFinishedPulling="2025-12-06 05:55:34.518279907 +0000 UTC m=+718.383491019" observedRunningTime="2025-12-06 05:55:35.454031725 +0000 UTC m=+719.319242826" watchObservedRunningTime="2025-12-06 05:55:35.45636366 +0000 UTC m=+719.321574770" Dec 06 05:55:35 crc kubenswrapper[4733]: I1206 05:55:35.460482 4733 scope.go:117] "RemoveContainer" containerID="2b1b748d716df6fff04fe38a0cc0bc1222ee7fd4e2443e5392877f1e8b7b810f" Dec 06 05:55:35 crc kubenswrapper[4733]: E1206 05:55:35.460806 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b1b748d716df6fff04fe38a0cc0bc1222ee7fd4e2443e5392877f1e8b7b810f\": container with ID starting with 2b1b748d716df6fff04fe38a0cc0bc1222ee7fd4e2443e5392877f1e8b7b810f not found: ID does not exist" containerID="2b1b748d716df6fff04fe38a0cc0bc1222ee7fd4e2443e5392877f1e8b7b810f" Dec 06 05:55:35 crc kubenswrapper[4733]: I1206 05:55:35.460851 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b1b748d716df6fff04fe38a0cc0bc1222ee7fd4e2443e5392877f1e8b7b810f"} err="failed to get container status 
\"2b1b748d716df6fff04fe38a0cc0bc1222ee7fd4e2443e5392877f1e8b7b810f\": rpc error: code = NotFound desc = could not find container \"2b1b748d716df6fff04fe38a0cc0bc1222ee7fd4e2443e5392877f1e8b7b810f\": container with ID starting with 2b1b748d716df6fff04fe38a0cc0bc1222ee7fd4e2443e5392877f1e8b7b810f not found: ID does not exist" Dec 06 05:55:35 crc kubenswrapper[4733]: I1206 05:55:35.467410 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-njtkx"] Dec 06 05:55:35 crc kubenswrapper[4733]: I1206 05:55:35.469957 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-njtkx"] Dec 06 05:55:35 crc kubenswrapper[4733]: I1206 05:55:35.553741 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-8l7xm" Dec 06 05:55:35 crc kubenswrapper[4733]: I1206 05:55:35.564127 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-mqg7r" Dec 06 05:55:35 crc kubenswrapper[4733]: I1206 05:55:35.631378 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-vkbjg" Dec 06 05:55:36 crc kubenswrapper[4733]: I1206 05:55:36.497802 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be1c1ea0-baea-4baf-a226-b1bdd25deb9c" path="/var/lib/kubelet/pods/be1c1ea0-baea-4baf-a226-b1bdd25deb9c/volumes" Dec 06 05:55:43 crc kubenswrapper[4733]: I1206 05:55:43.592064 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-r7q28" Dec 06 05:55:43 crc kubenswrapper[4733]: I1206 05:55:43.592577 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-r7q28" Dec 06 05:55:43 crc kubenswrapper[4733]: I1206 05:55:43.617019 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openstack-operators/openstack-operator-index-r7q28" Dec 06 05:55:44 crc kubenswrapper[4733]: I1206 05:55:44.517115 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-r7q28" Dec 06 05:55:50 crc kubenswrapper[4733]: I1206 05:55:50.678722 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafrjrkt"] Dec 06 05:55:50 crc kubenswrapper[4733]: E1206 05:55:50.679464 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be1c1ea0-baea-4baf-a226-b1bdd25deb9c" containerName="registry-server" Dec 06 05:55:50 crc kubenswrapper[4733]: I1206 05:55:50.679479 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="be1c1ea0-baea-4baf-a226-b1bdd25deb9c" containerName="registry-server" Dec 06 05:55:50 crc kubenswrapper[4733]: I1206 05:55:50.679583 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="be1c1ea0-baea-4baf-a226-b1bdd25deb9c" containerName="registry-server" Dec 06 05:55:50 crc kubenswrapper[4733]: I1206 05:55:50.680329 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafrjrkt" Dec 06 05:55:50 crc kubenswrapper[4733]: I1206 05:55:50.682643 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-swxlg" Dec 06 05:55:50 crc kubenswrapper[4733]: I1206 05:55:50.688971 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafrjrkt"] Dec 06 05:55:50 crc kubenswrapper[4733]: I1206 05:55:50.780476 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lf6s\" (UniqueName: \"kubernetes.io/projected/f7c69fa4-d047-47d2-a147-0316949d45c5-kube-api-access-5lf6s\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafrjrkt\" (UID: \"f7c69fa4-d047-47d2-a147-0316949d45c5\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafrjrkt" Dec 06 05:55:50 crc kubenswrapper[4733]: I1206 05:55:50.780551 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f7c69fa4-d047-47d2-a147-0316949d45c5-util\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafrjrkt\" (UID: \"f7c69fa4-d047-47d2-a147-0316949d45c5\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafrjrkt" Dec 06 05:55:50 crc kubenswrapper[4733]: I1206 05:55:50.780597 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f7c69fa4-d047-47d2-a147-0316949d45c5-bundle\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafrjrkt\" (UID: \"f7c69fa4-d047-47d2-a147-0316949d45c5\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafrjrkt" Dec 06 05:55:50 crc kubenswrapper[4733]: I1206 
05:55:50.882262 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lf6s\" (UniqueName: \"kubernetes.io/projected/f7c69fa4-d047-47d2-a147-0316949d45c5-kube-api-access-5lf6s\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafrjrkt\" (UID: \"f7c69fa4-d047-47d2-a147-0316949d45c5\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafrjrkt" Dec 06 05:55:50 crc kubenswrapper[4733]: I1206 05:55:50.882378 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f7c69fa4-d047-47d2-a147-0316949d45c5-util\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafrjrkt\" (UID: \"f7c69fa4-d047-47d2-a147-0316949d45c5\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafrjrkt" Dec 06 05:55:50 crc kubenswrapper[4733]: I1206 05:55:50.882433 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f7c69fa4-d047-47d2-a147-0316949d45c5-bundle\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafrjrkt\" (UID: \"f7c69fa4-d047-47d2-a147-0316949d45c5\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafrjrkt" Dec 06 05:55:50 crc kubenswrapper[4733]: I1206 05:55:50.882994 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f7c69fa4-d047-47d2-a147-0316949d45c5-util\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafrjrkt\" (UID: \"f7c69fa4-d047-47d2-a147-0316949d45c5\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafrjrkt" Dec 06 05:55:50 crc kubenswrapper[4733]: I1206 05:55:50.883029 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/f7c69fa4-d047-47d2-a147-0316949d45c5-bundle\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafrjrkt\" (UID: \"f7c69fa4-d047-47d2-a147-0316949d45c5\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafrjrkt" Dec 06 05:55:50 crc kubenswrapper[4733]: I1206 05:55:50.901352 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lf6s\" (UniqueName: \"kubernetes.io/projected/f7c69fa4-d047-47d2-a147-0316949d45c5-kube-api-access-5lf6s\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafrjrkt\" (UID: \"f7c69fa4-d047-47d2-a147-0316949d45c5\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafrjrkt" Dec 06 05:55:50 crc kubenswrapper[4733]: I1206 05:55:50.996451 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafrjrkt" Dec 06 05:55:51 crc kubenswrapper[4733]: I1206 05:55:51.362003 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafrjrkt"] Dec 06 05:55:51 crc kubenswrapper[4733]: I1206 05:55:51.533889 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafrjrkt" event={"ID":"f7c69fa4-d047-47d2-a147-0316949d45c5","Type":"ContainerStarted","Data":"dff89be0d1eb5295c03502c7ce66300beaad775a5cb1f4a424b19132107c1de5"} Dec 06 05:55:51 crc kubenswrapper[4733]: I1206 05:55:51.534151 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafrjrkt" event={"ID":"f7c69fa4-d047-47d2-a147-0316949d45c5","Type":"ContainerStarted","Data":"b467b5be72e10b9b5f6ae5f26b7654a9ccf11534b8ebfcd93d721f0936ecc47e"} Dec 06 05:55:52 crc kubenswrapper[4733]: I1206 05:55:52.544123 4733 
generic.go:334] "Generic (PLEG): container finished" podID="f7c69fa4-d047-47d2-a147-0316949d45c5" containerID="dff89be0d1eb5295c03502c7ce66300beaad775a5cb1f4a424b19132107c1de5" exitCode=0 Dec 06 05:55:52 crc kubenswrapper[4733]: I1206 05:55:52.544205 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafrjrkt" event={"ID":"f7c69fa4-d047-47d2-a147-0316949d45c5","Type":"ContainerDied","Data":"dff89be0d1eb5295c03502c7ce66300beaad775a5cb1f4a424b19132107c1de5"} Dec 06 05:55:53 crc kubenswrapper[4733]: I1206 05:55:53.561069 4733 generic.go:334] "Generic (PLEG): container finished" podID="f7c69fa4-d047-47d2-a147-0316949d45c5" containerID="4f843a7fd8f167c6483dbf9baa32183740a79424ff409f74ddacabb95d41501f" exitCode=0 Dec 06 05:55:53 crc kubenswrapper[4733]: I1206 05:55:53.561329 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafrjrkt" event={"ID":"f7c69fa4-d047-47d2-a147-0316949d45c5","Type":"ContainerDied","Data":"4f843a7fd8f167c6483dbf9baa32183740a79424ff409f74ddacabb95d41501f"} Dec 06 05:55:54 crc kubenswrapper[4733]: I1206 05:55:54.570382 4733 generic.go:334] "Generic (PLEG): container finished" podID="f7c69fa4-d047-47d2-a147-0316949d45c5" containerID="2e0dc5bddbbf88b94f8ad70fb3e9db4608cbb02c667ee9b1a7e5bd9b6996ee1b" exitCode=0 Dec 06 05:55:54 crc kubenswrapper[4733]: I1206 05:55:54.570489 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafrjrkt" event={"ID":"f7c69fa4-d047-47d2-a147-0316949d45c5","Type":"ContainerDied","Data":"2e0dc5bddbbf88b94f8ad70fb3e9db4608cbb02c667ee9b1a7e5bd9b6996ee1b"} Dec 06 05:55:55 crc kubenswrapper[4733]: I1206 05:55:55.792077 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafrjrkt" Dec 06 05:55:55 crc kubenswrapper[4733]: I1206 05:55:55.949344 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f7c69fa4-d047-47d2-a147-0316949d45c5-bundle\") pod \"f7c69fa4-d047-47d2-a147-0316949d45c5\" (UID: \"f7c69fa4-d047-47d2-a147-0316949d45c5\") " Dec 06 05:55:55 crc kubenswrapper[4733]: I1206 05:55:55.949450 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f7c69fa4-d047-47d2-a147-0316949d45c5-util\") pod \"f7c69fa4-d047-47d2-a147-0316949d45c5\" (UID: \"f7c69fa4-d047-47d2-a147-0316949d45c5\") " Dec 06 05:55:55 crc kubenswrapper[4733]: I1206 05:55:55.949540 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lf6s\" (UniqueName: \"kubernetes.io/projected/f7c69fa4-d047-47d2-a147-0316949d45c5-kube-api-access-5lf6s\") pod \"f7c69fa4-d047-47d2-a147-0316949d45c5\" (UID: \"f7c69fa4-d047-47d2-a147-0316949d45c5\") " Dec 06 05:55:55 crc kubenswrapper[4733]: I1206 05:55:55.950223 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7c69fa4-d047-47d2-a147-0316949d45c5-bundle" (OuterVolumeSpecName: "bundle") pod "f7c69fa4-d047-47d2-a147-0316949d45c5" (UID: "f7c69fa4-d047-47d2-a147-0316949d45c5"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 05:55:55 crc kubenswrapper[4733]: I1206 05:55:55.955792 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7c69fa4-d047-47d2-a147-0316949d45c5-kube-api-access-5lf6s" (OuterVolumeSpecName: "kube-api-access-5lf6s") pod "f7c69fa4-d047-47d2-a147-0316949d45c5" (UID: "f7c69fa4-d047-47d2-a147-0316949d45c5"). InnerVolumeSpecName "kube-api-access-5lf6s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:55:55 crc kubenswrapper[4733]: I1206 05:55:55.959762 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7c69fa4-d047-47d2-a147-0316949d45c5-util" (OuterVolumeSpecName: "util") pod "f7c69fa4-d047-47d2-a147-0316949d45c5" (UID: "f7c69fa4-d047-47d2-a147-0316949d45c5"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 05:55:56 crc kubenswrapper[4733]: I1206 05:55:56.051243 4733 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f7c69fa4-d047-47d2-a147-0316949d45c5-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 05:55:56 crc kubenswrapper[4733]: I1206 05:55:56.051272 4733 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f7c69fa4-d047-47d2-a147-0316949d45c5-util\") on node \"crc\" DevicePath \"\"" Dec 06 05:55:56 crc kubenswrapper[4733]: I1206 05:55:56.051282 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lf6s\" (UniqueName: \"kubernetes.io/projected/f7c69fa4-d047-47d2-a147-0316949d45c5-kube-api-access-5lf6s\") on node \"crc\" DevicePath \"\"" Dec 06 05:55:56 crc kubenswrapper[4733]: I1206 05:55:56.584061 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafrjrkt" event={"ID":"f7c69fa4-d047-47d2-a147-0316949d45c5","Type":"ContainerDied","Data":"b467b5be72e10b9b5f6ae5f26b7654a9ccf11534b8ebfcd93d721f0936ecc47e"} Dec 06 05:55:56 crc kubenswrapper[4733]: I1206 05:55:56.584089 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafrjrkt" Dec 06 05:55:56 crc kubenswrapper[4733]: I1206 05:55:56.584104 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b467b5be72e10b9b5f6ae5f26b7654a9ccf11534b8ebfcd93d721f0936ecc47e" Dec 06 05:56:02 crc kubenswrapper[4733]: I1206 05:56:02.180274 4733 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 06 05:56:02 crc kubenswrapper[4733]: I1206 05:56:02.677758 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-55b6fb9447-kmhl4"] Dec 06 05:56:02 crc kubenswrapper[4733]: E1206 05:56:02.677976 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7c69fa4-d047-47d2-a147-0316949d45c5" containerName="pull" Dec 06 05:56:02 crc kubenswrapper[4733]: I1206 05:56:02.677989 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7c69fa4-d047-47d2-a147-0316949d45c5" containerName="pull" Dec 06 05:56:02 crc kubenswrapper[4733]: E1206 05:56:02.678004 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7c69fa4-d047-47d2-a147-0316949d45c5" containerName="extract" Dec 06 05:56:02 crc kubenswrapper[4733]: I1206 05:56:02.678010 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7c69fa4-d047-47d2-a147-0316949d45c5" containerName="extract" Dec 06 05:56:02 crc kubenswrapper[4733]: E1206 05:56:02.678023 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7c69fa4-d047-47d2-a147-0316949d45c5" containerName="util" Dec 06 05:56:02 crc kubenswrapper[4733]: I1206 05:56:02.678031 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7c69fa4-d047-47d2-a147-0316949d45c5" containerName="util" Dec 06 05:56:02 crc kubenswrapper[4733]: I1206 05:56:02.678153 4733 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f7c69fa4-d047-47d2-a147-0316949d45c5" containerName="extract" Dec 06 05:56:02 crc kubenswrapper[4733]: I1206 05:56:02.678565 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-kmhl4" Dec 06 05:56:02 crc kubenswrapper[4733]: I1206 05:56:02.680219 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-pwg48" Dec 06 05:56:02 crc kubenswrapper[4733]: I1206 05:56:02.745872 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-55b6fb9447-kmhl4"] Dec 06 05:56:02 crc kubenswrapper[4733]: I1206 05:56:02.752103 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbrc2\" (UniqueName: \"kubernetes.io/projected/208b1115-af84-42b4-8425-a576457b38d2-kube-api-access-hbrc2\") pod \"openstack-operator-controller-operator-55b6fb9447-kmhl4\" (UID: \"208b1115-af84-42b4-8425-a576457b38d2\") " pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-kmhl4" Dec 06 05:56:02 crc kubenswrapper[4733]: I1206 05:56:02.853865 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbrc2\" (UniqueName: \"kubernetes.io/projected/208b1115-af84-42b4-8425-a576457b38d2-kube-api-access-hbrc2\") pod \"openstack-operator-controller-operator-55b6fb9447-kmhl4\" (UID: \"208b1115-af84-42b4-8425-a576457b38d2\") " pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-kmhl4" Dec 06 05:56:02 crc kubenswrapper[4733]: I1206 05:56:02.871075 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbrc2\" (UniqueName: \"kubernetes.io/projected/208b1115-af84-42b4-8425-a576457b38d2-kube-api-access-hbrc2\") pod \"openstack-operator-controller-operator-55b6fb9447-kmhl4\" (UID: 
\"208b1115-af84-42b4-8425-a576457b38d2\") " pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-kmhl4" Dec 06 05:56:02 crc kubenswrapper[4733]: I1206 05:56:02.994069 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-kmhl4" Dec 06 05:56:03 crc kubenswrapper[4733]: I1206 05:56:03.168935 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-55b6fb9447-kmhl4"] Dec 06 05:56:03 crc kubenswrapper[4733]: I1206 05:56:03.631809 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-kmhl4" event={"ID":"208b1115-af84-42b4-8425-a576457b38d2","Type":"ContainerStarted","Data":"ce92a1b8522a2c8fa4a56fac15ab1af28df6632ce4e5e90cca81b06c1a641707"} Dec 06 05:56:09 crc kubenswrapper[4733]: I1206 05:56:09.678750 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-kmhl4" event={"ID":"208b1115-af84-42b4-8425-a576457b38d2","Type":"ContainerStarted","Data":"e6c2755f0e54462a780f0bcad73f3e413af027136de964663f3e299b56930e1c"} Dec 06 05:56:09 crc kubenswrapper[4733]: I1206 05:56:09.679144 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-kmhl4" Dec 06 05:56:09 crc kubenswrapper[4733]: I1206 05:56:09.703772 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-kmhl4" podStartSLOduration=2.209950795 podStartE2EDuration="7.703752591s" podCreationTimestamp="2025-12-06 05:56:02 +0000 UTC" firstStartedPulling="2025-12-06 05:56:03.178870809 +0000 UTC m=+747.044081920" lastFinishedPulling="2025-12-06 05:56:08.672672605 +0000 UTC m=+752.537883716" observedRunningTime="2025-12-06 
05:56:09.702055461 +0000 UTC m=+753.567266571" watchObservedRunningTime="2025-12-06 05:56:09.703752591 +0000 UTC m=+753.568963703" Dec 06 05:56:22 crc kubenswrapper[4733]: I1206 05:56:22.997979 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-kmhl4" Dec 06 05:56:42 crc kubenswrapper[4733]: I1206 05:56:42.989088 4733 patch_prober.go:28] interesting pod/machine-config-daemon-g7qjx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 05:56:42 crc kubenswrapper[4733]: I1206 05:56:42.989823 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.144969 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-ht2mw"] Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.146680 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-ht2mw" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.151839 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-6qj7k" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.156989 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-vrztj"] Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.157889 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-vrztj" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.158421 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-ht2mw"] Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.162124 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-vrztj"] Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.164249 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-vhgnq" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.171445 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-rgm8m"] Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.173859 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-rgm8m" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.188084 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-xcx5w"] Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.188734 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-x4l5m" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.190439 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-xcx5w" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.194789 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-rgm8m"] Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.201762 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-bk6qw" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.233625 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-xcx5w"] Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.248477 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-b64rz"] Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.249552 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-b64rz" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.252728 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-t2lqj" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.265486 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-b64rz"] Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.272183 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-2lblw"] Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.273183 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-hcfrz"] Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.273712 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-2lblw" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.274203 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-hcfrz" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.275166 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-qtxjk" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.275480 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.275527 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-kbfrw" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.282479 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-hcfrz"] Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.290408 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-t7chb"] Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.291550 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-t7chb" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.295417 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-5wdmx" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.298372 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-2lblw"] Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.300587 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-t7chb"] Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.301708 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlm66\" (UniqueName: \"kubernetes.io/projected/8bbee3a7-9d6f-40d8-a5c2-eee560458e41-kube-api-access-nlm66\") pod \"cinder-operator-controller-manager-859b6ccc6-vrztj\" (UID: \"8bbee3a7-9d6f-40d8-a5c2-eee560458e41\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-vrztj" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.301748 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gx8wl\" (UniqueName: \"kubernetes.io/projected/f138e9fa-e1ea-4b04-b938-0c16b8205fbe-kube-api-access-gx8wl\") pod \"barbican-operator-controller-manager-7d9dfd778-ht2mw\" (UID: \"f138e9fa-e1ea-4b04-b938-0c16b8205fbe\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-ht2mw" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.301822 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrwv9\" (UniqueName: \"kubernetes.io/projected/715c2050-78f9-4609-9575-a1c85c3b4961-kube-api-access-wrwv9\") pod 
\"designate-operator-controller-manager-78b4bc895b-rgm8m\" (UID: \"715c2050-78f9-4609-9575-a1c85c3b4961\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-rgm8m" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.301856 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jm6zv\" (UniqueName: \"kubernetes.io/projected/42977681-d2c6-4ddb-848a-751503543ed4-kube-api-access-jm6zv\") pod \"glance-operator-controller-manager-77987cd8cd-xcx5w\" (UID: \"42977681-d2c6-4ddb-848a-751503543ed4\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-xcx5w" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.308486 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-c9lc9"] Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.310409 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-c9lc9" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.314374 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-vk86m" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.314854 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-c9lc9"] Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.322671 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-26bnr"] Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.323487 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-26bnr" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.325195 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-7ns2j" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.334362 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-26bnr"] Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.337445 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-7kmwq"] Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.341818 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-7kmwq" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.343481 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-j9mvd"] Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.343491 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-vtt9s" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.344525 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-j9mvd" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.346822 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-7kmwq"] Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.348131 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-pkk8h" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.353602 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-kcq5s"] Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.357069 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-kcq5s" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.361216 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-x6rp4" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.368922 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-j9mvd"] Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.381360 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-wwdnq"] Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.382367 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-wwdnq" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.383744 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-c7b6z" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.384576 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-kcq5s"] Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.389259 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-wwdnq"] Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.400040 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-mcr2h"] Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.401198 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-mcr2h" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.402873 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlm66\" (UniqueName: \"kubernetes.io/projected/8bbee3a7-9d6f-40d8-a5c2-eee560458e41-kube-api-access-nlm66\") pod \"cinder-operator-controller-manager-859b6ccc6-vrztj\" (UID: \"8bbee3a7-9d6f-40d8-a5c2-eee560458e41\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-vrztj" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.402927 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ct8nc\" (UniqueName: \"kubernetes.io/projected/fc37a812-3bfe-4e10-ba93-4e8fdc45361f-kube-api-access-ct8nc\") pod \"mariadb-operator-controller-manager-56bbcc9d85-7kmwq\" (UID: \"fc37a812-3bfe-4e10-ba93-4e8fdc45361f\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-7kmwq" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.402963 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gx8wl\" (UniqueName: \"kubernetes.io/projected/f138e9fa-e1ea-4b04-b938-0c16b8205fbe-kube-api-access-gx8wl\") pod \"barbican-operator-controller-manager-7d9dfd778-ht2mw\" (UID: \"f138e9fa-e1ea-4b04-b938-0c16b8205fbe\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-ht2mw" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.403007 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwzr8\" (UniqueName: \"kubernetes.io/projected/0d1ec2a9-eb8b-48b1-a823-129b8cc68129-kube-api-access-gwzr8\") pod \"infra-operator-controller-manager-57548d458d-hcfrz\" (UID: \"0d1ec2a9-eb8b-48b1-a823-129b8cc68129\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-hcfrz" Dec 06 05:56:49 crc 
kubenswrapper[4733]: I1206 05:56:49.403021 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-wqb7m" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.403033 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqlwm\" (UniqueName: \"kubernetes.io/projected/bccabc0c-9ad2-47f8-b550-8bff11a103e8-kube-api-access-jqlwm\") pod \"manila-operator-controller-manager-7c79b5df47-26bnr\" (UID: \"bccabc0c-9ad2-47f8-b550-8bff11a103e8\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-26bnr" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.403086 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d1ec2a9-eb8b-48b1-a823-129b8cc68129-cert\") pod \"infra-operator-controller-manager-57548d458d-hcfrz\" (UID: \"0d1ec2a9-eb8b-48b1-a823-129b8cc68129\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-hcfrz" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.403107 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrwv9\" (UniqueName: \"kubernetes.io/projected/715c2050-78f9-4609-9575-a1c85c3b4961-kube-api-access-wrwv9\") pod \"designate-operator-controller-manager-78b4bc895b-rgm8m\" (UID: \"715c2050-78f9-4609-9575-a1c85c3b4961\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-rgm8m" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.403130 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kl2w9\" (UniqueName: \"kubernetes.io/projected/efc4f270-9152-42b0-bd6c-074697502758-kube-api-access-kl2w9\") pod \"ironic-operator-controller-manager-6c548fd776-t7chb\" (UID: \"efc4f270-9152-42b0-bd6c-074697502758\") " 
pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-t7chb" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.403158 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6wv8\" (UniqueName: \"kubernetes.io/projected/45064622-664d-4424-a01c-0cf85f653a67-kube-api-access-t6wv8\") pod \"keystone-operator-controller-manager-7765d96ddf-c9lc9\" (UID: \"45064622-664d-4424-a01c-0cf85f653a67\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-c9lc9" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.403175 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nx752\" (UniqueName: \"kubernetes.io/projected/018f851e-0c42-4bbd-bea7-7ce45a6e6ebb-kube-api-access-nx752\") pod \"heat-operator-controller-manager-5f64f6f8bb-b64rz\" (UID: \"018f851e-0c42-4bbd-bea7-7ce45a6e6ebb\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-b64rz" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.403203 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jm6zv\" (UniqueName: \"kubernetes.io/projected/42977681-d2c6-4ddb-848a-751503543ed4-kube-api-access-jm6zv\") pod \"glance-operator-controller-manager-77987cd8cd-xcx5w\" (UID: \"42977681-d2c6-4ddb-848a-751503543ed4\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-xcx5w" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.403222 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgxqm\" (UniqueName: \"kubernetes.io/projected/331a0926-1e6c-4976-b309-b20537eae22a-kube-api-access-fgxqm\") pod \"horizon-operator-controller-manager-68c6d99b8f-2lblw\" (UID: \"331a0926-1e6c-4976-b309-b20537eae22a\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-2lblw" Dec 06 05:56:49 crc 
kubenswrapper[4733]: I1206 05:56:49.405524 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f58g9vl"] Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.406476 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f58g9vl" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.407738 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.407889 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-jfgf9" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.411365 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-h9stb"] Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.413337 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-h9stb" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.416756 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-mcr2h"] Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.418485 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-mvpsv" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.424120 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jm6zv\" (UniqueName: \"kubernetes.io/projected/42977681-d2c6-4ddb-848a-751503543ed4-kube-api-access-jm6zv\") pod \"glance-operator-controller-manager-77987cd8cd-xcx5w\" (UID: \"42977681-d2c6-4ddb-848a-751503543ed4\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-xcx5w" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.424844 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlm66\" (UniqueName: \"kubernetes.io/projected/8bbee3a7-9d6f-40d8-a5c2-eee560458e41-kube-api-access-nlm66\") pod \"cinder-operator-controller-manager-859b6ccc6-vrztj\" (UID: \"8bbee3a7-9d6f-40d8-a5c2-eee560458e41\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-vrztj" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.424120 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrwv9\" (UniqueName: \"kubernetes.io/projected/715c2050-78f9-4609-9575-a1c85c3b4961-kube-api-access-wrwv9\") pod \"designate-operator-controller-manager-78b4bc895b-rgm8m\" (UID: \"715c2050-78f9-4609-9575-a1c85c3b4961\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-rgm8m" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.427709 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-gx8wl\" (UniqueName: \"kubernetes.io/projected/f138e9fa-e1ea-4b04-b938-0c16b8205fbe-kube-api-access-gx8wl\") pod \"barbican-operator-controller-manager-7d9dfd778-ht2mw\" (UID: \"f138e9fa-e1ea-4b04-b938-0c16b8205fbe\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-ht2mw" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.444447 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-h9stb"] Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.453909 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f58g9vl"] Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.462252 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-ht2mw" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.479454 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-vrztj" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.485238 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-pv7dl"] Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.488342 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-pv7dl" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.490258 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-p5jcj" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.492273 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-pv7dl"] Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.500575 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-rgm8m" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.504032 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nx752\" (UniqueName: \"kubernetes.io/projected/018f851e-0c42-4bbd-bea7-7ce45a6e6ebb-kube-api-access-nx752\") pod \"heat-operator-controller-manager-5f64f6f8bb-b64rz\" (UID: \"018f851e-0c42-4bbd-bea7-7ce45a6e6ebb\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-b64rz" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.504102 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgxqm\" (UniqueName: \"kubernetes.io/projected/331a0926-1e6c-4976-b309-b20537eae22a-kube-api-access-fgxqm\") pod \"horizon-operator-controller-manager-68c6d99b8f-2lblw\" (UID: \"331a0926-1e6c-4976-b309-b20537eae22a\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-2lblw" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.504138 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqc5b\" (UniqueName: \"kubernetes.io/projected/1ed48735-3f0e-4777-b3ce-54a09caec1ab-kube-api-access-gqc5b\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-j9mvd\" (UID: 
\"1ed48735-3f0e-4777-b3ce-54a09caec1ab\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-j9mvd" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.505134 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6h9f\" (UniqueName: \"kubernetes.io/projected/d46b59ae-938e-49f6-a9aa-2f78495634c3-kube-api-access-h6h9f\") pod \"placement-operator-controller-manager-78f8948974-h9stb\" (UID: \"d46b59ae-938e-49f6-a9aa-2f78495634c3\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-h9stb" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.505168 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gkvf\" (UniqueName: \"kubernetes.io/projected/a20c3b8a-0e57-4ba7-92f2-bf01e12bfedb-kube-api-access-4gkvf\") pod \"nova-operator-controller-manager-697bc559fc-kcq5s\" (UID: \"a20c3b8a-0e57-4ba7-92f2-bf01e12bfedb\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-kcq5s" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.505195 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ct8nc\" (UniqueName: \"kubernetes.io/projected/fc37a812-3bfe-4e10-ba93-4e8fdc45361f-kube-api-access-ct8nc\") pod \"mariadb-operator-controller-manager-56bbcc9d85-7kmwq\" (UID: \"fc37a812-3bfe-4e10-ba93-4e8fdc45361f\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-7kmwq" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.505239 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6f8m\" (UniqueName: \"kubernetes.io/projected/1bd3247c-9536-44e7-8857-c9fe8aa31383-kube-api-access-d6f8m\") pod \"octavia-operator-controller-manager-998648c74-wwdnq\" (UID: \"1bd3247c-9536-44e7-8857-c9fe8aa31383\") " 
pod="openstack-operators/octavia-operator-controller-manager-998648c74-wwdnq" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.505282 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwzr8\" (UniqueName: \"kubernetes.io/projected/0d1ec2a9-eb8b-48b1-a823-129b8cc68129-kube-api-access-gwzr8\") pod \"infra-operator-controller-manager-57548d458d-hcfrz\" (UID: \"0d1ec2a9-eb8b-48b1-a823-129b8cc68129\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-hcfrz" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.505717 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqlwm\" (UniqueName: \"kubernetes.io/projected/bccabc0c-9ad2-47f8-b550-8bff11a103e8-kube-api-access-jqlwm\") pod \"manila-operator-controller-manager-7c79b5df47-26bnr\" (UID: \"bccabc0c-9ad2-47f8-b550-8bff11a103e8\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-26bnr" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.505753 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bfb7e815-5af6-428e-bfca-d47d2a7a3022-cert\") pod \"openstack-baremetal-operator-controller-manager-55c85496f58g9vl\" (UID: \"bfb7e815-5af6-428e-bfca-d47d2a7a3022\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f58g9vl" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.505790 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d1ec2a9-eb8b-48b1-a823-129b8cc68129-cert\") pod \"infra-operator-controller-manager-57548d458d-hcfrz\" (UID: \"0d1ec2a9-eb8b-48b1-a823-129b8cc68129\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-hcfrz" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.505815 4733 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-kl2w9\" (UniqueName: \"kubernetes.io/projected/efc4f270-9152-42b0-bd6c-074697502758-kube-api-access-kl2w9\") pod \"ironic-operator-controller-manager-6c548fd776-t7chb\" (UID: \"efc4f270-9152-42b0-bd6c-074697502758\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-t7chb" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.505841 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85g57\" (UniqueName: \"kubernetes.io/projected/352c73f3-4cd8-4c2b-a5ba-52c5bc1f78ad-kube-api-access-85g57\") pod \"ovn-operator-controller-manager-b6456fdb6-mcr2h\" (UID: \"352c73f3-4cd8-4c2b-a5ba-52c5bc1f78ad\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-mcr2h" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.505862 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6wv8\" (UniqueName: \"kubernetes.io/projected/45064622-664d-4424-a01c-0cf85f653a67-kube-api-access-t6wv8\") pod \"keystone-operator-controller-manager-7765d96ddf-c9lc9\" (UID: \"45064622-664d-4424-a01c-0cf85f653a67\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-c9lc9" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.505879 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6khj\" (UniqueName: \"kubernetes.io/projected/bfb7e815-5af6-428e-bfca-d47d2a7a3022-kube-api-access-w6khj\") pod \"openstack-baremetal-operator-controller-manager-55c85496f58g9vl\" (UID: \"bfb7e815-5af6-428e-bfca-d47d2a7a3022\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f58g9vl" Dec 06 05:56:49 crc kubenswrapper[4733]: E1206 05:56:49.508091 4733 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret 
"infra-operator-webhook-server-cert" not found Dec 06 05:56:49 crc kubenswrapper[4733]: E1206 05:56:49.508167 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d1ec2a9-eb8b-48b1-a823-129b8cc68129-cert podName:0d1ec2a9-eb8b-48b1-a823-129b8cc68129 nodeName:}" failed. No retries permitted until 2025-12-06 05:56:50.00814682 +0000 UTC m=+793.873357931 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0d1ec2a9-eb8b-48b1-a823-129b8cc68129-cert") pod "infra-operator-controller-manager-57548d458d-hcfrz" (UID: "0d1ec2a9-eb8b-48b1-a823-129b8cc68129") : secret "infra-operator-webhook-server-cert" not found Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.515714 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-xcx5w" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.520016 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nx752\" (UniqueName: \"kubernetes.io/projected/018f851e-0c42-4bbd-bea7-7ce45a6e6ebb-kube-api-access-nx752\") pod \"heat-operator-controller-manager-5f64f6f8bb-b64rz\" (UID: \"018f851e-0c42-4bbd-bea7-7ce45a6e6ebb\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-b64rz" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.521233 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqlwm\" (UniqueName: \"kubernetes.io/projected/bccabc0c-9ad2-47f8-b550-8bff11a103e8-kube-api-access-jqlwm\") pod \"manila-operator-controller-manager-7c79b5df47-26bnr\" (UID: \"bccabc0c-9ad2-47f8-b550-8bff11a103e8\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-26bnr" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.522453 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwzr8\" (UniqueName: 
\"kubernetes.io/projected/0d1ec2a9-eb8b-48b1-a823-129b8cc68129-kube-api-access-gwzr8\") pod \"infra-operator-controller-manager-57548d458d-hcfrz\" (UID: \"0d1ec2a9-eb8b-48b1-a823-129b8cc68129\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-hcfrz" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.522476 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgxqm\" (UniqueName: \"kubernetes.io/projected/331a0926-1e6c-4976-b309-b20537eae22a-kube-api-access-fgxqm\") pod \"horizon-operator-controller-manager-68c6d99b8f-2lblw\" (UID: \"331a0926-1e6c-4976-b309-b20537eae22a\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-2lblw" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.526876 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6wv8\" (UniqueName: \"kubernetes.io/projected/45064622-664d-4424-a01c-0cf85f653a67-kube-api-access-t6wv8\") pod \"keystone-operator-controller-manager-7765d96ddf-c9lc9\" (UID: \"45064622-664d-4424-a01c-0cf85f653a67\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-c9lc9" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.527388 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kl2w9\" (UniqueName: \"kubernetes.io/projected/efc4f270-9152-42b0-bd6c-074697502758-kube-api-access-kl2w9\") pod \"ironic-operator-controller-manager-6c548fd776-t7chb\" (UID: \"efc4f270-9152-42b0-bd6c-074697502758\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-t7chb" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.556231 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ct8nc\" (UniqueName: \"kubernetes.io/projected/fc37a812-3bfe-4e10-ba93-4e8fdc45361f-kube-api-access-ct8nc\") pod \"mariadb-operator-controller-manager-56bbcc9d85-7kmwq\" (UID: 
\"fc37a812-3bfe-4e10-ba93-4e8fdc45361f\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-7kmwq" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.564341 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-b64rz" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.573176 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2m7gg"] Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.574395 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2m7gg" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.577583 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-qd8xw" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.584798 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2m7gg"] Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.616388 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqc5b\" (UniqueName: \"kubernetes.io/projected/1ed48735-3f0e-4777-b3ce-54a09caec1ab-kube-api-access-gqc5b\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-j9mvd\" (UID: \"1ed48735-3f0e-4777-b3ce-54a09caec1ab\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-j9mvd" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.616454 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkvp4\" (UniqueName: \"kubernetes.io/projected/5b3eaa67-83e3-4c9a-bfeb-c315e4f5ac7c-kube-api-access-bkvp4\") pod \"swift-operator-controller-manager-5f8c65bbfc-pv7dl\" (UID: 
\"5b3eaa67-83e3-4c9a-bfeb-c315e4f5ac7c\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-pv7dl" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.616478 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6h9f\" (UniqueName: \"kubernetes.io/projected/d46b59ae-938e-49f6-a9aa-2f78495634c3-kube-api-access-h6h9f\") pod \"placement-operator-controller-manager-78f8948974-h9stb\" (UID: \"d46b59ae-938e-49f6-a9aa-2f78495634c3\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-h9stb" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.616516 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gkvf\" (UniqueName: \"kubernetes.io/projected/a20c3b8a-0e57-4ba7-92f2-bf01e12bfedb-kube-api-access-4gkvf\") pod \"nova-operator-controller-manager-697bc559fc-kcq5s\" (UID: \"a20c3b8a-0e57-4ba7-92f2-bf01e12bfedb\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-kcq5s" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.616560 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6f8m\" (UniqueName: \"kubernetes.io/projected/1bd3247c-9536-44e7-8857-c9fe8aa31383-kube-api-access-d6f8m\") pod \"octavia-operator-controller-manager-998648c74-wwdnq\" (UID: \"1bd3247c-9536-44e7-8857-c9fe8aa31383\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-wwdnq" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.616612 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bfb7e815-5af6-428e-bfca-d47d2a7a3022-cert\") pod \"openstack-baremetal-operator-controller-manager-55c85496f58g9vl\" (UID: \"bfb7e815-5af6-428e-bfca-d47d2a7a3022\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f58g9vl" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 
05:56:49.616676 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85g57\" (UniqueName: \"kubernetes.io/projected/352c73f3-4cd8-4c2b-a5ba-52c5bc1f78ad-kube-api-access-85g57\") pod \"ovn-operator-controller-manager-b6456fdb6-mcr2h\" (UID: \"352c73f3-4cd8-4c2b-a5ba-52c5bc1f78ad\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-mcr2h" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.616696 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6khj\" (UniqueName: \"kubernetes.io/projected/bfb7e815-5af6-428e-bfca-d47d2a7a3022-kube-api-access-w6khj\") pod \"openstack-baremetal-operator-controller-manager-55c85496f58g9vl\" (UID: \"bfb7e815-5af6-428e-bfca-d47d2a7a3022\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f58g9vl" Dec 06 05:56:49 crc kubenswrapper[4733]: E1206 05:56:49.617919 4733 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 06 05:56:49 crc kubenswrapper[4733]: E1206 05:56:49.617967 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfb7e815-5af6-428e-bfca-d47d2a7a3022-cert podName:bfb7e815-5af6-428e-bfca-d47d2a7a3022 nodeName:}" failed. No retries permitted until 2025-12-06 05:56:50.117950921 +0000 UTC m=+793.983162032 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bfb7e815-5af6-428e-bfca-d47d2a7a3022-cert") pod "openstack-baremetal-operator-controller-manager-55c85496f58g9vl" (UID: "bfb7e815-5af6-428e-bfca-d47d2a7a3022") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.619547 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-2lblw" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.625893 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-t7chb" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.634034 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-c9lc9" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.644479 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-26bnr" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.650845 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6f8m\" (UniqueName: \"kubernetes.io/projected/1bd3247c-9536-44e7-8857-c9fe8aa31383-kube-api-access-d6f8m\") pod \"octavia-operator-controller-manager-998648c74-wwdnq\" (UID: \"1bd3247c-9536-44e7-8857-c9fe8aa31383\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-wwdnq" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.651182 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6h9f\" (UniqueName: \"kubernetes.io/projected/d46b59ae-938e-49f6-a9aa-2f78495634c3-kube-api-access-h6h9f\") pod \"placement-operator-controller-manager-78f8948974-h9stb\" (UID: \"d46b59ae-938e-49f6-a9aa-2f78495634c3\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-h9stb" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.664473 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-7kmwq" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.669896 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85g57\" (UniqueName: \"kubernetes.io/projected/352c73f3-4cd8-4c2b-a5ba-52c5bc1f78ad-kube-api-access-85g57\") pod \"ovn-operator-controller-manager-b6456fdb6-mcr2h\" (UID: \"352c73f3-4cd8-4c2b-a5ba-52c5bc1f78ad\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-mcr2h" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.683629 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-nbzjs"] Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.684693 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-nbzjs" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.685809 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gkvf\" (UniqueName: \"kubernetes.io/projected/a20c3b8a-0e57-4ba7-92f2-bf01e12bfedb-kube-api-access-4gkvf\") pod \"nova-operator-controller-manager-697bc559fc-kcq5s\" (UID: \"a20c3b8a-0e57-4ba7-92f2-bf01e12bfedb\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-kcq5s" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.689978 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6khj\" (UniqueName: \"kubernetes.io/projected/bfb7e815-5af6-428e-bfca-d47d2a7a3022-kube-api-access-w6khj\") pod \"openstack-baremetal-operator-controller-manager-55c85496f58g9vl\" (UID: \"bfb7e815-5af6-428e-bfca-d47d2a7a3022\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f58g9vl" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.690744 4733 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack-operators"/"test-operator-controller-manager-dockercfg-g6chn" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.691012 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-nbzjs"] Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.698759 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqc5b\" (UniqueName: \"kubernetes.io/projected/1ed48735-3f0e-4777-b3ce-54a09caec1ab-kube-api-access-gqc5b\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-j9mvd\" (UID: \"1ed48735-3f0e-4777-b3ce-54a09caec1ab\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-j9mvd" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.699046 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-wwdnq" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.715052 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-mcr2h" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.718211 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkvp4\" (UniqueName: \"kubernetes.io/projected/5b3eaa67-83e3-4c9a-bfeb-c315e4f5ac7c-kube-api-access-bkvp4\") pod \"swift-operator-controller-manager-5f8c65bbfc-pv7dl\" (UID: \"5b3eaa67-83e3-4c9a-bfeb-c315e4f5ac7c\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-pv7dl" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.718277 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dfrm\" (UniqueName: \"kubernetes.io/projected/eef61090-130b-4d9d-99e8-6cc4bff0b467-kube-api-access-5dfrm\") pod \"telemetry-operator-controller-manager-76cc84c6bb-2m7gg\" (UID: \"eef61090-130b-4d9d-99e8-6cc4bff0b467\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2m7gg" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.744145 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkvp4\" (UniqueName: \"kubernetes.io/projected/5b3eaa67-83e3-4c9a-bfeb-c315e4f5ac7c-kube-api-access-bkvp4\") pod \"swift-operator-controller-manager-5f8c65bbfc-pv7dl\" (UID: \"5b3eaa67-83e3-4c9a-bfeb-c315e4f5ac7c\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-pv7dl" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.758049 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-h9stb" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.782737 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-9p759"] Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.783968 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-9p759" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.787031 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-vzfhn" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.822586 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hp768\" (UniqueName: \"kubernetes.io/projected/e744adbb-1e4c-4461-8892-799f8a42976f-kube-api-access-hp768\") pod \"test-operator-controller-manager-5854674fcc-nbzjs\" (UID: \"e744adbb-1e4c-4461-8892-799f8a42976f\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-nbzjs" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.822886 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dfrm\" (UniqueName: \"kubernetes.io/projected/eef61090-130b-4d9d-99e8-6cc4bff0b467-kube-api-access-5dfrm\") pod \"telemetry-operator-controller-manager-76cc84c6bb-2m7gg\" (UID: \"eef61090-130b-4d9d-99e8-6cc4bff0b467\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2m7gg" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.823909 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-9p759"] Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.845919 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-5dfrm\" (UniqueName: \"kubernetes.io/projected/eef61090-130b-4d9d-99e8-6cc4bff0b467-kube-api-access-5dfrm\") pod \"telemetry-operator-controller-manager-76cc84c6bb-2m7gg\" (UID: \"eef61090-130b-4d9d-99e8-6cc4bff0b467\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2m7gg" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.889149 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-54bdf956c4-ckqkj"] Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.890071 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-ckqkj" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.894172 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-p2tlz" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.894326 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.894874 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.902942 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-54bdf956c4-ckqkj"] Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.916815 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-pv7dl" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.926585 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hp768\" (UniqueName: \"kubernetes.io/projected/e744adbb-1e4c-4461-8892-799f8a42976f-kube-api-access-hp768\") pod \"test-operator-controller-manager-5854674fcc-nbzjs\" (UID: \"e744adbb-1e4c-4461-8892-799f8a42976f\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-nbzjs" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.926629 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zg8kv\" (UniqueName: \"kubernetes.io/projected/62d11d7c-5132-4e22-9780-2ff475c07618-kube-api-access-zg8kv\") pod \"watcher-operator-controller-manager-769dc69bc-9p759\" (UID: \"62d11d7c-5132-4e22-9780-2ff475c07618\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-9p759" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.929255 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2m7gg" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.960462 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-cqrpd"] Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.961433 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-cqrpd" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.964916 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-q9lz4" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.965933 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hp768\" (UniqueName: \"kubernetes.io/projected/e744adbb-1e4c-4461-8892-799f8a42976f-kube-api-access-hp768\") pod \"test-operator-controller-manager-5854674fcc-nbzjs\" (UID: \"e744adbb-1e4c-4461-8892-799f8a42976f\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-nbzjs" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.980916 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-cqrpd"] Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.981717 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-j9mvd" Dec 06 05:56:49 crc kubenswrapper[4733]: I1206 05:56:49.983820 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-kcq5s" Dec 06 05:56:50 crc kubenswrapper[4733]: I1206 05:56:50.026070 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-ht2mw"] Dec 06 05:56:50 crc kubenswrapper[4733]: I1206 05:56:50.026770 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-nbzjs" Dec 06 05:56:50 crc kubenswrapper[4733]: I1206 05:56:50.028921 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d1ec2a9-eb8b-48b1-a823-129b8cc68129-cert\") pod \"infra-operator-controller-manager-57548d458d-hcfrz\" (UID: \"0d1ec2a9-eb8b-48b1-a823-129b8cc68129\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-hcfrz" Dec 06 05:56:50 crc kubenswrapper[4733]: I1206 05:56:50.029012 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zd9rv\" (UniqueName: \"kubernetes.io/projected/4f778c13-06e7-4b71-98b8-28e3165cdf8b-kube-api-access-zd9rv\") pod \"openstack-operator-controller-manager-54bdf956c4-ckqkj\" (UID: \"4f778c13-06e7-4b71-98b8-28e3165cdf8b\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-ckqkj" Dec 06 05:56:50 crc kubenswrapper[4733]: I1206 05:56:50.029040 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zg8kv\" (UniqueName: \"kubernetes.io/projected/62d11d7c-5132-4e22-9780-2ff475c07618-kube-api-access-zg8kv\") pod \"watcher-operator-controller-manager-769dc69bc-9p759\" (UID: \"62d11d7c-5132-4e22-9780-2ff475c07618\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-9p759" Dec 06 05:56:50 crc kubenswrapper[4733]: I1206 05:56:50.029248 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4f778c13-06e7-4b71-98b8-28e3165cdf8b-webhook-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-ckqkj\" (UID: \"4f778c13-06e7-4b71-98b8-28e3165cdf8b\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-ckqkj" Dec 06 05:56:50 crc kubenswrapper[4733]: I1206 05:56:50.029272 4733 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4f778c13-06e7-4b71-98b8-28e3165cdf8b-metrics-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-ckqkj\" (UID: \"4f778c13-06e7-4b71-98b8-28e3165cdf8b\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-ckqkj" Dec 06 05:56:50 crc kubenswrapper[4733]: E1206 05:56:50.029497 4733 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 06 05:56:50 crc kubenswrapper[4733]: E1206 05:56:50.029548 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d1ec2a9-eb8b-48b1-a823-129b8cc68129-cert podName:0d1ec2a9-eb8b-48b1-a823-129b8cc68129 nodeName:}" failed. No retries permitted until 2025-12-06 05:56:51.029533177 +0000 UTC m=+794.894744289 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0d1ec2a9-eb8b-48b1-a823-129b8cc68129-cert") pod "infra-operator-controller-manager-57548d458d-hcfrz" (UID: "0d1ec2a9-eb8b-48b1-a823-129b8cc68129") : secret "infra-operator-webhook-server-cert" not found Dec 06 05:56:50 crc kubenswrapper[4733]: I1206 05:56:50.047497 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zg8kv\" (UniqueName: \"kubernetes.io/projected/62d11d7c-5132-4e22-9780-2ff475c07618-kube-api-access-zg8kv\") pod \"watcher-operator-controller-manager-769dc69bc-9p759\" (UID: \"62d11d7c-5132-4e22-9780-2ff475c07618\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-9p759" Dec 06 05:56:50 crc kubenswrapper[4733]: I1206 05:56:50.131421 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bfb7e815-5af6-428e-bfca-d47d2a7a3022-cert\") pod 
\"openstack-baremetal-operator-controller-manager-55c85496f58g9vl\" (UID: \"bfb7e815-5af6-428e-bfca-d47d2a7a3022\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f58g9vl" Dec 06 05:56:50 crc kubenswrapper[4733]: I1206 05:56:50.131660 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zd9rv\" (UniqueName: \"kubernetes.io/projected/4f778c13-06e7-4b71-98b8-28e3165cdf8b-kube-api-access-zd9rv\") pod \"openstack-operator-controller-manager-54bdf956c4-ckqkj\" (UID: \"4f778c13-06e7-4b71-98b8-28e3165cdf8b\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-ckqkj" Dec 06 05:56:50 crc kubenswrapper[4733]: I1206 05:56:50.131724 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4f778c13-06e7-4b71-98b8-28e3165cdf8b-webhook-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-ckqkj\" (UID: \"4f778c13-06e7-4b71-98b8-28e3165cdf8b\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-ckqkj" Dec 06 05:56:50 crc kubenswrapper[4733]: I1206 05:56:50.131772 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4f778c13-06e7-4b71-98b8-28e3165cdf8b-metrics-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-ckqkj\" (UID: \"4f778c13-06e7-4b71-98b8-28e3165cdf8b\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-ckqkj" Dec 06 05:56:50 crc kubenswrapper[4733]: I1206 05:56:50.131872 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kx2bp\" (UniqueName: \"kubernetes.io/projected/4f2d4dbb-c7fb-46b3-8baf-fb1ac61a12ed-kube-api-access-kx2bp\") pod \"rabbitmq-cluster-operator-manager-668c99d594-cqrpd\" (UID: \"4f2d4dbb-c7fb-46b3-8baf-fb1ac61a12ed\") " 
pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-cqrpd" Dec 06 05:56:50 crc kubenswrapper[4733]: E1206 05:56:50.132365 4733 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 06 05:56:50 crc kubenswrapper[4733]: E1206 05:56:50.132499 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfb7e815-5af6-428e-bfca-d47d2a7a3022-cert podName:bfb7e815-5af6-428e-bfca-d47d2a7a3022 nodeName:}" failed. No retries permitted until 2025-12-06 05:56:51.132454514 +0000 UTC m=+794.997665626 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bfb7e815-5af6-428e-bfca-d47d2a7a3022-cert") pod "openstack-baremetal-operator-controller-manager-55c85496f58g9vl" (UID: "bfb7e815-5af6-428e-bfca-d47d2a7a3022") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 06 05:56:50 crc kubenswrapper[4733]: E1206 05:56:50.133507 4733 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 06 05:56:50 crc kubenswrapper[4733]: E1206 05:56:50.133574 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f778c13-06e7-4b71-98b8-28e3165cdf8b-webhook-certs podName:4f778c13-06e7-4b71-98b8-28e3165cdf8b nodeName:}" failed. No retries permitted until 2025-12-06 05:56:50.63356352 +0000 UTC m=+794.498774632 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/4f778c13-06e7-4b71-98b8-28e3165cdf8b-webhook-certs") pod "openstack-operator-controller-manager-54bdf956c4-ckqkj" (UID: "4f778c13-06e7-4b71-98b8-28e3165cdf8b") : secret "webhook-server-cert" not found Dec 06 05:56:50 crc kubenswrapper[4733]: E1206 05:56:50.133902 4733 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 06 05:56:50 crc kubenswrapper[4733]: E1206 05:56:50.133950 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f778c13-06e7-4b71-98b8-28e3165cdf8b-metrics-certs podName:4f778c13-06e7-4b71-98b8-28e3165cdf8b nodeName:}" failed. No retries permitted until 2025-12-06 05:56:50.633942253 +0000 UTC m=+794.499153364 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4f778c13-06e7-4b71-98b8-28e3165cdf8b-metrics-certs") pod "openstack-operator-controller-manager-54bdf956c4-ckqkj" (UID: "4f778c13-06e7-4b71-98b8-28e3165cdf8b") : secret "metrics-server-cert" not found Dec 06 05:56:50 crc kubenswrapper[4733]: I1206 05:56:50.142152 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-9p759" Dec 06 05:56:50 crc kubenswrapper[4733]: I1206 05:56:50.155889 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-vrztj"] Dec 06 05:56:50 crc kubenswrapper[4733]: I1206 05:56:50.156752 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zd9rv\" (UniqueName: \"kubernetes.io/projected/4f778c13-06e7-4b71-98b8-28e3165cdf8b-kube-api-access-zd9rv\") pod \"openstack-operator-controller-manager-54bdf956c4-ckqkj\" (UID: \"4f778c13-06e7-4b71-98b8-28e3165cdf8b\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-ckqkj" Dec 06 05:56:50 crc kubenswrapper[4733]: W1206 05:56:50.179295 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8bbee3a7_9d6f_40d8_a5c2_eee560458e41.slice/crio-ace7b86d3620cdcf36042b394bfdd0090ca61ee5925107a3a06c9893253d28ce WatchSource:0}: Error finding container ace7b86d3620cdcf36042b394bfdd0090ca61ee5925107a3a06c9893253d28ce: Status 404 returned error can't find the container with id ace7b86d3620cdcf36042b394bfdd0090ca61ee5925107a3a06c9893253d28ce Dec 06 05:56:50 crc kubenswrapper[4733]: I1206 05:56:50.234061 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kx2bp\" (UniqueName: \"kubernetes.io/projected/4f2d4dbb-c7fb-46b3-8baf-fb1ac61a12ed-kube-api-access-kx2bp\") pod \"rabbitmq-cluster-operator-manager-668c99d594-cqrpd\" (UID: \"4f2d4dbb-c7fb-46b3-8baf-fb1ac61a12ed\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-cqrpd" Dec 06 05:56:50 crc kubenswrapper[4733]: I1206 05:56:50.249863 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kx2bp\" (UniqueName: 
\"kubernetes.io/projected/4f2d4dbb-c7fb-46b3-8baf-fb1ac61a12ed-kube-api-access-kx2bp\") pod \"rabbitmq-cluster-operator-manager-668c99d594-cqrpd\" (UID: \"4f2d4dbb-c7fb-46b3-8baf-fb1ac61a12ed\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-cqrpd" Dec 06 05:56:50 crc kubenswrapper[4733]: I1206 05:56:50.321682 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-cqrpd" Dec 06 05:56:50 crc kubenswrapper[4733]: I1206 05:56:50.327366 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-t7chb"] Dec 06 05:56:50 crc kubenswrapper[4733]: I1206 05:56:50.334059 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-rgm8m"] Dec 06 05:56:50 crc kubenswrapper[4733]: W1206 05:56:50.337818 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podefc4f270_9152_42b0_bd6c_074697502758.slice/crio-bf41de3b713b9dde7a1d5e5dd8e03627d6b5e26db0637a75ae393c7ede751a58 WatchSource:0}: Error finding container bf41de3b713b9dde7a1d5e5dd8e03627d6b5e26db0637a75ae393c7ede751a58: Status 404 returned error can't find the container with id bf41de3b713b9dde7a1d5e5dd8e03627d6b5e26db0637a75ae393c7ede751a58 Dec 06 05:56:50 crc kubenswrapper[4733]: W1206 05:56:50.338432 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod715c2050_78f9_4609_9575_a1c85c3b4961.slice/crio-8512bb2b8f0d660429ccea80ae0e33df0ed24a87ed193cb0b0a2d8c450fcadef WatchSource:0}: Error finding container 8512bb2b8f0d660429ccea80ae0e33df0ed24a87ed193cb0b0a2d8c450fcadef: Status 404 returned error can't find the container with id 8512bb2b8f0d660429ccea80ae0e33df0ed24a87ed193cb0b0a2d8c450fcadef Dec 06 05:56:50 crc 
kubenswrapper[4733]: I1206 05:56:50.344046 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-b64rz"] Dec 06 05:56:50 crc kubenswrapper[4733]: I1206 05:56:50.347129 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-xcx5w"] Dec 06 05:56:50 crc kubenswrapper[4733]: W1206 05:56:50.351635 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod018f851e_0c42_4bbd_bea7_7ce45a6e6ebb.slice/crio-c77a9120ed4173b227f1611bc202dae6aa8488896b5ef8dfef8e3525813bf72f WatchSource:0}: Error finding container c77a9120ed4173b227f1611bc202dae6aa8488896b5ef8dfef8e3525813bf72f: Status 404 returned error can't find the container with id c77a9120ed4173b227f1611bc202dae6aa8488896b5ef8dfef8e3525813bf72f Dec 06 05:56:50 crc kubenswrapper[4733]: W1206 05:56:50.354688 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42977681_d2c6_4ddb_848a_751503543ed4.slice/crio-12c0610cfbca7787b428dc722e35d001969e88502b9a3ae07f1f2713cfdb7163 WatchSource:0}: Error finding container 12c0610cfbca7787b428dc722e35d001969e88502b9a3ae07f1f2713cfdb7163: Status 404 returned error can't find the container with id 12c0610cfbca7787b428dc722e35d001969e88502b9a3ae07f1f2713cfdb7163 Dec 06 05:56:50 crc kubenswrapper[4733]: I1206 05:56:50.468293 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-wwdnq"] Dec 06 05:56:50 crc kubenswrapper[4733]: I1206 05:56:50.478520 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-c9lc9"] Dec 06 05:56:50 crc kubenswrapper[4733]: W1206 05:56:50.490956 4733 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45064622_664d_4424_a01c_0cf85f653a67.slice/crio-41a707b4af7c3030ca2161f8d54cdb2c00941ee7ccc718365ea331e06206c7bf WatchSource:0}: Error finding container 41a707b4af7c3030ca2161f8d54cdb2c00941ee7ccc718365ea331e06206c7bf: Status 404 returned error can't find the container with id 41a707b4af7c3030ca2161f8d54cdb2c00941ee7ccc718365ea331e06206c7bf Dec 06 05:56:50 crc kubenswrapper[4733]: I1206 05:56:50.552636 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-7kmwq"] Dec 06 05:56:50 crc kubenswrapper[4733]: I1206 05:56:50.566130 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-mcr2h"] Dec 06 05:56:50 crc kubenswrapper[4733]: W1206 05:56:50.581333 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd46b59ae_938e_49f6_a9aa_2f78495634c3.slice/crio-7f36a64b85055fbe4f97259adde7e7d6f47e282ee5006f53897e6514fdf73429 WatchSource:0}: Error finding container 7f36a64b85055fbe4f97259adde7e7d6f47e282ee5006f53897e6514fdf73429: Status 404 returned error can't find the container with id 7f36a64b85055fbe4f97259adde7e7d6f47e282ee5006f53897e6514fdf73429 Dec 06 05:56:50 crc kubenswrapper[4733]: I1206 05:56:50.584078 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-h9stb"] Dec 06 05:56:50 crc kubenswrapper[4733]: E1206 05:56:50.585093 4733 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-h6h9f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-h9stb_openstack-operators(d46b59ae-938e-49f6-a9aa-2f78495634c3): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 06 05:56:50 crc kubenswrapper[4733]: E1206 05:56:50.586928 4733 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-h6h9f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-h9stb_openstack-operators(d46b59ae-938e-49f6-a9aa-2f78495634c3): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 06 05:56:50 crc kubenswrapper[4733]: E1206 05:56:50.588210 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/placement-operator-controller-manager-78f8948974-h9stb" podUID="d46b59ae-938e-49f6-a9aa-2f78495634c3" Dec 06 05:56:50 crc kubenswrapper[4733]: I1206 05:56:50.591728 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-2lblw"] Dec 06 05:56:50 crc kubenswrapper[4733]: I1206 05:56:50.594756 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-26bnr"] Dec 06 05:56:50 crc kubenswrapper[4733]: I1206 05:56:50.641749 4733 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4f778c13-06e7-4b71-98b8-28e3165cdf8b-webhook-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-ckqkj\" (UID: \"4f778c13-06e7-4b71-98b8-28e3165cdf8b\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-ckqkj" Dec 06 05:56:50 crc kubenswrapper[4733]: I1206 05:56:50.641804 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4f778c13-06e7-4b71-98b8-28e3165cdf8b-metrics-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-ckqkj\" (UID: \"4f778c13-06e7-4b71-98b8-28e3165cdf8b\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-ckqkj" Dec 06 05:56:50 crc kubenswrapper[4733]: E1206 05:56:50.642769 4733 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 06 05:56:50 crc kubenswrapper[4733]: E1206 05:56:50.642848 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f778c13-06e7-4b71-98b8-28e3165cdf8b-metrics-certs podName:4f778c13-06e7-4b71-98b8-28e3165cdf8b nodeName:}" failed. No retries permitted until 2025-12-06 05:56:51.642829425 +0000 UTC m=+795.508040526 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4f778c13-06e7-4b71-98b8-28e3165cdf8b-metrics-certs") pod "openstack-operator-controller-manager-54bdf956c4-ckqkj" (UID: "4f778c13-06e7-4b71-98b8-28e3165cdf8b") : secret "metrics-server-cert" not found Dec 06 05:56:50 crc kubenswrapper[4733]: E1206 05:56:50.643251 4733 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 06 05:56:50 crc kubenswrapper[4733]: E1206 05:56:50.643317 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f778c13-06e7-4b71-98b8-28e3165cdf8b-webhook-certs podName:4f778c13-06e7-4b71-98b8-28e3165cdf8b nodeName:}" failed. No retries permitted until 2025-12-06 05:56:51.643286184 +0000 UTC m=+795.508497295 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/4f778c13-06e7-4b71-98b8-28e3165cdf8b-webhook-certs") pod "openstack-operator-controller-manager-54bdf956c4-ckqkj" (UID: "4f778c13-06e7-4b71-98b8-28e3165cdf8b") : secret "webhook-server-cert" not found Dec 06 05:56:50 crc kubenswrapper[4733]: I1206 05:56:50.670869 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-pv7dl"] Dec 06 05:56:50 crc kubenswrapper[4733]: I1206 05:56:50.676483 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-j9mvd"] Dec 06 05:56:50 crc kubenswrapper[4733]: E1206 05:56:50.679239 4733 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bkvp4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-pv7dl_openstack-operators(5b3eaa67-83e3-4c9a-bfeb-c315e4f5ac7c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 06 05:56:50 crc kubenswrapper[4733]: I1206 05:56:50.679400 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-kcq5s"] Dec 06 05:56:50 crc kubenswrapper[4733]: I1206 05:56:50.682833 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2m7gg"] Dec 06 05:56:50 crc kubenswrapper[4733]: W1206 05:56:50.682884 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda20c3b8a_0e57_4ba7_92f2_bf01e12bfedb.slice/crio-ce955e8cb0f59454cba5c3c9084f08678d8a52b6811a36b73d6e1e7544005a46 WatchSource:0}: Error finding container ce955e8cb0f59454cba5c3c9084f08678d8a52b6811a36b73d6e1e7544005a46: Status 404 returned error can't find the container with id ce955e8cb0f59454cba5c3c9084f08678d8a52b6811a36b73d6e1e7544005a46 Dec 06 05:56:50 crc kubenswrapper[4733]: E1206 05:56:50.683007 4733 
kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bkvp4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-pv7dl_openstack-operators(5b3eaa67-83e3-4c9a-bfeb-c315e4f5ac7c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 06 05:56:50 crc kubenswrapper[4733]: E1206 05:56:50.684386 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" 
pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-pv7dl" podUID="5b3eaa67-83e3-4c9a-bfeb-c315e4f5ac7c" Dec 06 05:56:50 crc kubenswrapper[4733]: E1206 05:56:50.687784 4733 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4gkvf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-kcq5s_openstack-operators(a20c3b8a-0e57-4ba7-92f2-bf01e12bfedb): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 06 05:56:50 crc kubenswrapper[4733]: E1206 05:56:50.690372 4733 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4gkvf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-kcq5s_openstack-operators(a20c3b8a-0e57-4ba7-92f2-bf01e12bfedb): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 06 05:56:50 crc kubenswrapper[4733]: E1206 05:56:50.691678 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-kcq5s" podUID="a20c3b8a-0e57-4ba7-92f2-bf01e12bfedb" Dec 06 05:56:50 crc kubenswrapper[4733]: W1206 05:56:50.693458 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeef61090_130b_4d9d_99e8_6cc4bff0b467.slice/crio-1e6c518bee0359d0a2ae00185ce882c9f55bfc6f38a2778fdedc125b2346cf34 WatchSource:0}: Error finding container 1e6c518bee0359d0a2ae00185ce882c9f55bfc6f38a2778fdedc125b2346cf34: Status 404 returned error can't find the container with id 
1e6c518bee0359d0a2ae00185ce882c9f55bfc6f38a2778fdedc125b2346cf34 Dec 06 05:56:50 crc kubenswrapper[4733]: I1206 05:56:50.695146 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-nbzjs"] Dec 06 05:56:50 crc kubenswrapper[4733]: W1206 05:56:50.695187 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode744adbb_1e4c_4461_8892_799f8a42976f.slice/crio-9af76539acfd92320f15fe8170ee16549c1499e5441c35b47b0a35fb47b01cb1 WatchSource:0}: Error finding container 9af76539acfd92320f15fe8170ee16549c1499e5441c35b47b0a35fb47b01cb1: Status 404 returned error can't find the container with id 9af76539acfd92320f15fe8170ee16549c1499e5441c35b47b0a35fb47b01cb1 Dec 06 05:56:50 crc kubenswrapper[4733]: E1206 05:56:50.699638 4733 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5dfrm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-76cc84c6bb-2m7gg_openstack-operators(eef61090-130b-4d9d-99e8-6cc4bff0b467): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 06 05:56:50 crc kubenswrapper[4733]: E1206 05:56:50.699976 4733 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hp768,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-nbzjs_openstack-operators(e744adbb-1e4c-4461-8892-799f8a42976f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 06 05:56:50 crc kubenswrapper[4733]: E1206 05:56:50.701838 4733 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hp768,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-nbzjs_openstack-operators(e744adbb-1e4c-4461-8892-799f8a42976f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 06 05:56:50 crc kubenswrapper[4733]: E1206 05:56:50.701958 4733 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5dfrm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-76cc84c6bb-2m7gg_openstack-operators(eef61090-130b-4d9d-99e8-6cc4bff0b467): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 06 05:56:50 crc kubenswrapper[4733]: E1206 05:56:50.703264 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-nbzjs" podUID="e744adbb-1e4c-4461-8892-799f8a42976f" Dec 06 05:56:50 crc kubenswrapper[4733]: E1206 05:56:50.703283 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2m7gg" podUID="eef61090-130b-4d9d-99e8-6cc4bff0b467" Dec 06 05:56:50 crc kubenswrapper[4733]: I1206 05:56:50.779369 4733 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-9p759"] Dec 06 05:56:50 crc kubenswrapper[4733]: W1206 05:56:50.781870 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62d11d7c_5132_4e22_9780_2ff475c07618.slice/crio-44c7bd51248fa6bb8b96a114ced752cc2f435278291807583f5d5c37768e16da WatchSource:0}: Error finding container 44c7bd51248fa6bb8b96a114ced752cc2f435278291807583f5d5c37768e16da: Status 404 returned error can't find the container with id 44c7bd51248fa6bb8b96a114ced752cc2f435278291807583f5d5c37768e16da Dec 06 05:56:50 crc kubenswrapper[4733]: W1206 05:56:50.783180 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f2d4dbb_c7fb_46b3_8baf_fb1ac61a12ed.slice/crio-f00e26043238507cbaea8cef66186e92d4bd5ec69088ce9fd005b5fb317da440 WatchSource:0}: Error finding container f00e26043238507cbaea8cef66186e92d4bd5ec69088ce9fd005b5fb317da440: Status 404 returned error can't find the container with id f00e26043238507cbaea8cef66186e92d4bd5ec69088ce9fd005b5fb317da440 Dec 06 05:56:50 crc kubenswrapper[4733]: I1206 05:56:50.783863 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-cqrpd"] Dec 06 05:56:50 crc kubenswrapper[4733]: E1206 05:56:50.785155 4733 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zg8kv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-9p759_openstack-operators(62d11d7c-5132-4e22-9780-2ff475c07618): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 06 05:56:50 crc kubenswrapper[4733]: E1206 05:56:50.785406 4733 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 
500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kx2bp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-cqrpd_openstack-operators(4f2d4dbb-c7fb-46b3-8baf-fb1ac61a12ed): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 06 05:56:50 crc kubenswrapper[4733]: E1206 05:56:50.786615 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-cqrpd" podUID="4f2d4dbb-c7fb-46b3-8baf-fb1ac61a12ed" Dec 06 05:56:50 crc kubenswrapper[4733]: E1206 05:56:50.787638 4733 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true 
--v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zg8kv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-9p759_openstack-operators(62d11d7c-5132-4e22-9780-2ff475c07618): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 06 05:56:50 crc kubenswrapper[4733]: E1206 05:56:50.788915 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-9p759" podUID="62d11d7c-5132-4e22-9780-2ff475c07618" Dec 06 05:56:50 crc kubenswrapper[4733]: I1206 05:56:50.911952 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-ht2mw" event={"ID":"f138e9fa-e1ea-4b04-b938-0c16b8205fbe","Type":"ContainerStarted","Data":"3f1f612610e8724f601dbbec35d0da6019174f858a667361dfef119b62cbac96"} Dec 06 05:56:50 crc kubenswrapper[4733]: I1206 05:56:50.913186 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-7kmwq" event={"ID":"fc37a812-3bfe-4e10-ba93-4e8fdc45361f","Type":"ContainerStarted","Data":"e5e68bd6d036bcde84f9f1f6640bef7491646a844f4408d4b5a1099e23b0a931"} Dec 06 05:56:50 crc kubenswrapper[4733]: I1206 05:56:50.914390 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-b64rz" event={"ID":"018f851e-0c42-4bbd-bea7-7ce45a6e6ebb","Type":"ContainerStarted","Data":"c77a9120ed4173b227f1611bc202dae6aa8488896b5ef8dfef8e3525813bf72f"} Dec 06 05:56:50 crc kubenswrapper[4733]: I1206 05:56:50.915281 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-9p759" event={"ID":"62d11d7c-5132-4e22-9780-2ff475c07618","Type":"ContainerStarted","Data":"44c7bd51248fa6bb8b96a114ced752cc2f435278291807583f5d5c37768e16da"} Dec 06 05:56:50 crc kubenswrapper[4733]: I1206 05:56:50.916274 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-26bnr" event={"ID":"bccabc0c-9ad2-47f8-b550-8bff11a103e8","Type":"ContainerStarted","Data":"30b2c6b1316b4603bb310f19176f26bd15184a321456568899253e08f217a030"} Dec 06 05:56:50 crc kubenswrapper[4733]: E1206 05:56:50.917756 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\", failed to 
\"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-9p759" podUID="62d11d7c-5132-4e22-9780-2ff475c07618" Dec 06 05:56:50 crc kubenswrapper[4733]: I1206 05:56:50.917823 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2m7gg" event={"ID":"eef61090-130b-4d9d-99e8-6cc4bff0b467","Type":"ContainerStarted","Data":"1e6c518bee0359d0a2ae00185ce882c9f55bfc6f38a2778fdedc125b2346cf34"} Dec 06 05:56:50 crc kubenswrapper[4733]: I1206 05:56:50.919483 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-2lblw" event={"ID":"331a0926-1e6c-4976-b309-b20537eae22a","Type":"ContainerStarted","Data":"4dcb9a4869db2db3446ce62114e48457bf59fcc2b279751d89fea94fd43e8f79"} Dec 06 05:56:50 crc kubenswrapper[4733]: E1206 05:56:50.919607 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2m7gg" podUID="eef61090-130b-4d9d-99e8-6cc4bff0b467" Dec 06 05:56:50 crc kubenswrapper[4733]: I1206 05:56:50.921013 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-mcr2h" event={"ID":"352c73f3-4cd8-4c2b-a5ba-52c5bc1f78ad","Type":"ContainerStarted","Data":"3ecf313e19fd384471106a65b6fcce194f10a254c6fc093e93e7d5f1f2e6cd75"} Dec 06 05:56:50 crc 
kubenswrapper[4733]: I1206 05:56:50.922617 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-nbzjs" event={"ID":"e744adbb-1e4c-4461-8892-799f8a42976f","Type":"ContainerStarted","Data":"9af76539acfd92320f15fe8170ee16549c1499e5441c35b47b0a35fb47b01cb1"} Dec 06 05:56:50 crc kubenswrapper[4733]: I1206 05:56:50.924864 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-vrztj" event={"ID":"8bbee3a7-9d6f-40d8-a5c2-eee560458e41","Type":"ContainerStarted","Data":"ace7b86d3620cdcf36042b394bfdd0090ca61ee5925107a3a06c9893253d28ce"} Dec 06 05:56:50 crc kubenswrapper[4733]: E1206 05:56:50.925848 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-nbzjs" podUID="e744adbb-1e4c-4461-8892-799f8a42976f" Dec 06 05:56:50 crc kubenswrapper[4733]: I1206 05:56:50.926407 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-xcx5w" event={"ID":"42977681-d2c6-4ddb-848a-751503543ed4","Type":"ContainerStarted","Data":"12c0610cfbca7787b428dc722e35d001969e88502b9a3ae07f1f2713cfdb7163"} Dec 06 05:56:50 crc kubenswrapper[4733]: I1206 05:56:50.927806 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-c9lc9" 
event={"ID":"45064622-664d-4424-a01c-0cf85f653a67","Type":"ContainerStarted","Data":"41a707b4af7c3030ca2161f8d54cdb2c00941ee7ccc718365ea331e06206c7bf"} Dec 06 05:56:50 crc kubenswrapper[4733]: I1206 05:56:50.929011 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-t7chb" event={"ID":"efc4f270-9152-42b0-bd6c-074697502758","Type":"ContainerStarted","Data":"bf41de3b713b9dde7a1d5e5dd8e03627d6b5e26db0637a75ae393c7ede751a58"} Dec 06 05:56:50 crc kubenswrapper[4733]: I1206 05:56:50.931531 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-cqrpd" event={"ID":"4f2d4dbb-c7fb-46b3-8baf-fb1ac61a12ed","Type":"ContainerStarted","Data":"f00e26043238507cbaea8cef66186e92d4bd5ec69088ce9fd005b5fb317da440"} Dec 06 05:56:50 crc kubenswrapper[4733]: E1206 05:56:50.932773 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-cqrpd" podUID="4f2d4dbb-c7fb-46b3-8baf-fb1ac61a12ed" Dec 06 05:56:50 crc kubenswrapper[4733]: I1206 05:56:50.933235 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-j9mvd" event={"ID":"1ed48735-3f0e-4777-b3ce-54a09caec1ab","Type":"ContainerStarted","Data":"6b41eed1be65eed6ed0022325d66d0d49ff5a8b45aae7b07aa18d3d9a21d284d"} Dec 06 05:56:50 crc kubenswrapper[4733]: I1206 05:56:50.936122 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-h9stb" 
event={"ID":"d46b59ae-938e-49f6-a9aa-2f78495634c3","Type":"ContainerStarted","Data":"7f36a64b85055fbe4f97259adde7e7d6f47e282ee5006f53897e6514fdf73429"} Dec 06 05:56:50 crc kubenswrapper[4733]: E1206 05:56:50.939475 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/placement-operator-controller-manager-78f8948974-h9stb" podUID="d46b59ae-938e-49f6-a9aa-2f78495634c3" Dec 06 05:56:50 crc kubenswrapper[4733]: I1206 05:56:50.940101 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-kcq5s" event={"ID":"a20c3b8a-0e57-4ba7-92f2-bf01e12bfedb","Type":"ContainerStarted","Data":"ce955e8cb0f59454cba5c3c9084f08678d8a52b6811a36b73d6e1e7544005a46"} Dec 06 05:56:50 crc kubenswrapper[4733]: I1206 05:56:50.941294 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-pv7dl" event={"ID":"5b3eaa67-83e3-4c9a-bfeb-c315e4f5ac7c","Type":"ContainerStarted","Data":"248b9e2d56ca66f0eb544950d81336b0f01cbda01dd759ba38a89a0b471b4e23"} Dec 06 05:56:50 crc kubenswrapper[4733]: E1206 05:56:50.942368 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-kcq5s" podUID="a20c3b8a-0e57-4ba7-92f2-bf01e12bfedb" Dec 06 05:56:50 crc kubenswrapper[4733]: I1206 05:56:50.942569 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-wwdnq" event={"ID":"1bd3247c-9536-44e7-8857-c9fe8aa31383","Type":"ContainerStarted","Data":"d4b6fc116b683576bb5f35a5315e3545e93ba2873fe2d08a519e8c9d8ee630f3"} Dec 06 05:56:50 crc kubenswrapper[4733]: E1206 05:56:50.942781 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-pv7dl" podUID="5b3eaa67-83e3-4c9a-bfeb-c315e4f5ac7c" Dec 06 05:56:50 crc kubenswrapper[4733]: I1206 05:56:50.943743 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-rgm8m" event={"ID":"715c2050-78f9-4609-9575-a1c85c3b4961","Type":"ContainerStarted","Data":"8512bb2b8f0d660429ccea80ae0e33df0ed24a87ed193cb0b0a2d8c450fcadef"} Dec 06 05:56:51 crc kubenswrapper[4733]: I1206 05:56:51.048096 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d1ec2a9-eb8b-48b1-a823-129b8cc68129-cert\") pod \"infra-operator-controller-manager-57548d458d-hcfrz\" (UID: \"0d1ec2a9-eb8b-48b1-a823-129b8cc68129\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-hcfrz" Dec 06 05:56:51 crc kubenswrapper[4733]: E1206 
05:56:51.048291 4733 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 06 05:56:51 crc kubenswrapper[4733]: E1206 05:56:51.049805 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d1ec2a9-eb8b-48b1-a823-129b8cc68129-cert podName:0d1ec2a9-eb8b-48b1-a823-129b8cc68129 nodeName:}" failed. No retries permitted until 2025-12-06 05:56:53.049773901 +0000 UTC m=+796.914985012 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0d1ec2a9-eb8b-48b1-a823-129b8cc68129-cert") pod "infra-operator-controller-manager-57548d458d-hcfrz" (UID: "0d1ec2a9-eb8b-48b1-a823-129b8cc68129") : secret "infra-operator-webhook-server-cert" not found Dec 06 05:56:51 crc kubenswrapper[4733]: I1206 05:56:51.149854 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bfb7e815-5af6-428e-bfca-d47d2a7a3022-cert\") pod \"openstack-baremetal-operator-controller-manager-55c85496f58g9vl\" (UID: \"bfb7e815-5af6-428e-bfca-d47d2a7a3022\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f58g9vl" Dec 06 05:56:51 crc kubenswrapper[4733]: E1206 05:56:51.150038 4733 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 06 05:56:51 crc kubenswrapper[4733]: E1206 05:56:51.150119 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfb7e815-5af6-428e-bfca-d47d2a7a3022-cert podName:bfb7e815-5af6-428e-bfca-d47d2a7a3022 nodeName:}" failed. No retries permitted until 2025-12-06 05:56:53.150099928 +0000 UTC m=+797.015311040 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bfb7e815-5af6-428e-bfca-d47d2a7a3022-cert") pod "openstack-baremetal-operator-controller-manager-55c85496f58g9vl" (UID: "bfb7e815-5af6-428e-bfca-d47d2a7a3022") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 06 05:56:51 crc kubenswrapper[4733]: I1206 05:56:51.657979 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4f778c13-06e7-4b71-98b8-28e3165cdf8b-webhook-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-ckqkj\" (UID: \"4f778c13-06e7-4b71-98b8-28e3165cdf8b\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-ckqkj" Dec 06 05:56:51 crc kubenswrapper[4733]: I1206 05:56:51.658053 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4f778c13-06e7-4b71-98b8-28e3165cdf8b-metrics-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-ckqkj\" (UID: \"4f778c13-06e7-4b71-98b8-28e3165cdf8b\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-ckqkj" Dec 06 05:56:51 crc kubenswrapper[4733]: E1206 05:56:51.658214 4733 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 06 05:56:51 crc kubenswrapper[4733]: E1206 05:56:51.658247 4733 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 06 05:56:51 crc kubenswrapper[4733]: E1206 05:56:51.658271 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f778c13-06e7-4b71-98b8-28e3165cdf8b-metrics-certs podName:4f778c13-06e7-4b71-98b8-28e3165cdf8b nodeName:}" failed. No retries permitted until 2025-12-06 05:56:53.658251478 +0000 UTC m=+797.523462589 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4f778c13-06e7-4b71-98b8-28e3165cdf8b-metrics-certs") pod "openstack-operator-controller-manager-54bdf956c4-ckqkj" (UID: "4f778c13-06e7-4b71-98b8-28e3165cdf8b") : secret "metrics-server-cert" not found Dec 06 05:56:51 crc kubenswrapper[4733]: E1206 05:56:51.658384 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f778c13-06e7-4b71-98b8-28e3165cdf8b-webhook-certs podName:4f778c13-06e7-4b71-98b8-28e3165cdf8b nodeName:}" failed. No retries permitted until 2025-12-06 05:56:53.658353049 +0000 UTC m=+797.523564159 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/4f778c13-06e7-4b71-98b8-28e3165cdf8b-webhook-certs") pod "openstack-operator-controller-manager-54bdf956c4-ckqkj" (UID: "4f778c13-06e7-4b71-98b8-28e3165cdf8b") : secret "webhook-server-cert" not found Dec 06 05:56:51 crc kubenswrapper[4733]: E1206 05:56:51.953001 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-pv7dl" podUID="5b3eaa67-83e3-4c9a-bfeb-c315e4f5ac7c" Dec 06 05:56:51 crc kubenswrapper[4733]: E1206 05:56:51.953075 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" 
pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-cqrpd" podUID="4f2d4dbb-c7fb-46b3-8baf-fb1ac61a12ed" Dec 06 05:56:51 crc kubenswrapper[4733]: E1206 05:56:51.953093 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-9p759" podUID="62d11d7c-5132-4e22-9780-2ff475c07618" Dec 06 05:56:51 crc kubenswrapper[4733]: E1206 05:56:51.954407 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-nbzjs" podUID="e744adbb-1e4c-4461-8892-799f8a42976f" Dec 06 05:56:51 crc kubenswrapper[4733]: E1206 05:56:51.954495 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-kcq5s" 
podUID="a20c3b8a-0e57-4ba7-92f2-bf01e12bfedb" Dec 06 05:56:51 crc kubenswrapper[4733]: E1206 05:56:51.954652 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2m7gg" podUID="eef61090-130b-4d9d-99e8-6cc4bff0b467" Dec 06 05:56:51 crc kubenswrapper[4733]: E1206 05:56:51.954862 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/placement-operator-controller-manager-78f8948974-h9stb" podUID="d46b59ae-938e-49f6-a9aa-2f78495634c3" Dec 06 05:56:53 crc kubenswrapper[4733]: I1206 05:56:53.078750 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d1ec2a9-eb8b-48b1-a823-129b8cc68129-cert\") pod \"infra-operator-controller-manager-57548d458d-hcfrz\" (UID: \"0d1ec2a9-eb8b-48b1-a823-129b8cc68129\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-hcfrz" Dec 06 05:56:53 crc kubenswrapper[4733]: E1206 05:56:53.078942 4733 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 06 05:56:53 crc 
kubenswrapper[4733]: E1206 05:56:53.079015 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d1ec2a9-eb8b-48b1-a823-129b8cc68129-cert podName:0d1ec2a9-eb8b-48b1-a823-129b8cc68129 nodeName:}" failed. No retries permitted until 2025-12-06 05:56:57.078996607 +0000 UTC m=+800.944207718 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0d1ec2a9-eb8b-48b1-a823-129b8cc68129-cert") pod "infra-operator-controller-manager-57548d458d-hcfrz" (UID: "0d1ec2a9-eb8b-48b1-a823-129b8cc68129") : secret "infra-operator-webhook-server-cert" not found Dec 06 05:56:53 crc kubenswrapper[4733]: I1206 05:56:53.182635 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bfb7e815-5af6-428e-bfca-d47d2a7a3022-cert\") pod \"openstack-baremetal-operator-controller-manager-55c85496f58g9vl\" (UID: \"bfb7e815-5af6-428e-bfca-d47d2a7a3022\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f58g9vl" Dec 06 05:56:53 crc kubenswrapper[4733]: E1206 05:56:53.182770 4733 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 06 05:56:53 crc kubenswrapper[4733]: E1206 05:56:53.182833 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfb7e815-5af6-428e-bfca-d47d2a7a3022-cert podName:bfb7e815-5af6-428e-bfca-d47d2a7a3022 nodeName:}" failed. No retries permitted until 2025-12-06 05:56:57.182813188 +0000 UTC m=+801.048024299 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bfb7e815-5af6-428e-bfca-d47d2a7a3022-cert") pod "openstack-baremetal-operator-controller-manager-55c85496f58g9vl" (UID: "bfb7e815-5af6-428e-bfca-d47d2a7a3022") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 06 05:56:53 crc kubenswrapper[4733]: I1206 05:56:53.690871 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4f778c13-06e7-4b71-98b8-28e3165cdf8b-webhook-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-ckqkj\" (UID: \"4f778c13-06e7-4b71-98b8-28e3165cdf8b\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-ckqkj" Dec 06 05:56:53 crc kubenswrapper[4733]: I1206 05:56:53.691141 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4f778c13-06e7-4b71-98b8-28e3165cdf8b-metrics-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-ckqkj\" (UID: \"4f778c13-06e7-4b71-98b8-28e3165cdf8b\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-ckqkj" Dec 06 05:56:53 crc kubenswrapper[4733]: E1206 05:56:53.691078 4733 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 06 05:56:53 crc kubenswrapper[4733]: E1206 05:56:53.691255 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f778c13-06e7-4b71-98b8-28e3165cdf8b-webhook-certs podName:4f778c13-06e7-4b71-98b8-28e3165cdf8b nodeName:}" failed. No retries permitted until 2025-12-06 05:56:57.691236449 +0000 UTC m=+801.556447559 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/4f778c13-06e7-4b71-98b8-28e3165cdf8b-webhook-certs") pod "openstack-operator-controller-manager-54bdf956c4-ckqkj" (UID: "4f778c13-06e7-4b71-98b8-28e3165cdf8b") : secret "webhook-server-cert" not found Dec 06 05:56:53 crc kubenswrapper[4733]: E1206 05:56:53.691370 4733 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 06 05:56:53 crc kubenswrapper[4733]: E1206 05:56:53.691437 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f778c13-06e7-4b71-98b8-28e3165cdf8b-metrics-certs podName:4f778c13-06e7-4b71-98b8-28e3165cdf8b nodeName:}" failed. No retries permitted until 2025-12-06 05:56:57.69142362 +0000 UTC m=+801.556634731 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4f778c13-06e7-4b71-98b8-28e3165cdf8b-metrics-certs") pod "openstack-operator-controller-manager-54bdf956c4-ckqkj" (UID: "4f778c13-06e7-4b71-98b8-28e3165cdf8b") : secret "metrics-server-cert" not found Dec 06 05:56:57 crc kubenswrapper[4733]: I1206 05:56:57.147423 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d1ec2a9-eb8b-48b1-a823-129b8cc68129-cert\") pod \"infra-operator-controller-manager-57548d458d-hcfrz\" (UID: \"0d1ec2a9-eb8b-48b1-a823-129b8cc68129\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-hcfrz" Dec 06 05:56:57 crc kubenswrapper[4733]: E1206 05:56:57.147963 4733 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 06 05:56:57 crc kubenswrapper[4733]: E1206 05:56:57.148018 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d1ec2a9-eb8b-48b1-a823-129b8cc68129-cert 
podName:0d1ec2a9-eb8b-48b1-a823-129b8cc68129 nodeName:}" failed. No retries permitted until 2025-12-06 05:57:05.148000993 +0000 UTC m=+809.013212104 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0d1ec2a9-eb8b-48b1-a823-129b8cc68129-cert") pod "infra-operator-controller-manager-57548d458d-hcfrz" (UID: "0d1ec2a9-eb8b-48b1-a823-129b8cc68129") : secret "infra-operator-webhook-server-cert" not found Dec 06 05:56:57 crc kubenswrapper[4733]: I1206 05:56:57.249494 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bfb7e815-5af6-428e-bfca-d47d2a7a3022-cert\") pod \"openstack-baremetal-operator-controller-manager-55c85496f58g9vl\" (UID: \"bfb7e815-5af6-428e-bfca-d47d2a7a3022\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f58g9vl" Dec 06 05:56:57 crc kubenswrapper[4733]: E1206 05:56:57.249769 4733 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 06 05:56:57 crc kubenswrapper[4733]: E1206 05:56:57.249848 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfb7e815-5af6-428e-bfca-d47d2a7a3022-cert podName:bfb7e815-5af6-428e-bfca-d47d2a7a3022 nodeName:}" failed. No retries permitted until 2025-12-06 05:57:05.249822042 +0000 UTC m=+809.115033152 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bfb7e815-5af6-428e-bfca-d47d2a7a3022-cert") pod "openstack-baremetal-operator-controller-manager-55c85496f58g9vl" (UID: "bfb7e815-5af6-428e-bfca-d47d2a7a3022") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 06 05:56:57 crc kubenswrapper[4733]: I1206 05:56:57.756740 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4f778c13-06e7-4b71-98b8-28e3165cdf8b-webhook-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-ckqkj\" (UID: \"4f778c13-06e7-4b71-98b8-28e3165cdf8b\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-ckqkj" Dec 06 05:56:57 crc kubenswrapper[4733]: I1206 05:56:57.756827 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4f778c13-06e7-4b71-98b8-28e3165cdf8b-metrics-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-ckqkj\" (UID: \"4f778c13-06e7-4b71-98b8-28e3165cdf8b\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-ckqkj" Dec 06 05:56:57 crc kubenswrapper[4733]: E1206 05:56:57.756952 4733 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 06 05:56:57 crc kubenswrapper[4733]: E1206 05:56:57.757035 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f778c13-06e7-4b71-98b8-28e3165cdf8b-webhook-certs podName:4f778c13-06e7-4b71-98b8-28e3165cdf8b nodeName:}" failed. No retries permitted until 2025-12-06 05:57:05.757013936 +0000 UTC m=+809.622225047 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/4f778c13-06e7-4b71-98b8-28e3165cdf8b-webhook-certs") pod "openstack-operator-controller-manager-54bdf956c4-ckqkj" (UID: "4f778c13-06e7-4b71-98b8-28e3165cdf8b") : secret "webhook-server-cert" not found Dec 06 05:56:57 crc kubenswrapper[4733]: E1206 05:56:57.757129 4733 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 06 05:56:57 crc kubenswrapper[4733]: E1206 05:56:57.757274 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f778c13-06e7-4b71-98b8-28e3165cdf8b-metrics-certs podName:4f778c13-06e7-4b71-98b8-28e3165cdf8b nodeName:}" failed. No retries permitted until 2025-12-06 05:57:05.75723944 +0000 UTC m=+809.622450551 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4f778c13-06e7-4b71-98b8-28e3165cdf8b-metrics-certs") pod "openstack-operator-controller-manager-54bdf956c4-ckqkj" (UID: "4f778c13-06e7-4b71-98b8-28e3165cdf8b") : secret "metrics-server-cert" not found Dec 06 05:57:00 crc kubenswrapper[4733]: I1206 05:57:00.015142 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-ht2mw" event={"ID":"f138e9fa-e1ea-4b04-b938-0c16b8205fbe","Type":"ContainerStarted","Data":"2f8648a94dd08d59d20a0c911865ec6c82bef4372048a0e56b4df1d963fde313"} Dec 06 05:57:00 crc kubenswrapper[4733]: I1206 05:57:00.017342 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-26bnr" event={"ID":"bccabc0c-9ad2-47f8-b550-8bff11a103e8","Type":"ContainerStarted","Data":"10ed0ee0c4b910e84899e62ac23d762fd012f911730a31216ffdfd99d1ab6955"} Dec 06 05:57:00 crc kubenswrapper[4733]: E1206 05:57:00.098643 4733 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nx752,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-5f64f6f8bb-b64rz_openstack-operators(018f851e-0c42-4bbd-bea7-7ce45a6e6ebb): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 06 05:57:00 crc kubenswrapper[4733]: E1206 05:57:00.100249 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-b64rz" podUID="018f851e-0c42-4bbd-bea7-7ce45a6e6ebb" Dec 06 05:57:00 crc kubenswrapper[4733]: E1206 05:57:00.111853 4733 
kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gqc5b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-5fdfd5b6b5-j9mvd_openstack-operators(1ed48735-3f0e-4777-b3ce-54a09caec1ab): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 06 05:57:00 crc kubenswrapper[4733]: E1206 05:57:00.113449 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-j9mvd" podUID="1ed48735-3f0e-4777-b3ce-54a09caec1ab" Dec 
06 05:57:00 crc kubenswrapper[4733]: E1206 05:57:00.113754 4733 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-d6f8m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-998648c74-wwdnq_openstack-operators(1bd3247c-9536-44e7-8857-c9fe8aa31383): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 06 05:57:00 crc kubenswrapper[4733]: E1206 05:57:00.114905 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" 
pod="openstack-operators/octavia-operator-controller-manager-998648c74-wwdnq" podUID="1bd3247c-9536-44e7-8857-c9fe8aa31383" Dec 06 05:57:01 crc kubenswrapper[4733]: I1206 05:57:01.043052 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-c9lc9" event={"ID":"45064622-664d-4424-a01c-0cf85f653a67","Type":"ContainerStarted","Data":"536d53bcd53287ccaac8e7ac3d681df743faf140841f3d8858d0642cb7c7f838"} Dec 06 05:57:01 crc kubenswrapper[4733]: I1206 05:57:01.045533 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-mcr2h" event={"ID":"352c73f3-4cd8-4c2b-a5ba-52c5bc1f78ad","Type":"ContainerStarted","Data":"ec773becf736fa9c300c956419a2726a57129b78e579f263352faffddc38e484"} Dec 06 05:57:01 crc kubenswrapper[4733]: I1206 05:57:01.065163 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-7kmwq" event={"ID":"fc37a812-3bfe-4e10-ba93-4e8fdc45361f","Type":"ContainerStarted","Data":"19df0da31cc41fb4cdd4d3302949f400c90452fac0eb7d6b8e13d0a47db60fa0"} Dec 06 05:57:01 crc kubenswrapper[4733]: I1206 05:57:01.082401 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-b64rz" event={"ID":"018f851e-0c42-4bbd-bea7-7ce45a6e6ebb","Type":"ContainerStarted","Data":"8f9b95b708aafdfc24a24a08b2e6bc093ad205b7cb495090f0b487bb066594e6"} Dec 06 05:57:01 crc kubenswrapper[4733]: I1206 05:57:01.083167 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-b64rz" Dec 06 05:57:01 crc kubenswrapper[4733]: E1206 05:57:01.084349 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-b64rz" podUID="018f851e-0c42-4bbd-bea7-7ce45a6e6ebb" Dec 06 05:57:01 crc kubenswrapper[4733]: I1206 05:57:01.096473 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-t7chb" event={"ID":"efc4f270-9152-42b0-bd6c-074697502758","Type":"ContainerStarted","Data":"57d549ade205cebce81e949bf471ec07eed0ef9216e8c9ed9fbe0d5bfcf5d8ec"} Dec 06 05:57:01 crc kubenswrapper[4733]: I1206 05:57:01.099679 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-vrztj" event={"ID":"8bbee3a7-9d6f-40d8-a5c2-eee560458e41","Type":"ContainerStarted","Data":"eb3f74c46ac201c6bb7f6939ea729e1f3f6797604e8510a83d4a84bb39134094"} Dec 06 05:57:01 crc kubenswrapper[4733]: I1206 05:57:01.108938 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-xcx5w" event={"ID":"42977681-d2c6-4ddb-848a-751503543ed4","Type":"ContainerStarted","Data":"a5fc65aeed720613de41e3884c58577d129d9ba6584544528532b43bf2d9ec3f"} Dec 06 05:57:01 crc kubenswrapper[4733]: I1206 05:57:01.116495 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-2lblw" event={"ID":"331a0926-1e6c-4976-b309-b20537eae22a","Type":"ContainerStarted","Data":"a3dea85ba1f1af528546211e568e46781b81a58951d7c014ccd25311acd4898a"} Dec 06 05:57:01 crc kubenswrapper[4733]: I1206 05:57:01.126293 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-wwdnq" event={"ID":"1bd3247c-9536-44e7-8857-c9fe8aa31383","Type":"ContainerStarted","Data":"2c1fe9e6fdf61e83f26e212bd1be2d2afb6ca922b06811f187d4dd242cb0fc9b"} Dec 06 05:57:01 crc kubenswrapper[4733]: I1206 05:57:01.126518 4733 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-998648c74-wwdnq" Dec 06 05:57:01 crc kubenswrapper[4733]: E1206 05:57:01.127690 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-998648c74-wwdnq" podUID="1bd3247c-9536-44e7-8857-c9fe8aa31383" Dec 06 05:57:01 crc kubenswrapper[4733]: I1206 05:57:01.134151 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-rgm8m" event={"ID":"715c2050-78f9-4609-9575-a1c85c3b4961","Type":"ContainerStarted","Data":"142df75d25ef413ff2fed86313b46dfc123138c492e758d5f5880ab1c84f863f"} Dec 06 05:57:01 crc kubenswrapper[4733]: I1206 05:57:01.137217 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-j9mvd" event={"ID":"1ed48735-3f0e-4777-b3ce-54a09caec1ab","Type":"ContainerStarted","Data":"9f3363b6b5968e4b3e8e2af4e149ddf9121784789db5dbb6701ebb6bb1c0efe4"} Dec 06 05:57:01 crc kubenswrapper[4733]: I1206 05:57:01.137987 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-j9mvd" Dec 06 05:57:01 crc kubenswrapper[4733]: E1206 05:57:01.138470 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-j9mvd" podUID="1ed48735-3f0e-4777-b3ce-54a09caec1ab" Dec 06 05:57:02 crc kubenswrapper[4733]: E1206 05:57:02.144959 4733 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-b64rz" podUID="018f851e-0c42-4bbd-bea7-7ce45a6e6ebb" Dec 06 05:57:02 crc kubenswrapper[4733]: E1206 05:57:02.145346 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-998648c74-wwdnq" podUID="1bd3247c-9536-44e7-8857-c9fe8aa31383" Dec 06 05:57:02 crc kubenswrapper[4733]: E1206 05:57:02.147063 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-j9mvd" podUID="1ed48735-3f0e-4777-b3ce-54a09caec1ab" Dec 06 05:57:03 crc kubenswrapper[4733]: I1206 05:57:03.165870 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-rgm8m" event={"ID":"715c2050-78f9-4609-9575-a1c85c3b4961","Type":"ContainerStarted","Data":"05e6fb2a2532881bdee7c07b77cb4eb13a03a27c500063755abf021af9442269"} Dec 06 05:57:03 crc kubenswrapper[4733]: I1206 05:57:03.167801 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-rgm8m" Dec 06 05:57:03 crc kubenswrapper[4733]: I1206 05:57:03.168871 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-ht2mw" 
event={"ID":"f138e9fa-e1ea-4b04-b938-0c16b8205fbe","Type":"ContainerStarted","Data":"6965ac8100dd6a65b6139a1453a456af20e9b91ba6b3f92daf88f73bfd790f86"} Dec 06 05:57:03 crc kubenswrapper[4733]: I1206 05:57:03.169291 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-ht2mw" Dec 06 05:57:03 crc kubenswrapper[4733]: I1206 05:57:03.185917 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-t7chb" event={"ID":"efc4f270-9152-42b0-bd6c-074697502758","Type":"ContainerStarted","Data":"f92efcf448ff9444a93732e401a85bb46fc0e1d4c8d50cf2c493b2057df6aba6"} Dec 06 05:57:03 crc kubenswrapper[4733]: I1206 05:57:03.186912 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-t7chb" Dec 06 05:57:03 crc kubenswrapper[4733]: I1206 05:57:03.187542 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-rgm8m" podStartSLOduration=1.864180995 podStartE2EDuration="14.18752559s" podCreationTimestamp="2025-12-06 05:56:49 +0000 UTC" firstStartedPulling="2025-12-06 05:56:50.348228335 +0000 UTC m=+794.213439446" lastFinishedPulling="2025-12-06 05:57:02.671572931 +0000 UTC m=+806.536784041" observedRunningTime="2025-12-06 05:57:03.185715446 +0000 UTC m=+807.050926557" watchObservedRunningTime="2025-12-06 05:57:03.18752559 +0000 UTC m=+807.052736701" Dec 06 05:57:03 crc kubenswrapper[4733]: I1206 05:57:03.191989 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-vrztj" event={"ID":"8bbee3a7-9d6f-40d8-a5c2-eee560458e41","Type":"ContainerStarted","Data":"c3fa44593caab2c9016a39b35dca7b12529e567fc8844d4433df72f52ad74241"} Dec 06 05:57:03 crc kubenswrapper[4733]: I1206 05:57:03.192215 4733 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-vrztj" Dec 06 05:57:03 crc kubenswrapper[4733]: I1206 05:57:03.193957 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-xcx5w" event={"ID":"42977681-d2c6-4ddb-848a-751503543ed4","Type":"ContainerStarted","Data":"0b70753880865683dc39558a8690592c05514469800b8abc578cb96744c27cca"} Dec 06 05:57:03 crc kubenswrapper[4733]: I1206 05:57:03.195063 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-c9lc9" event={"ID":"45064622-664d-4424-a01c-0cf85f653a67","Type":"ContainerStarted","Data":"0ddcff15cae6495763913d9804dec1ace6ab046c24759a094d414e6381b0cdc7"} Dec 06 05:57:03 crc kubenswrapper[4733]: I1206 05:57:03.195224 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-c9lc9" Dec 06 05:57:03 crc kubenswrapper[4733]: I1206 05:57:03.196460 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-mcr2h" event={"ID":"352c73f3-4cd8-4c2b-a5ba-52c5bc1f78ad","Type":"ContainerStarted","Data":"d78b0963c3449360e496e6af30e7dfda29c15b576141f7a96b65e9c97b1cf1b4"} Dec 06 05:57:03 crc kubenswrapper[4733]: I1206 05:57:03.196584 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-mcr2h" Dec 06 05:57:03 crc kubenswrapper[4733]: I1206 05:57:03.198953 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-7kmwq" event={"ID":"fc37a812-3bfe-4e10-ba93-4e8fdc45361f","Type":"ContainerStarted","Data":"9dc2ed100c6f611edf48508b50b060874df1038dc82e0f9e4783fcb1b3d3d9c2"} Dec 06 05:57:03 crc kubenswrapper[4733]: I1206 
05:57:03.199019 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-7kmwq" Dec 06 05:57:03 crc kubenswrapper[4733]: I1206 05:57:03.202825 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-26bnr" event={"ID":"bccabc0c-9ad2-47f8-b550-8bff11a103e8","Type":"ContainerStarted","Data":"72a3054243033aed57f04eeee1930951678215f910b357610237ad94bb1f7f58"} Dec 06 05:57:03 crc kubenswrapper[4733]: I1206 05:57:03.203277 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-26bnr" Dec 06 05:57:03 crc kubenswrapper[4733]: I1206 05:57:03.203749 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-ht2mw" podStartSLOduration=1.523446794 podStartE2EDuration="14.20373956s" podCreationTimestamp="2025-12-06 05:56:49 +0000 UTC" firstStartedPulling="2025-12-06 05:56:50.067826576 +0000 UTC m=+793.933037687" lastFinishedPulling="2025-12-06 05:57:02.748119342 +0000 UTC m=+806.613330453" observedRunningTime="2025-12-06 05:57:03.202522872 +0000 UTC m=+807.067733982" watchObservedRunningTime="2025-12-06 05:57:03.20373956 +0000 UTC m=+807.068950670" Dec 06 05:57:03 crc kubenswrapper[4733]: I1206 05:57:03.226044 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-t7chb" podStartSLOduration=1.698429679 podStartE2EDuration="14.22603252s" podCreationTimestamp="2025-12-06 05:56:49 +0000 UTC" firstStartedPulling="2025-12-06 05:56:50.348085747 +0000 UTC m=+794.213296858" lastFinishedPulling="2025-12-06 05:57:02.875688588 +0000 UTC m=+806.740899699" observedRunningTime="2025-12-06 05:57:03.223908647 +0000 UTC m=+807.089119757" watchObservedRunningTime="2025-12-06 
05:57:03.22603252 +0000 UTC m=+807.091243630" Dec 06 05:57:03 crc kubenswrapper[4733]: I1206 05:57:03.254803 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-c9lc9" podStartSLOduration=2.015922806 podStartE2EDuration="14.25478659s" podCreationTimestamp="2025-12-06 05:56:49 +0000 UTC" firstStartedPulling="2025-12-06 05:56:50.49310676 +0000 UTC m=+794.358317872" lastFinishedPulling="2025-12-06 05:57:02.731970515 +0000 UTC m=+806.597181656" observedRunningTime="2025-12-06 05:57:03.24540025 +0000 UTC m=+807.110611361" watchObservedRunningTime="2025-12-06 05:57:03.25478659 +0000 UTC m=+807.119997701" Dec 06 05:57:03 crc kubenswrapper[4733]: I1206 05:57:03.266998 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-7kmwq" podStartSLOduration=1.9252097460000002 podStartE2EDuration="14.266965894s" podCreationTimestamp="2025-12-06 05:56:49 +0000 UTC" firstStartedPulling="2025-12-06 05:56:50.575232263 +0000 UTC m=+794.440443373" lastFinishedPulling="2025-12-06 05:57:02.91698841 +0000 UTC m=+806.782199521" observedRunningTime="2025-12-06 05:57:03.266144037 +0000 UTC m=+807.131355158" watchObservedRunningTime="2025-12-06 05:57:03.266965894 +0000 UTC m=+807.132177004" Dec 06 05:57:03 crc kubenswrapper[4733]: I1206 05:57:03.284279 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-vrztj" podStartSLOduration=1.8192116619999998 podStartE2EDuration="14.284261476s" podCreationTimestamp="2025-12-06 05:56:49 +0000 UTC" firstStartedPulling="2025-12-06 05:56:50.182655094 +0000 UTC m=+794.047866205" lastFinishedPulling="2025-12-06 05:57:02.647704908 +0000 UTC m=+806.512916019" observedRunningTime="2025-12-06 05:57:03.280026623 +0000 UTC m=+807.145237734" watchObservedRunningTime="2025-12-06 05:57:03.284261476 
+0000 UTC m=+807.149472587" Dec 06 05:57:03 crc kubenswrapper[4733]: I1206 05:57:03.293082 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-mcr2h" podStartSLOduration=2.09712912 podStartE2EDuration="14.293074618s" podCreationTimestamp="2025-12-06 05:56:49 +0000 UTC" firstStartedPulling="2025-12-06 05:56:50.576765736 +0000 UTC m=+794.441976838" lastFinishedPulling="2025-12-06 05:57:02.772711225 +0000 UTC m=+806.637922336" observedRunningTime="2025-12-06 05:57:03.291651684 +0000 UTC m=+807.156862795" watchObservedRunningTime="2025-12-06 05:57:03.293074618 +0000 UTC m=+807.158285729" Dec 06 05:57:03 crc kubenswrapper[4733]: I1206 05:57:03.305455 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-xcx5w" podStartSLOduration=2.019107305 podStartE2EDuration="14.305447095s" podCreationTimestamp="2025-12-06 05:56:49 +0000 UTC" firstStartedPulling="2025-12-06 05:56:50.358725775 +0000 UTC m=+794.223936885" lastFinishedPulling="2025-12-06 05:57:02.645065565 +0000 UTC m=+806.510276675" observedRunningTime="2025-12-06 05:57:03.30434428 +0000 UTC m=+807.169555392" watchObservedRunningTime="2025-12-06 05:57:03.305447095 +0000 UTC m=+807.170658206" Dec 06 05:57:03 crc kubenswrapper[4733]: I1206 05:57:03.322826 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-26bnr" podStartSLOduration=2.240100959 podStartE2EDuration="14.322818049s" podCreationTimestamp="2025-12-06 05:56:49 +0000 UTC" firstStartedPulling="2025-12-06 05:56:50.577232173 +0000 UTC m=+794.442443284" lastFinishedPulling="2025-12-06 05:57:02.659949263 +0000 UTC m=+806.525160374" observedRunningTime="2025-12-06 05:57:03.318417295 +0000 UTC m=+807.183628406" watchObservedRunningTime="2025-12-06 05:57:03.322818049 +0000 UTC m=+807.188029160" Dec 06 
05:57:04 crc kubenswrapper[4733]: I1206 05:57:04.216183 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-2lblw" event={"ID":"331a0926-1e6c-4976-b309-b20537eae22a","Type":"ContainerStarted","Data":"7ee44c5c90e27d15d44fcd73bdacb0078cc40ef5108ad6e574880b96897845be"} Dec 06 05:57:04 crc kubenswrapper[4733]: I1206 05:57:04.218503 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-xcx5w" Dec 06 05:57:04 crc kubenswrapper[4733]: I1206 05:57:04.219266 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-ht2mw" Dec 06 05:57:04 crc kubenswrapper[4733]: I1206 05:57:04.219712 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-26bnr" Dec 06 05:57:04 crc kubenswrapper[4733]: I1206 05:57:04.242063 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-2lblw" podStartSLOduration=2.71175052 podStartE2EDuration="15.242046055s" podCreationTimestamp="2025-12-06 05:56:49 +0000 UTC" firstStartedPulling="2025-12-06 05:56:50.574829695 +0000 UTC m=+794.440040807" lastFinishedPulling="2025-12-06 05:57:03.105125231 +0000 UTC m=+806.970336342" observedRunningTime="2025-12-06 05:57:04.234090315 +0000 UTC m=+808.099301446" watchObservedRunningTime="2025-12-06 05:57:04.242046055 +0000 UTC m=+808.107257166" Dec 06 05:57:05 crc kubenswrapper[4733]: I1206 05:57:05.203727 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d1ec2a9-eb8b-48b1-a823-129b8cc68129-cert\") pod \"infra-operator-controller-manager-57548d458d-hcfrz\" (UID: \"0d1ec2a9-eb8b-48b1-a823-129b8cc68129\") " 
pod="openstack-operators/infra-operator-controller-manager-57548d458d-hcfrz" Dec 06 05:57:05 crc kubenswrapper[4733]: E1206 05:57:05.203907 4733 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 06 05:57:05 crc kubenswrapper[4733]: E1206 05:57:05.204234 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d1ec2a9-eb8b-48b1-a823-129b8cc68129-cert podName:0d1ec2a9-eb8b-48b1-a823-129b8cc68129 nodeName:}" failed. No retries permitted until 2025-12-06 05:57:21.20421065 +0000 UTC m=+825.069421760 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0d1ec2a9-eb8b-48b1-a823-129b8cc68129-cert") pod "infra-operator-controller-manager-57548d458d-hcfrz" (UID: "0d1ec2a9-eb8b-48b1-a823-129b8cc68129") : secret "infra-operator-webhook-server-cert" not found Dec 06 05:57:05 crc kubenswrapper[4733]: I1206 05:57:05.222882 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-2lblw" Dec 06 05:57:05 crc kubenswrapper[4733]: I1206 05:57:05.224445 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-xcx5w" Dec 06 05:57:05 crc kubenswrapper[4733]: I1206 05:57:05.224830 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-2lblw" Dec 06 05:57:05 crc kubenswrapper[4733]: I1206 05:57:05.306028 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bfb7e815-5af6-428e-bfca-d47d2a7a3022-cert\") pod \"openstack-baremetal-operator-controller-manager-55c85496f58g9vl\" (UID: \"bfb7e815-5af6-428e-bfca-d47d2a7a3022\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f58g9vl" Dec 06 05:57:05 crc kubenswrapper[4733]: I1206 05:57:05.312987 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bfb7e815-5af6-428e-bfca-d47d2a7a3022-cert\") pod \"openstack-baremetal-operator-controller-manager-55c85496f58g9vl\" (UID: \"bfb7e815-5af6-428e-bfca-d47d2a7a3022\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f58g9vl" Dec 06 05:57:05 crc kubenswrapper[4733]: I1206 05:57:05.323522 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f58g9vl" Dec 06 05:57:05 crc kubenswrapper[4733]: I1206 05:57:05.816752 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4f778c13-06e7-4b71-98b8-28e3165cdf8b-webhook-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-ckqkj\" (UID: \"4f778c13-06e7-4b71-98b8-28e3165cdf8b\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-ckqkj" Dec 06 05:57:05 crc kubenswrapper[4733]: I1206 05:57:05.816970 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4f778c13-06e7-4b71-98b8-28e3165cdf8b-metrics-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-ckqkj\" (UID: \"4f778c13-06e7-4b71-98b8-28e3165cdf8b\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-ckqkj" Dec 06 05:57:05 crc kubenswrapper[4733]: E1206 05:57:05.817213 4733 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 06 05:57:05 crc kubenswrapper[4733]: E1206 05:57:05.817316 4733 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/4f778c13-06e7-4b71-98b8-28e3165cdf8b-metrics-certs podName:4f778c13-06e7-4b71-98b8-28e3165cdf8b nodeName:}" failed. No retries permitted until 2025-12-06 05:57:21.817278329 +0000 UTC m=+825.682489439 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4f778c13-06e7-4b71-98b8-28e3165cdf8b-metrics-certs") pod "openstack-operator-controller-manager-54bdf956c4-ckqkj" (UID: "4f778c13-06e7-4b71-98b8-28e3165cdf8b") : secret "metrics-server-cert" not found Dec 06 05:57:05 crc kubenswrapper[4733]: I1206 05:57:05.830074 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4f778c13-06e7-4b71-98b8-28e3165cdf8b-webhook-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-ckqkj\" (UID: \"4f778c13-06e7-4b71-98b8-28e3165cdf8b\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-ckqkj" Dec 06 05:57:06 crc kubenswrapper[4733]: I1206 05:57:06.065260 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f58g9vl"] Dec 06 05:57:06 crc kubenswrapper[4733]: W1206 05:57:06.069860 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfb7e815_5af6_428e_bfca_d47d2a7a3022.slice/crio-1f7fe2ceb2cd56d9f028823baf3f84d55bcd1d01c10c66029f0103465ccaf082 WatchSource:0}: Error finding container 1f7fe2ceb2cd56d9f028823baf3f84d55bcd1d01c10c66029f0103465ccaf082: Status 404 returned error can't find the container with id 1f7fe2ceb2cd56d9f028823baf3f84d55bcd1d01c10c66029f0103465ccaf082 Dec 06 05:57:06 crc kubenswrapper[4733]: I1206 05:57:06.232220 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2m7gg" 
event={"ID":"eef61090-130b-4d9d-99e8-6cc4bff0b467","Type":"ContainerStarted","Data":"d14f1e2c2203904fcb618d19d5c4779ecc264a40397953cf1b7ac9ca60859cf6"} Dec 06 05:57:06 crc kubenswrapper[4733]: I1206 05:57:06.232278 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2m7gg" event={"ID":"eef61090-130b-4d9d-99e8-6cc4bff0b467","Type":"ContainerStarted","Data":"d4cdb299b22a995d7fd4c4d379f8ecb58391114d283360acdbed423486273217"} Dec 06 05:57:06 crc kubenswrapper[4733]: I1206 05:57:06.232471 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2m7gg" Dec 06 05:57:06 crc kubenswrapper[4733]: I1206 05:57:06.234630 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f58g9vl" event={"ID":"bfb7e815-5af6-428e-bfca-d47d2a7a3022","Type":"ContainerStarted","Data":"1f7fe2ceb2cd56d9f028823baf3f84d55bcd1d01c10c66029f0103465ccaf082"} Dec 06 05:57:06 crc kubenswrapper[4733]: I1206 05:57:06.247707 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2m7gg" podStartSLOduration=2.240758145 podStartE2EDuration="17.247690282s" podCreationTimestamp="2025-12-06 05:56:49 +0000 UTC" firstStartedPulling="2025-12-06 05:56:50.699365268 +0000 UTC m=+794.564576379" lastFinishedPulling="2025-12-06 05:57:05.706297405 +0000 UTC m=+809.571508516" observedRunningTime="2025-12-06 05:57:06.246645398 +0000 UTC m=+810.111856509" watchObservedRunningTime="2025-12-06 05:57:06.247690282 +0000 UTC m=+810.112901393" Dec 06 05:57:09 crc kubenswrapper[4733]: I1206 05:57:09.485585 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-vrztj" Dec 06 05:57:09 crc kubenswrapper[4733]: I1206 
05:57:09.504488 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-rgm8m" Dec 06 05:57:09 crc kubenswrapper[4733]: I1206 05:57:09.574161 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-b64rz" Dec 06 05:57:09 crc kubenswrapper[4733]: I1206 05:57:09.631551 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-t7chb" Dec 06 05:57:09 crc kubenswrapper[4733]: I1206 05:57:09.636762 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-c9lc9" Dec 06 05:57:09 crc kubenswrapper[4733]: I1206 05:57:09.683367 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-7kmwq" Dec 06 05:57:09 crc kubenswrapper[4733]: I1206 05:57:09.704799 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-998648c74-wwdnq" Dec 06 05:57:09 crc kubenswrapper[4733]: I1206 05:57:09.720664 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-mcr2h" Dec 06 05:57:09 crc kubenswrapper[4733]: I1206 05:57:09.985646 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-j9mvd" Dec 06 05:57:12 crc kubenswrapper[4733]: I1206 05:57:12.989091 4733 patch_prober.go:28] interesting pod/machine-config-daemon-g7qjx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Dec 06 05:57:12 crc kubenswrapper[4733]: I1206 05:57:12.989138 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 05:57:13 crc kubenswrapper[4733]: I1206 05:57:13.303191 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-h9stb" event={"ID":"d46b59ae-938e-49f6-a9aa-2f78495634c3","Type":"ContainerStarted","Data":"a2104a504d0657269f9dc2d2102f69454be25ba81c23cb06e8a1df3b25027f8d"} Dec 06 05:57:13 crc kubenswrapper[4733]: I1206 05:57:13.303239 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-h9stb" event={"ID":"d46b59ae-938e-49f6-a9aa-2f78495634c3","Type":"ContainerStarted","Data":"2fe2d73661c4a81da9bed1db0503657eadffff292f3d49d0624461ba8bff4b1c"} Dec 06 05:57:13 crc kubenswrapper[4733]: I1206 05:57:13.304045 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-78f8948974-h9stb" Dec 06 05:57:13 crc kubenswrapper[4733]: I1206 05:57:13.305758 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-pv7dl" event={"ID":"5b3eaa67-83e3-4c9a-bfeb-c315e4f5ac7c","Type":"ContainerStarted","Data":"04d8864e29db4f97481a3c03c91fda055635906c4b470fdf34511140ed255b95"} Dec 06 05:57:13 crc kubenswrapper[4733]: I1206 05:57:13.305782 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-pv7dl" 
event={"ID":"5b3eaa67-83e3-4c9a-bfeb-c315e4f5ac7c","Type":"ContainerStarted","Data":"594bf58f4968ac96a50b17405cf94333ed72f40a49d0ac137c645526aed94a55"} Dec 06 05:57:13 crc kubenswrapper[4733]: I1206 05:57:13.306103 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-pv7dl" Dec 06 05:57:13 crc kubenswrapper[4733]: I1206 05:57:13.308067 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-9p759" event={"ID":"62d11d7c-5132-4e22-9780-2ff475c07618","Type":"ContainerStarted","Data":"46af0d5a608f2f2f590fd764c92e254c9d53dd5ed24faa97e32ebcba219759ae"} Dec 06 05:57:13 crc kubenswrapper[4733]: I1206 05:57:13.308091 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-9p759" event={"ID":"62d11d7c-5132-4e22-9780-2ff475c07618","Type":"ContainerStarted","Data":"95928fe28423345e266680fc33234024ee58c4337143c8cba6eee85b04360293"} Dec 06 05:57:13 crc kubenswrapper[4733]: I1206 05:57:13.308257 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-9p759" Dec 06 05:57:13 crc kubenswrapper[4733]: I1206 05:57:13.309669 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-nbzjs" event={"ID":"e744adbb-1e4c-4461-8892-799f8a42976f","Type":"ContainerStarted","Data":"8a62a4f12c08f169b8d096effc5339c94af24d50d889526f1b8d290671786311"} Dec 06 05:57:13 crc kubenswrapper[4733]: I1206 05:57:13.309724 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-nbzjs" event={"ID":"e744adbb-1e4c-4461-8892-799f8a42976f","Type":"ContainerStarted","Data":"eccd70dcb45f65e6887ddde78100b2f27ee35340589230e4d5648d5b063de35e"} Dec 06 05:57:13 crc kubenswrapper[4733]: 
I1206 05:57:13.309959 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5854674fcc-nbzjs" Dec 06 05:57:13 crc kubenswrapper[4733]: I1206 05:57:13.311251 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-j9mvd" event={"ID":"1ed48735-3f0e-4777-b3ce-54a09caec1ab","Type":"ContainerStarted","Data":"742a9af11c049c490b0e0a56f17e6cee9b4d7510d296fc060c422256e2b3d2b9"} Dec 06 05:57:13 crc kubenswrapper[4733]: I1206 05:57:13.312554 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-cqrpd" event={"ID":"4f2d4dbb-c7fb-46b3-8baf-fb1ac61a12ed","Type":"ContainerStarted","Data":"6620f71f3195b32d0e5b83f25e96591ff3ddaff14d3115d3b783bd037f7e746c"} Dec 06 05:57:13 crc kubenswrapper[4733]: I1206 05:57:13.314410 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f58g9vl" event={"ID":"bfb7e815-5af6-428e-bfca-d47d2a7a3022","Type":"ContainerStarted","Data":"0daaabeffff6f9ae25b0acc65d6af21a4646f6ba40134a0496b085a56ba98d7d"} Dec 06 05:57:13 crc kubenswrapper[4733]: I1206 05:57:13.314439 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f58g9vl" event={"ID":"bfb7e815-5af6-428e-bfca-d47d2a7a3022","Type":"ContainerStarted","Data":"f1ebd5175d1b65f70c78c86a16a343941e25eba8e88bac8e01506f3b22060359"} Dec 06 05:57:13 crc kubenswrapper[4733]: I1206 05:57:13.314544 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f58g9vl" Dec 06 05:57:13 crc kubenswrapper[4733]: I1206 05:57:13.315931 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-kcq5s" 
event={"ID":"a20c3b8a-0e57-4ba7-92f2-bf01e12bfedb","Type":"ContainerStarted","Data":"fc95d82686a4016941e685e470727d9ab0c5b9af5ee7400413dcdedcace4c6ce"} Dec 06 05:57:13 crc kubenswrapper[4733]: I1206 05:57:13.315957 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-kcq5s" event={"ID":"a20c3b8a-0e57-4ba7-92f2-bf01e12bfedb","Type":"ContainerStarted","Data":"21806ceb84a0b562a9854016d258fead402bea8b642434b4ff24e1f6ddd4b91d"} Dec 06 05:57:13 crc kubenswrapper[4733]: I1206 05:57:13.316296 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-kcq5s" Dec 06 05:57:13 crc kubenswrapper[4733]: I1206 05:57:13.317687 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-wwdnq" event={"ID":"1bd3247c-9536-44e7-8857-c9fe8aa31383","Type":"ContainerStarted","Data":"e2694af9caa131d62019d5e621e0e1f07a513d709a0a30c0fb5b03e2169f4716"} Dec 06 05:57:13 crc kubenswrapper[4733]: I1206 05:57:13.319374 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-b64rz" event={"ID":"018f851e-0c42-4bbd-bea7-7ce45a6e6ebb","Type":"ContainerStarted","Data":"5412d3854bd761440afbd9542ec4529812e479deab3f1ad97a5064b995b47a82"} Dec 06 05:57:13 crc kubenswrapper[4733]: I1206 05:57:13.330795 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-78f8948974-h9stb" podStartSLOduration=2.345443361 podStartE2EDuration="24.330784972s" podCreationTimestamp="2025-12-06 05:56:49 +0000 UTC" firstStartedPulling="2025-12-06 05:56:50.584895765 +0000 UTC m=+794.450106876" lastFinishedPulling="2025-12-06 05:57:12.570237386 +0000 UTC m=+816.435448487" observedRunningTime="2025-12-06 05:57:13.325238413 +0000 UTC m=+817.190449523" 
watchObservedRunningTime="2025-12-06 05:57:13.330784972 +0000 UTC m=+817.195996083" Dec 06 05:57:13 crc kubenswrapper[4733]: I1206 05:57:13.350178 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-cqrpd" podStartSLOduration=4.016674363 podStartE2EDuration="24.350169002s" podCreationTimestamp="2025-12-06 05:56:49 +0000 UTC" firstStartedPulling="2025-12-06 05:56:50.785209783 +0000 UTC m=+794.650420895" lastFinishedPulling="2025-12-06 05:57:11.118704423 +0000 UTC m=+814.983915534" observedRunningTime="2025-12-06 05:57:13.348852837 +0000 UTC m=+817.214063948" watchObservedRunningTime="2025-12-06 05:57:13.350169002 +0000 UTC m=+817.215380114" Dec 06 05:57:13 crc kubenswrapper[4733]: I1206 05:57:13.374636 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-9p759" podStartSLOduration=4.520276607 podStartE2EDuration="24.37462457s" podCreationTimestamp="2025-12-06 05:56:49 +0000 UTC" firstStartedPulling="2025-12-06 05:56:50.785009698 +0000 UTC m=+794.650220808" lastFinishedPulling="2025-12-06 05:57:10.63935766 +0000 UTC m=+814.504568771" observedRunningTime="2025-12-06 05:57:13.369095963 +0000 UTC m=+817.234307074" watchObservedRunningTime="2025-12-06 05:57:13.37462457 +0000 UTC m=+817.239835681" Dec 06 05:57:13 crc kubenswrapper[4733]: I1206 05:57:13.383630 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-b64rz" podStartSLOduration=15.008445454 podStartE2EDuration="24.38361724s" podCreationTimestamp="2025-12-06 05:56:49 +0000 UTC" firstStartedPulling="2025-12-06 05:56:50.353714602 +0000 UTC m=+794.218925713" lastFinishedPulling="2025-12-06 05:56:59.728886388 +0000 UTC m=+803.594097499" observedRunningTime="2025-12-06 05:57:13.382784564 +0000 UTC m=+817.247995675" watchObservedRunningTime="2025-12-06 
05:57:13.38361724 +0000 UTC m=+817.248828351" Dec 06 05:57:13 crc kubenswrapper[4733]: I1206 05:57:13.397120 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-998648c74-wwdnq" podStartSLOduration=15.176078519 podStartE2EDuration="24.397109001s" podCreationTimestamp="2025-12-06 05:56:49 +0000 UTC" firstStartedPulling="2025-12-06 05:56:50.48182711 +0000 UTC m=+794.347038222" lastFinishedPulling="2025-12-06 05:56:59.702857593 +0000 UTC m=+803.568068704" observedRunningTime="2025-12-06 05:57:13.392918881 +0000 UTC m=+817.258129992" watchObservedRunningTime="2025-12-06 05:57:13.397109001 +0000 UTC m=+817.262320111" Dec 06 05:57:13 crc kubenswrapper[4733]: I1206 05:57:13.412072 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-pv7dl" podStartSLOduration=2.519470559 podStartE2EDuration="24.412062029s" podCreationTimestamp="2025-12-06 05:56:49 +0000 UTC" firstStartedPulling="2025-12-06 05:56:50.679010813 +0000 UTC m=+794.544221923" lastFinishedPulling="2025-12-06 05:57:12.571602282 +0000 UTC m=+816.436813393" observedRunningTime="2025-12-06 05:57:13.406969322 +0000 UTC m=+817.272180433" watchObservedRunningTime="2025-12-06 05:57:13.412062029 +0000 UTC m=+817.277273140" Dec 06 05:57:13 crc kubenswrapper[4733]: I1206 05:57:13.430980 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-kcq5s" podStartSLOduration=2.520533166 podStartE2EDuration="24.430969643s" podCreationTimestamp="2025-12-06 05:56:49 +0000 UTC" firstStartedPulling="2025-12-06 05:56:50.68647537 +0000 UTC m=+794.551686470" lastFinishedPulling="2025-12-06 05:57:12.596911836 +0000 UTC m=+816.462122947" observedRunningTime="2025-12-06 05:57:13.430528013 +0000 UTC m=+817.295739124" watchObservedRunningTime="2025-12-06 05:57:13.430969643 +0000 UTC 
m=+817.296180755" Dec 06 05:57:13 crc kubenswrapper[4733]: I1206 05:57:13.468896 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f58g9vl" podStartSLOduration=17.976408535 podStartE2EDuration="24.468875773s" podCreationTimestamp="2025-12-06 05:56:49 +0000 UTC" firstStartedPulling="2025-12-06 05:57:06.072647665 +0000 UTC m=+809.937858776" lastFinishedPulling="2025-12-06 05:57:12.565114903 +0000 UTC m=+816.430326014" observedRunningTime="2025-12-06 05:57:13.46719862 +0000 UTC m=+817.332409731" watchObservedRunningTime="2025-12-06 05:57:13.468875773 +0000 UTC m=+817.334086884" Dec 06 05:57:13 crc kubenswrapper[4733]: I1206 05:57:13.499471 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-j9mvd" podStartSLOduration=15.431605202 podStartE2EDuration="24.499457201s" podCreationTimestamp="2025-12-06 05:56:49 +0000 UTC" firstStartedPulling="2025-12-06 05:56:50.678577087 +0000 UTC m=+794.543788198" lastFinishedPulling="2025-12-06 05:56:59.746429087 +0000 UTC m=+803.611640197" observedRunningTime="2025-12-06 05:57:13.49491008 +0000 UTC m=+817.360121191" watchObservedRunningTime="2025-12-06 05:57:13.499457201 +0000 UTC m=+817.364668312" Dec 06 05:57:19 crc kubenswrapper[4733]: I1206 05:57:19.761099 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-78f8948974-h9stb" Dec 06 05:57:19 crc kubenswrapper[4733]: I1206 05:57:19.777903 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5854674fcc-nbzjs" podStartSLOduration=10.359599248 podStartE2EDuration="30.777885122s" podCreationTimestamp="2025-12-06 05:56:49 +0000 UTC" firstStartedPulling="2025-12-06 05:56:50.699889654 +0000 UTC m=+794.565100765" 
lastFinishedPulling="2025-12-06 05:57:11.118175528 +0000 UTC m=+814.983386639" observedRunningTime="2025-12-06 05:57:13.520583477 +0000 UTC m=+817.385794588" watchObservedRunningTime="2025-12-06 05:57:19.777885122 +0000 UTC m=+823.643096233" Dec 06 05:57:19 crc kubenswrapper[4733]: I1206 05:57:19.920134 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-pv7dl" Dec 06 05:57:19 crc kubenswrapper[4733]: I1206 05:57:19.932449 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2m7gg" Dec 06 05:57:19 crc kubenswrapper[4733]: I1206 05:57:19.986139 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-kcq5s" Dec 06 05:57:20 crc kubenswrapper[4733]: I1206 05:57:20.030105 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5854674fcc-nbzjs" Dec 06 05:57:20 crc kubenswrapper[4733]: I1206 05:57:20.146047 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-9p759" Dec 06 05:57:21 crc kubenswrapper[4733]: I1206 05:57:21.228753 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d1ec2a9-eb8b-48b1-a823-129b8cc68129-cert\") pod \"infra-operator-controller-manager-57548d458d-hcfrz\" (UID: \"0d1ec2a9-eb8b-48b1-a823-129b8cc68129\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-hcfrz" Dec 06 05:57:21 crc kubenswrapper[4733]: I1206 05:57:21.234867 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d1ec2a9-eb8b-48b1-a823-129b8cc68129-cert\") pod 
\"infra-operator-controller-manager-57548d458d-hcfrz\" (UID: \"0d1ec2a9-eb8b-48b1-a823-129b8cc68129\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-hcfrz" Dec 06 05:57:21 crc kubenswrapper[4733]: I1206 05:57:21.412407 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-hcfrz" Dec 06 05:57:21 crc kubenswrapper[4733]: I1206 05:57:21.776497 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-hcfrz"] Dec 06 05:57:21 crc kubenswrapper[4733]: W1206 05:57:21.787398 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d1ec2a9_eb8b_48b1_a823_129b8cc68129.slice/crio-781e788054e51e86322f96d1dd9db0d29f2ce6dcefec42c02da9907b46269fe4 WatchSource:0}: Error finding container 781e788054e51e86322f96d1dd9db0d29f2ce6dcefec42c02da9907b46269fe4: Status 404 returned error can't find the container with id 781e788054e51e86322f96d1dd9db0d29f2ce6dcefec42c02da9907b46269fe4 Dec 06 05:57:21 crc kubenswrapper[4733]: I1206 05:57:21.836894 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4f778c13-06e7-4b71-98b8-28e3165cdf8b-metrics-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-ckqkj\" (UID: \"4f778c13-06e7-4b71-98b8-28e3165cdf8b\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-ckqkj" Dec 06 05:57:21 crc kubenswrapper[4733]: I1206 05:57:21.841491 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4f778c13-06e7-4b71-98b8-28e3165cdf8b-metrics-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-ckqkj\" (UID: \"4f778c13-06e7-4b71-98b8-28e3165cdf8b\") " 
pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-ckqkj" Dec 06 05:57:22 crc kubenswrapper[4733]: I1206 05:57:22.103055 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-ckqkj" Dec 06 05:57:22 crc kubenswrapper[4733]: I1206 05:57:22.401474 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-hcfrz" event={"ID":"0d1ec2a9-eb8b-48b1-a823-129b8cc68129","Type":"ContainerStarted","Data":"781e788054e51e86322f96d1dd9db0d29f2ce6dcefec42c02da9907b46269fe4"} Dec 06 05:57:22 crc kubenswrapper[4733]: I1206 05:57:22.481479 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-54bdf956c4-ckqkj"] Dec 06 05:57:22 crc kubenswrapper[4733]: W1206 05:57:22.487772 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f778c13_06e7_4b71_98b8_28e3165cdf8b.slice/crio-08bcbcd38f6f34fea9e7c44ac55b226df3e987f9a6889817028fed9f5b22477b WatchSource:0}: Error finding container 08bcbcd38f6f34fea9e7c44ac55b226df3e987f9a6889817028fed9f5b22477b: Status 404 returned error can't find the container with id 08bcbcd38f6f34fea9e7c44ac55b226df3e987f9a6889817028fed9f5b22477b Dec 06 05:57:23 crc kubenswrapper[4733]: I1206 05:57:23.408855 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-ckqkj" event={"ID":"4f778c13-06e7-4b71-98b8-28e3165cdf8b","Type":"ContainerStarted","Data":"08bcbcd38f6f34fea9e7c44ac55b226df3e987f9a6889817028fed9f5b22477b"} Dec 06 05:57:24 crc kubenswrapper[4733]: I1206 05:57:24.421775 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-ckqkj" 
event={"ID":"4f778c13-06e7-4b71-98b8-28e3165cdf8b","Type":"ContainerStarted","Data":"071253d6167292e59d8715e23c050a015fabaf3b20c275ea3c18eb62c8f1082e"} Dec 06 05:57:24 crc kubenswrapper[4733]: I1206 05:57:24.421956 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-ckqkj" Dec 06 05:57:24 crc kubenswrapper[4733]: I1206 05:57:24.444100 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-ckqkj" podStartSLOduration=35.44408968 podStartE2EDuration="35.44408968s" podCreationTimestamp="2025-12-06 05:56:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:57:24.443894874 +0000 UTC m=+828.309105985" watchObservedRunningTime="2025-12-06 05:57:24.44408968 +0000 UTC m=+828.309300792" Dec 06 05:57:25 crc kubenswrapper[4733]: I1206 05:57:25.329516 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f58g9vl" Dec 06 05:57:26 crc kubenswrapper[4733]: I1206 05:57:26.436460 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-hcfrz" event={"ID":"0d1ec2a9-eb8b-48b1-a823-129b8cc68129","Type":"ContainerStarted","Data":"9810f4975ed9fc283d3c88dab3d1a54211b930a665307f4de88426b4b32be363"} Dec 06 05:57:26 crc kubenswrapper[4733]: I1206 05:57:26.436735 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-hcfrz" event={"ID":"0d1ec2a9-eb8b-48b1-a823-129b8cc68129","Type":"ContainerStarted","Data":"3763e61036b6db45b4861a7efb60be3cc5641101b62e3d9616e4ffad31c1d78e"} Dec 06 05:57:26 crc kubenswrapper[4733]: I1206 05:57:26.436988 4733 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-57548d458d-hcfrz" Dec 06 05:57:26 crc kubenswrapper[4733]: I1206 05:57:26.454379 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-57548d458d-hcfrz" podStartSLOduration=33.049886268 podStartE2EDuration="37.454352744s" podCreationTimestamp="2025-12-06 05:56:49 +0000 UTC" firstStartedPulling="2025-12-06 05:57:21.789352892 +0000 UTC m=+825.654564002" lastFinishedPulling="2025-12-06 05:57:26.193819366 +0000 UTC m=+830.059030478" observedRunningTime="2025-12-06 05:57:26.448860147 +0000 UTC m=+830.314071257" watchObservedRunningTime="2025-12-06 05:57:26.454352744 +0000 UTC m=+830.319563856" Dec 06 05:57:31 crc kubenswrapper[4733]: I1206 05:57:31.417414 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-57548d458d-hcfrz" Dec 06 05:57:32 crc kubenswrapper[4733]: I1206 05:57:32.110890 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-ckqkj" Dec 06 05:57:42 crc kubenswrapper[4733]: I1206 05:57:42.989651 4733 patch_prober.go:28] interesting pod/machine-config-daemon-g7qjx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 05:57:42 crc kubenswrapper[4733]: I1206 05:57:42.990401 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 05:57:42 crc kubenswrapper[4733]: I1206 
05:57:42.990462 4733 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" Dec 06 05:57:42 crc kubenswrapper[4733]: I1206 05:57:42.991065 4733 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b3765b8a99d4ffd713a8095a13f219f1dd20e90b6c9d92ac7d89fa928662bfb0"} pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 05:57:42 crc kubenswrapper[4733]: I1206 05:57:42.991131 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerName="machine-config-daemon" containerID="cri-o://b3765b8a99d4ffd713a8095a13f219f1dd20e90b6c9d92ac7d89fa928662bfb0" gracePeriod=600 Dec 06 05:57:43 crc kubenswrapper[4733]: I1206 05:57:43.573545 4733 generic.go:334] "Generic (PLEG): container finished" podID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerID="b3765b8a99d4ffd713a8095a13f219f1dd20e90b6c9d92ac7d89fa928662bfb0" exitCode=0 Dec 06 05:57:43 crc kubenswrapper[4733]: I1206 05:57:43.573798 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" event={"ID":"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448","Type":"ContainerDied","Data":"b3765b8a99d4ffd713a8095a13f219f1dd20e90b6c9d92ac7d89fa928662bfb0"} Dec 06 05:57:43 crc kubenswrapper[4733]: I1206 05:57:43.573846 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" event={"ID":"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448","Type":"ContainerStarted","Data":"6cf0b6c52f78a1f3c9cd0937561802a5aad13c9f84f0305358100261c2849c9f"} Dec 06 05:57:43 crc kubenswrapper[4733]: I1206 05:57:43.573875 4733 scope.go:117] 
"RemoveContainer" containerID="1947fc402b33dfad60aaf16335ae0cdb84ceaf24cd429e84ae81d03765f6da10" Dec 06 05:57:48 crc kubenswrapper[4733]: I1206 05:57:48.741333 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-94b4f9f45-frp6w"] Dec 06 05:57:48 crc kubenswrapper[4733]: I1206 05:57:48.742824 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-94b4f9f45-frp6w" Dec 06 05:57:48 crc kubenswrapper[4733]: I1206 05:57:48.744354 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 06 05:57:48 crc kubenswrapper[4733]: I1206 05:57:48.745431 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 06 05:57:48 crc kubenswrapper[4733]: I1206 05:57:48.745751 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 06 05:57:48 crc kubenswrapper[4733]: I1206 05:57:48.745828 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-k74rk" Dec 06 05:57:48 crc kubenswrapper[4733]: I1206 05:57:48.753529 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-94b4f9f45-frp6w"] Dec 06 05:57:48 crc kubenswrapper[4733]: I1206 05:57:48.788919 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6947456757-nnjdx"] Dec 06 05:57:48 crc kubenswrapper[4733]: I1206 05:57:48.790080 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6947456757-nnjdx" Dec 06 05:57:48 crc kubenswrapper[4733]: I1206 05:57:48.792136 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Dec 06 05:57:48 crc kubenswrapper[4733]: I1206 05:57:48.793376 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cbab357f-c31f-4dff-9255-f19667d52997-dns-svc\") pod \"dnsmasq-dns-6947456757-nnjdx\" (UID: \"cbab357f-c31f-4dff-9255-f19667d52997\") " pod="openstack/dnsmasq-dns-6947456757-nnjdx" Dec 06 05:57:48 crc kubenswrapper[4733]: I1206 05:57:48.793437 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fe174a3-4d0f-423e-8be5-8dd6ce0c0a99-config\") pod \"dnsmasq-dns-94b4f9f45-frp6w\" (UID: \"2fe174a3-4d0f-423e-8be5-8dd6ce0c0a99\") " pod="openstack/dnsmasq-dns-94b4f9f45-frp6w" Dec 06 05:57:48 crc kubenswrapper[4733]: I1206 05:57:48.793464 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhsz8\" (UniqueName: \"kubernetes.io/projected/2fe174a3-4d0f-423e-8be5-8dd6ce0c0a99-kube-api-access-mhsz8\") pod \"dnsmasq-dns-94b4f9f45-frp6w\" (UID: \"2fe174a3-4d0f-423e-8be5-8dd6ce0c0a99\") " pod="openstack/dnsmasq-dns-94b4f9f45-frp6w" Dec 06 05:57:48 crc kubenswrapper[4733]: I1206 05:57:48.793556 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbab357f-c31f-4dff-9255-f19667d52997-config\") pod \"dnsmasq-dns-6947456757-nnjdx\" (UID: \"cbab357f-c31f-4dff-9255-f19667d52997\") " pod="openstack/dnsmasq-dns-6947456757-nnjdx" Dec 06 05:57:48 crc kubenswrapper[4733]: I1206 05:57:48.793638 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-k9t6c\" (UniqueName: \"kubernetes.io/projected/cbab357f-c31f-4dff-9255-f19667d52997-kube-api-access-k9t6c\") pod \"dnsmasq-dns-6947456757-nnjdx\" (UID: \"cbab357f-c31f-4dff-9255-f19667d52997\") " pod="openstack/dnsmasq-dns-6947456757-nnjdx" Dec 06 05:57:48 crc kubenswrapper[4733]: I1206 05:57:48.808181 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6947456757-nnjdx"] Dec 06 05:57:48 crc kubenswrapper[4733]: I1206 05:57:48.894702 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhsz8\" (UniqueName: \"kubernetes.io/projected/2fe174a3-4d0f-423e-8be5-8dd6ce0c0a99-kube-api-access-mhsz8\") pod \"dnsmasq-dns-94b4f9f45-frp6w\" (UID: \"2fe174a3-4d0f-423e-8be5-8dd6ce0c0a99\") " pod="openstack/dnsmasq-dns-94b4f9f45-frp6w" Dec 06 05:57:48 crc kubenswrapper[4733]: I1206 05:57:48.895107 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbab357f-c31f-4dff-9255-f19667d52997-config\") pod \"dnsmasq-dns-6947456757-nnjdx\" (UID: \"cbab357f-c31f-4dff-9255-f19667d52997\") " pod="openstack/dnsmasq-dns-6947456757-nnjdx" Dec 06 05:57:48 crc kubenswrapper[4733]: I1206 05:57:48.895166 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9t6c\" (UniqueName: \"kubernetes.io/projected/cbab357f-c31f-4dff-9255-f19667d52997-kube-api-access-k9t6c\") pod \"dnsmasq-dns-6947456757-nnjdx\" (UID: \"cbab357f-c31f-4dff-9255-f19667d52997\") " pod="openstack/dnsmasq-dns-6947456757-nnjdx" Dec 06 05:57:48 crc kubenswrapper[4733]: I1206 05:57:48.895261 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cbab357f-c31f-4dff-9255-f19667d52997-dns-svc\") pod \"dnsmasq-dns-6947456757-nnjdx\" (UID: \"cbab357f-c31f-4dff-9255-f19667d52997\") " pod="openstack/dnsmasq-dns-6947456757-nnjdx" Dec 06 05:57:48 crc 
kubenswrapper[4733]: I1206 05:57:48.895342 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fe174a3-4d0f-423e-8be5-8dd6ce0c0a99-config\") pod \"dnsmasq-dns-94b4f9f45-frp6w\" (UID: \"2fe174a3-4d0f-423e-8be5-8dd6ce0c0a99\") " pod="openstack/dnsmasq-dns-94b4f9f45-frp6w" Dec 06 05:57:48 crc kubenswrapper[4733]: I1206 05:57:48.896090 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cbab357f-c31f-4dff-9255-f19667d52997-dns-svc\") pod \"dnsmasq-dns-6947456757-nnjdx\" (UID: \"cbab357f-c31f-4dff-9255-f19667d52997\") " pod="openstack/dnsmasq-dns-6947456757-nnjdx" Dec 06 05:57:48 crc kubenswrapper[4733]: I1206 05:57:48.896109 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbab357f-c31f-4dff-9255-f19667d52997-config\") pod \"dnsmasq-dns-6947456757-nnjdx\" (UID: \"cbab357f-c31f-4dff-9255-f19667d52997\") " pod="openstack/dnsmasq-dns-6947456757-nnjdx" Dec 06 05:57:48 crc kubenswrapper[4733]: I1206 05:57:48.896171 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fe174a3-4d0f-423e-8be5-8dd6ce0c0a99-config\") pod \"dnsmasq-dns-94b4f9f45-frp6w\" (UID: \"2fe174a3-4d0f-423e-8be5-8dd6ce0c0a99\") " pod="openstack/dnsmasq-dns-94b4f9f45-frp6w" Dec 06 05:57:48 crc kubenswrapper[4733]: I1206 05:57:48.912506 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhsz8\" (UniqueName: \"kubernetes.io/projected/2fe174a3-4d0f-423e-8be5-8dd6ce0c0a99-kube-api-access-mhsz8\") pod \"dnsmasq-dns-94b4f9f45-frp6w\" (UID: \"2fe174a3-4d0f-423e-8be5-8dd6ce0c0a99\") " pod="openstack/dnsmasq-dns-94b4f9f45-frp6w" Dec 06 05:57:48 crc kubenswrapper[4733]: I1206 05:57:48.914419 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-k9t6c\" (UniqueName: \"kubernetes.io/projected/cbab357f-c31f-4dff-9255-f19667d52997-kube-api-access-k9t6c\") pod \"dnsmasq-dns-6947456757-nnjdx\" (UID: \"cbab357f-c31f-4dff-9255-f19667d52997\") " pod="openstack/dnsmasq-dns-6947456757-nnjdx" Dec 06 05:57:49 crc kubenswrapper[4733]: I1206 05:57:49.063853 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-94b4f9f45-frp6w" Dec 06 05:57:49 crc kubenswrapper[4733]: I1206 05:57:49.101471 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6947456757-nnjdx" Dec 06 05:57:49 crc kubenswrapper[4733]: I1206 05:57:49.460959 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-94b4f9f45-frp6w"] Dec 06 05:57:49 crc kubenswrapper[4733]: I1206 05:57:49.469359 4733 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 06 05:57:49 crc kubenswrapper[4733]: W1206 05:57:49.518029 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcbab357f_c31f_4dff_9255_f19667d52997.slice/crio-7f1b8e70dbe1b25a967b3f50c913a1e43ca90c0f87e09ac215ed399c38d1e731 WatchSource:0}: Error finding container 7f1b8e70dbe1b25a967b3f50c913a1e43ca90c0f87e09ac215ed399c38d1e731: Status 404 returned error can't find the container with id 7f1b8e70dbe1b25a967b3f50c913a1e43ca90c0f87e09ac215ed399c38d1e731 Dec 06 05:57:49 crc kubenswrapper[4733]: I1206 05:57:49.519686 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6947456757-nnjdx"] Dec 06 05:57:49 crc kubenswrapper[4733]: I1206 05:57:49.615770 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6947456757-nnjdx" event={"ID":"cbab357f-c31f-4dff-9255-f19667d52997","Type":"ContainerStarted","Data":"7f1b8e70dbe1b25a967b3f50c913a1e43ca90c0f87e09ac215ed399c38d1e731"} Dec 06 05:57:49 crc 
kubenswrapper[4733]: I1206 05:57:49.616968 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-94b4f9f45-frp6w" event={"ID":"2fe174a3-4d0f-423e-8be5-8dd6ce0c0a99","Type":"ContainerStarted","Data":"43c4c70d2399307d560e754860dfd9a501df083dd8c537db72428e4ddf403e6c"} Dec 06 05:57:51 crc kubenswrapper[4733]: I1206 05:57:51.866628 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6947456757-nnjdx"] Dec 06 05:57:51 crc kubenswrapper[4733]: I1206 05:57:51.888271 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55dc666865-rp6cx"] Dec 06 05:57:51 crc kubenswrapper[4733]: I1206 05:57:51.889760 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55dc666865-rp6cx" Dec 06 05:57:51 crc kubenswrapper[4733]: I1206 05:57:51.899121 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55dc666865-rp6cx"] Dec 06 05:57:51 crc kubenswrapper[4733]: I1206 05:57:51.953176 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/218fc499-054d-4805-b28c-6096d75e836d-dns-svc\") pod \"dnsmasq-dns-55dc666865-rp6cx\" (UID: \"218fc499-054d-4805-b28c-6096d75e836d\") " pod="openstack/dnsmasq-dns-55dc666865-rp6cx" Dec 06 05:57:51 crc kubenswrapper[4733]: I1206 05:57:51.953241 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/218fc499-054d-4805-b28c-6096d75e836d-config\") pod \"dnsmasq-dns-55dc666865-rp6cx\" (UID: \"218fc499-054d-4805-b28c-6096d75e836d\") " pod="openstack/dnsmasq-dns-55dc666865-rp6cx" Dec 06 05:57:51 crc kubenswrapper[4733]: I1206 05:57:51.953437 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sz2rw\" (UniqueName: 
\"kubernetes.io/projected/218fc499-054d-4805-b28c-6096d75e836d-kube-api-access-sz2rw\") pod \"dnsmasq-dns-55dc666865-rp6cx\" (UID: \"218fc499-054d-4805-b28c-6096d75e836d\") " pod="openstack/dnsmasq-dns-55dc666865-rp6cx" Dec 06 05:57:52 crc kubenswrapper[4733]: I1206 05:57:52.054866 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sz2rw\" (UniqueName: \"kubernetes.io/projected/218fc499-054d-4805-b28c-6096d75e836d-kube-api-access-sz2rw\") pod \"dnsmasq-dns-55dc666865-rp6cx\" (UID: \"218fc499-054d-4805-b28c-6096d75e836d\") " pod="openstack/dnsmasq-dns-55dc666865-rp6cx" Dec 06 05:57:52 crc kubenswrapper[4733]: I1206 05:57:52.055021 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/218fc499-054d-4805-b28c-6096d75e836d-dns-svc\") pod \"dnsmasq-dns-55dc666865-rp6cx\" (UID: \"218fc499-054d-4805-b28c-6096d75e836d\") " pod="openstack/dnsmasq-dns-55dc666865-rp6cx" Dec 06 05:57:52 crc kubenswrapper[4733]: I1206 05:57:52.055084 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/218fc499-054d-4805-b28c-6096d75e836d-config\") pod \"dnsmasq-dns-55dc666865-rp6cx\" (UID: \"218fc499-054d-4805-b28c-6096d75e836d\") " pod="openstack/dnsmasq-dns-55dc666865-rp6cx" Dec 06 05:57:52 crc kubenswrapper[4733]: I1206 05:57:52.056360 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/218fc499-054d-4805-b28c-6096d75e836d-config\") pod \"dnsmasq-dns-55dc666865-rp6cx\" (UID: \"218fc499-054d-4805-b28c-6096d75e836d\") " pod="openstack/dnsmasq-dns-55dc666865-rp6cx" Dec 06 05:57:52 crc kubenswrapper[4733]: I1206 05:57:52.056562 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/218fc499-054d-4805-b28c-6096d75e836d-dns-svc\") pod 
\"dnsmasq-dns-55dc666865-rp6cx\" (UID: \"218fc499-054d-4805-b28c-6096d75e836d\") " pod="openstack/dnsmasq-dns-55dc666865-rp6cx" Dec 06 05:57:52 crc kubenswrapper[4733]: I1206 05:57:52.103065 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sz2rw\" (UniqueName: \"kubernetes.io/projected/218fc499-054d-4805-b28c-6096d75e836d-kube-api-access-sz2rw\") pod \"dnsmasq-dns-55dc666865-rp6cx\" (UID: \"218fc499-054d-4805-b28c-6096d75e836d\") " pod="openstack/dnsmasq-dns-55dc666865-rp6cx" Dec 06 05:57:52 crc kubenswrapper[4733]: I1206 05:57:52.187873 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-94b4f9f45-frp6w"] Dec 06 05:57:52 crc kubenswrapper[4733]: I1206 05:57:52.210841 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d9886d5bf-qnvxm"] Dec 06 05:57:52 crc kubenswrapper[4733]: I1206 05:57:52.212741 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d9886d5bf-qnvxm" Dec 06 05:57:52 crc kubenswrapper[4733]: I1206 05:57:52.214517 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d9886d5bf-qnvxm"] Dec 06 05:57:52 crc kubenswrapper[4733]: I1206 05:57:52.220683 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55dc666865-rp6cx" Dec 06 05:57:52 crc kubenswrapper[4733]: I1206 05:57:52.262994 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/97d7986b-ebf4-441a-87f5-4dc655e13234-dns-svc\") pod \"dnsmasq-dns-5d9886d5bf-qnvxm\" (UID: \"97d7986b-ebf4-441a-87f5-4dc655e13234\") " pod="openstack/dnsmasq-dns-5d9886d5bf-qnvxm" Dec 06 05:57:52 crc kubenswrapper[4733]: I1206 05:57:52.263103 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v964g\" (UniqueName: \"kubernetes.io/projected/97d7986b-ebf4-441a-87f5-4dc655e13234-kube-api-access-v964g\") pod \"dnsmasq-dns-5d9886d5bf-qnvxm\" (UID: \"97d7986b-ebf4-441a-87f5-4dc655e13234\") " pod="openstack/dnsmasq-dns-5d9886d5bf-qnvxm" Dec 06 05:57:52 crc kubenswrapper[4733]: I1206 05:57:52.263193 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97d7986b-ebf4-441a-87f5-4dc655e13234-config\") pod \"dnsmasq-dns-5d9886d5bf-qnvxm\" (UID: \"97d7986b-ebf4-441a-87f5-4dc655e13234\") " pod="openstack/dnsmasq-dns-5d9886d5bf-qnvxm" Dec 06 05:57:52 crc kubenswrapper[4733]: I1206 05:57:52.364736 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97d7986b-ebf4-441a-87f5-4dc655e13234-config\") pod \"dnsmasq-dns-5d9886d5bf-qnvxm\" (UID: \"97d7986b-ebf4-441a-87f5-4dc655e13234\") " pod="openstack/dnsmasq-dns-5d9886d5bf-qnvxm" Dec 06 05:57:52 crc kubenswrapper[4733]: I1206 05:57:52.364816 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/97d7986b-ebf4-441a-87f5-4dc655e13234-dns-svc\") pod \"dnsmasq-dns-5d9886d5bf-qnvxm\" (UID: \"97d7986b-ebf4-441a-87f5-4dc655e13234\") " 
pod="openstack/dnsmasq-dns-5d9886d5bf-qnvxm" Dec 06 05:57:52 crc kubenswrapper[4733]: I1206 05:57:52.365987 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97d7986b-ebf4-441a-87f5-4dc655e13234-config\") pod \"dnsmasq-dns-5d9886d5bf-qnvxm\" (UID: \"97d7986b-ebf4-441a-87f5-4dc655e13234\") " pod="openstack/dnsmasq-dns-5d9886d5bf-qnvxm" Dec 06 05:57:52 crc kubenswrapper[4733]: I1206 05:57:52.366166 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/97d7986b-ebf4-441a-87f5-4dc655e13234-dns-svc\") pod \"dnsmasq-dns-5d9886d5bf-qnvxm\" (UID: \"97d7986b-ebf4-441a-87f5-4dc655e13234\") " pod="openstack/dnsmasq-dns-5d9886d5bf-qnvxm" Dec 06 05:57:52 crc kubenswrapper[4733]: I1206 05:57:52.366556 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v964g\" (UniqueName: \"kubernetes.io/projected/97d7986b-ebf4-441a-87f5-4dc655e13234-kube-api-access-v964g\") pod \"dnsmasq-dns-5d9886d5bf-qnvxm\" (UID: \"97d7986b-ebf4-441a-87f5-4dc655e13234\") " pod="openstack/dnsmasq-dns-5d9886d5bf-qnvxm" Dec 06 05:57:52 crc kubenswrapper[4733]: I1206 05:57:52.388167 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v964g\" (UniqueName: \"kubernetes.io/projected/97d7986b-ebf4-441a-87f5-4dc655e13234-kube-api-access-v964g\") pod \"dnsmasq-dns-5d9886d5bf-qnvxm\" (UID: \"97d7986b-ebf4-441a-87f5-4dc655e13234\") " pod="openstack/dnsmasq-dns-5d9886d5bf-qnvxm" Dec 06 05:57:52 crc kubenswrapper[4733]: I1206 05:57:52.538046 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d9886d5bf-qnvxm" Dec 06 05:57:52 crc kubenswrapper[4733]: I1206 05:57:52.693488 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55dc666865-rp6cx"] Dec 06 05:57:52 crc kubenswrapper[4733]: I1206 05:57:52.947118 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d9886d5bf-qnvxm"] Dec 06 05:57:52 crc kubenswrapper[4733]: W1206 05:57:52.961448 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97d7986b_ebf4_441a_87f5_4dc655e13234.slice/crio-7efd3f694369cc30e6f036dca78ed16d660f8627d2cd1ac455ce244059b7b2f1 WatchSource:0}: Error finding container 7efd3f694369cc30e6f036dca78ed16d660f8627d2cd1ac455ce244059b7b2f1: Status 404 returned error can't find the container with id 7efd3f694369cc30e6f036dca78ed16d660f8627d2cd1ac455ce244059b7b2f1 Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.035211 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.036870 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.038401 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.040164 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.040353 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.040365 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.040761 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.041063 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.043098 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-2b962" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.046711 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.177566 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ba773bb2-77c5-4562-b8ba-53428904d503-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ba773bb2-77c5-4562-b8ba-53428904d503\") " pod="openstack/rabbitmq-server-0" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.177681 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"ba773bb2-77c5-4562-b8ba-53428904d503\") " pod="openstack/rabbitmq-server-0" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.177721 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ba773bb2-77c5-4562-b8ba-53428904d503-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ba773bb2-77c5-4562-b8ba-53428904d503\") " pod="openstack/rabbitmq-server-0" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.177916 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ba773bb2-77c5-4562-b8ba-53428904d503-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ba773bb2-77c5-4562-b8ba-53428904d503\") " pod="openstack/rabbitmq-server-0" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.178034 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ba773bb2-77c5-4562-b8ba-53428904d503-config-data\") pod \"rabbitmq-server-0\" (UID: \"ba773bb2-77c5-4562-b8ba-53428904d503\") " pod="openstack/rabbitmq-server-0" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.178057 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ba773bb2-77c5-4562-b8ba-53428904d503-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ba773bb2-77c5-4562-b8ba-53428904d503\") " pod="openstack/rabbitmq-server-0" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.178074 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ba773bb2-77c5-4562-b8ba-53428904d503-rabbitmq-confd\") pod 
\"rabbitmq-server-0\" (UID: \"ba773bb2-77c5-4562-b8ba-53428904d503\") " pod="openstack/rabbitmq-server-0" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.178151 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgzb2\" (UniqueName: \"kubernetes.io/projected/ba773bb2-77c5-4562-b8ba-53428904d503-kube-api-access-sgzb2\") pod \"rabbitmq-server-0\" (UID: \"ba773bb2-77c5-4562-b8ba-53428904d503\") " pod="openstack/rabbitmq-server-0" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.178174 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ba773bb2-77c5-4562-b8ba-53428904d503-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ba773bb2-77c5-4562-b8ba-53428904d503\") " pod="openstack/rabbitmq-server-0" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.178298 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ba773bb2-77c5-4562-b8ba-53428904d503-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ba773bb2-77c5-4562-b8ba-53428904d503\") " pod="openstack/rabbitmq-server-0" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.178364 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ba773bb2-77c5-4562-b8ba-53428904d503-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ba773bb2-77c5-4562-b8ba-53428904d503\") " pod="openstack/rabbitmq-server-0" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.280331 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgzb2\" (UniqueName: \"kubernetes.io/projected/ba773bb2-77c5-4562-b8ba-53428904d503-kube-api-access-sgzb2\") pod \"rabbitmq-server-0\" (UID: 
\"ba773bb2-77c5-4562-b8ba-53428904d503\") " pod="openstack/rabbitmq-server-0" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.280414 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ba773bb2-77c5-4562-b8ba-53428904d503-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ba773bb2-77c5-4562-b8ba-53428904d503\") " pod="openstack/rabbitmq-server-0" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.280465 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ba773bb2-77c5-4562-b8ba-53428904d503-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ba773bb2-77c5-4562-b8ba-53428904d503\") " pod="openstack/rabbitmq-server-0" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.280500 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ba773bb2-77c5-4562-b8ba-53428904d503-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ba773bb2-77c5-4562-b8ba-53428904d503\") " pod="openstack/rabbitmq-server-0" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.280535 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ba773bb2-77c5-4562-b8ba-53428904d503-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ba773bb2-77c5-4562-b8ba-53428904d503\") " pod="openstack/rabbitmq-server-0" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.280558 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"ba773bb2-77c5-4562-b8ba-53428904d503\") " pod="openstack/rabbitmq-server-0" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.280581 4733 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ba773bb2-77c5-4562-b8ba-53428904d503-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ba773bb2-77c5-4562-b8ba-53428904d503\") " pod="openstack/rabbitmq-server-0" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.280628 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ba773bb2-77c5-4562-b8ba-53428904d503-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ba773bb2-77c5-4562-b8ba-53428904d503\") " pod="openstack/rabbitmq-server-0" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.281326 4733 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"ba773bb2-77c5-4562-b8ba-53428904d503\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/rabbitmq-server-0" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.281295 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ba773bb2-77c5-4562-b8ba-53428904d503-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ba773bb2-77c5-4562-b8ba-53428904d503\") " pod="openstack/rabbitmq-server-0" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.281896 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ba773bb2-77c5-4562-b8ba-53428904d503-config-data\") pod \"rabbitmq-server-0\" (UID: \"ba773bb2-77c5-4562-b8ba-53428904d503\") " pod="openstack/rabbitmq-server-0" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.282405 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/ba773bb2-77c5-4562-b8ba-53428904d503-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ba773bb2-77c5-4562-b8ba-53428904d503\") " pod="openstack/rabbitmq-server-0" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.282972 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ba773bb2-77c5-4562-b8ba-53428904d503-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ba773bb2-77c5-4562-b8ba-53428904d503\") " pod="openstack/rabbitmq-server-0" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.283045 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ba773bb2-77c5-4562-b8ba-53428904d503-config-data\") pod \"rabbitmq-server-0\" (UID: \"ba773bb2-77c5-4562-b8ba-53428904d503\") " pod="openstack/rabbitmq-server-0" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.283096 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ba773bb2-77c5-4562-b8ba-53428904d503-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ba773bb2-77c5-4562-b8ba-53428904d503\") " pod="openstack/rabbitmq-server-0" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.284793 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ba773bb2-77c5-4562-b8ba-53428904d503-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ba773bb2-77c5-4562-b8ba-53428904d503\") " pod="openstack/rabbitmq-server-0" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.283124 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ba773bb2-77c5-4562-b8ba-53428904d503-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ba773bb2-77c5-4562-b8ba-53428904d503\") " pod="openstack/rabbitmq-server-0" 
Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.287634 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ba773bb2-77c5-4562-b8ba-53428904d503-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ba773bb2-77c5-4562-b8ba-53428904d503\") " pod="openstack/rabbitmq-server-0" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.288950 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ba773bb2-77c5-4562-b8ba-53428904d503-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ba773bb2-77c5-4562-b8ba-53428904d503\") " pod="openstack/rabbitmq-server-0" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.296754 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgzb2\" (UniqueName: \"kubernetes.io/projected/ba773bb2-77c5-4562-b8ba-53428904d503-kube-api-access-sgzb2\") pod \"rabbitmq-server-0\" (UID: \"ba773bb2-77c5-4562-b8ba-53428904d503\") " pod="openstack/rabbitmq-server-0" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.297512 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ba773bb2-77c5-4562-b8ba-53428904d503-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ba773bb2-77c5-4562-b8ba-53428904d503\") " pod="openstack/rabbitmq-server-0" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.299216 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ba773bb2-77c5-4562-b8ba-53428904d503-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ba773bb2-77c5-4562-b8ba-53428904d503\") " pod="openstack/rabbitmq-server-0" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.321710 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 06 05:57:53 crc 
kubenswrapper[4733]: I1206 05:57:53.325762 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"ba773bb2-77c5-4562-b8ba-53428904d503\") " pod="openstack/rabbitmq-server-0" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.326678 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.329861 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.330031 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.331024 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.331196 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-nv9mn" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.331518 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.332491 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.334876 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.356729 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.362627 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.488825 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0d8769e1-2981-471a-bef8-ac4d193563cc-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d8769e1-2981-471a-bef8-ac4d193563cc\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.488886 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0d8769e1-2981-471a-bef8-ac4d193563cc-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d8769e1-2981-471a-bef8-ac4d193563cc\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.488930 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0d8769e1-2981-471a-bef8-ac4d193563cc-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d8769e1-2981-471a-bef8-ac4d193563cc\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.488951 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0d8769e1-2981-471a-bef8-ac4d193563cc-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d8769e1-2981-471a-bef8-ac4d193563cc\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.488978 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0d8769e1-2981-471a-bef8-ac4d193563cc-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d8769e1-2981-471a-bef8-ac4d193563cc\") " 
pod="openstack/rabbitmq-cell1-server-0" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.488994 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnr5q\" (UniqueName: \"kubernetes.io/projected/0d8769e1-2981-471a-bef8-ac4d193563cc-kube-api-access-gnr5q\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d8769e1-2981-471a-bef8-ac4d193563cc\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.489045 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d8769e1-2981-471a-bef8-ac4d193563cc\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.489066 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0d8769e1-2981-471a-bef8-ac4d193563cc-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d8769e1-2981-471a-bef8-ac4d193563cc\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.489084 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0d8769e1-2981-471a-bef8-ac4d193563cc-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d8769e1-2981-471a-bef8-ac4d193563cc\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.489270 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0d8769e1-2981-471a-bef8-ac4d193563cc-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d8769e1-2981-471a-bef8-ac4d193563cc\") " 
pod="openstack/rabbitmq-cell1-server-0" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.489449 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0d8769e1-2981-471a-bef8-ac4d193563cc-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d8769e1-2981-471a-bef8-ac4d193563cc\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.592040 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0d8769e1-2981-471a-bef8-ac4d193563cc-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d8769e1-2981-471a-bef8-ac4d193563cc\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.592121 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0d8769e1-2981-471a-bef8-ac4d193563cc-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d8769e1-2981-471a-bef8-ac4d193563cc\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.592176 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0d8769e1-2981-471a-bef8-ac4d193563cc-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d8769e1-2981-471a-bef8-ac4d193563cc\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.592197 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnr5q\" (UniqueName: \"kubernetes.io/projected/0d8769e1-2981-471a-bef8-ac4d193563cc-kube-api-access-gnr5q\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d8769e1-2981-471a-bef8-ac4d193563cc\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 05:57:53 crc 
kubenswrapper[4733]: I1206 05:57:53.592324 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d8769e1-2981-471a-bef8-ac4d193563cc\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.592353 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0d8769e1-2981-471a-bef8-ac4d193563cc-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d8769e1-2981-471a-bef8-ac4d193563cc\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.592373 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0d8769e1-2981-471a-bef8-ac4d193563cc-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d8769e1-2981-471a-bef8-ac4d193563cc\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.592433 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0d8769e1-2981-471a-bef8-ac4d193563cc-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d8769e1-2981-471a-bef8-ac4d193563cc\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.592462 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0d8769e1-2981-471a-bef8-ac4d193563cc-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d8769e1-2981-471a-bef8-ac4d193563cc\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.592515 4733 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0d8769e1-2981-471a-bef8-ac4d193563cc-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d8769e1-2981-471a-bef8-ac4d193563cc\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.592563 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0d8769e1-2981-471a-bef8-ac4d193563cc-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d8769e1-2981-471a-bef8-ac4d193563cc\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.594835 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0d8769e1-2981-471a-bef8-ac4d193563cc-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d8769e1-2981-471a-bef8-ac4d193563cc\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.595010 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0d8769e1-2981-471a-bef8-ac4d193563cc-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d8769e1-2981-471a-bef8-ac4d193563cc\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.595538 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0d8769e1-2981-471a-bef8-ac4d193563cc-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d8769e1-2981-471a-bef8-ac4d193563cc\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.596222 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0d8769e1-2981-471a-bef8-ac4d193563cc-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d8769e1-2981-471a-bef8-ac4d193563cc\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.596505 4733 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d8769e1-2981-471a-bef8-ac4d193563cc\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/rabbitmq-cell1-server-0"
Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.597922 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0d8769e1-2981-471a-bef8-ac4d193563cc-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d8769e1-2981-471a-bef8-ac4d193563cc\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.600067 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0d8769e1-2981-471a-bef8-ac4d193563cc-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d8769e1-2981-471a-bef8-ac4d193563cc\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.602826 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0d8769e1-2981-471a-bef8-ac4d193563cc-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d8769e1-2981-471a-bef8-ac4d193563cc\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.606534 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0d8769e1-2981-471a-bef8-ac4d193563cc-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d8769e1-2981-471a-bef8-ac4d193563cc\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.607099 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0d8769e1-2981-471a-bef8-ac4d193563cc-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d8769e1-2981-471a-bef8-ac4d193563cc\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.645226 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnr5q\" (UniqueName: \"kubernetes.io/projected/0d8769e1-2981-471a-bef8-ac4d193563cc-kube-api-access-gnr5q\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d8769e1-2981-471a-bef8-ac4d193563cc\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.663588 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d8769e1-2981-471a-bef8-ac4d193563cc\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.672284 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55dc666865-rp6cx" event={"ID":"218fc499-054d-4805-b28c-6096d75e836d","Type":"ContainerStarted","Data":"82a3530cbbd633cd15fce821b707b9f6de417afde828f60620f557ca10ee8fea"}
Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.679437 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d9886d5bf-qnvxm" event={"ID":"97d7986b-ebf4-441a-87f5-4dc655e13234","Type":"ContainerStarted","Data":"7efd3f694369cc30e6f036dca78ed16d660f8627d2cd1ac455ce244059b7b2f1"}
Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.811428 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Dec 06 05:57:53 crc kubenswrapper[4733]: W1206 05:57:53.838484 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba773bb2_77c5_4562_b8ba_53428904d503.slice/crio-189cb4994ff5de9f6317b61675993004b93907ca6e430d892a0d3379a2c4096b WatchSource:0}: Error finding container 189cb4994ff5de9f6317b61675993004b93907ca6e430d892a0d3379a2c4096b: Status 404 returned error can't find the container with id 189cb4994ff5de9f6317b61675993004b93907ca6e430d892a0d3379a2c4096b
Dec 06 05:57:53 crc kubenswrapper[4733]: I1206 05:57:53.963649 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Dec 06 05:57:54 crc kubenswrapper[4733]: I1206 05:57:54.403222 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Dec 06 05:57:54 crc kubenswrapper[4733]: I1206 05:57:54.704198 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ba773bb2-77c5-4562-b8ba-53428904d503","Type":"ContainerStarted","Data":"189cb4994ff5de9f6317b61675993004b93907ca6e430d892a0d3379a2c4096b"}
Dec 06 05:57:54 crc kubenswrapper[4733]: I1206 05:57:54.709343 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0d8769e1-2981-471a-bef8-ac4d193563cc","Type":"ContainerStarted","Data":"7e1da629dfdc010176fafcb6a89d1e7dc9ea6192b88f1b40d0413e8a1e5b6352"}
Dec 06 05:57:54 crc kubenswrapper[4733]: I1206 05:57:54.847908 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"]
Dec 06 05:57:54 crc kubenswrapper[4733]: I1206 05:57:54.849877 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Dec 06 05:57:54 crc kubenswrapper[4733]: I1206 05:57:54.851351 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc"
Dec 06 05:57:54 crc kubenswrapper[4733]: I1206 05:57:54.851874 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-m2xt8"
Dec 06 05:57:54 crc kubenswrapper[4733]: I1206 05:57:54.855761 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data"
Dec 06 05:57:54 crc kubenswrapper[4733]: I1206 05:57:54.858915 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts"
Dec 06 05:57:54 crc kubenswrapper[4733]: I1206 05:57:54.861726 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Dec 06 05:57:54 crc kubenswrapper[4733]: I1206 05:57:54.865635 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle"
Dec 06 05:57:55 crc kubenswrapper[4733]: I1206 05:57:55.029177 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b2b2baf7-95ad-4ff0-a72d-9232137735b6-kolla-config\") pod \"openstack-galera-0\" (UID: \"b2b2baf7-95ad-4ff0-a72d-9232137735b6\") " pod="openstack/openstack-galera-0"
Dec 06 05:57:55 crc kubenswrapper[4733]: I1206 05:57:55.029518 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b2b2baf7-95ad-4ff0-a72d-9232137735b6-config-data-default\") pod \"openstack-galera-0\" (UID: \"b2b2baf7-95ad-4ff0-a72d-9232137735b6\") " pod="openstack/openstack-galera-0"
Dec 06 05:57:55 crc kubenswrapper[4733]: I1206 05:57:55.029544 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"b2b2baf7-95ad-4ff0-a72d-9232137735b6\") " pod="openstack/openstack-galera-0"
Dec 06 05:57:55 crc kubenswrapper[4733]: I1206 05:57:55.029561 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2b2baf7-95ad-4ff0-a72d-9232137735b6-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"b2b2baf7-95ad-4ff0-a72d-9232137735b6\") " pod="openstack/openstack-galera-0"
Dec 06 05:57:55 crc kubenswrapper[4733]: I1206 05:57:55.029584 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4c68\" (UniqueName: \"kubernetes.io/projected/b2b2baf7-95ad-4ff0-a72d-9232137735b6-kube-api-access-c4c68\") pod \"openstack-galera-0\" (UID: \"b2b2baf7-95ad-4ff0-a72d-9232137735b6\") " pod="openstack/openstack-galera-0"
Dec 06 05:57:55 crc kubenswrapper[4733]: I1206 05:57:55.029639 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2b2baf7-95ad-4ff0-a72d-9232137735b6-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"b2b2baf7-95ad-4ff0-a72d-9232137735b6\") " pod="openstack/openstack-galera-0"
Dec 06 05:57:55 crc kubenswrapper[4733]: I1206 05:57:55.029664 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2b2baf7-95ad-4ff0-a72d-9232137735b6-operator-scripts\") pod \"openstack-galera-0\" (UID: \"b2b2baf7-95ad-4ff0-a72d-9232137735b6\") " pod="openstack/openstack-galera-0"
Dec 06 05:57:55 crc kubenswrapper[4733]: I1206 05:57:55.029695 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b2b2baf7-95ad-4ff0-a72d-9232137735b6-config-data-generated\") pod \"openstack-galera-0\" (UID: \"b2b2baf7-95ad-4ff0-a72d-9232137735b6\") " pod="openstack/openstack-galera-0"
Dec 06 05:57:55 crc kubenswrapper[4733]: I1206 05:57:55.130550 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2b2baf7-95ad-4ff0-a72d-9232137735b6-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"b2b2baf7-95ad-4ff0-a72d-9232137735b6\") " pod="openstack/openstack-galera-0"
Dec 06 05:57:55 crc kubenswrapper[4733]: I1206 05:57:55.130594 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2b2baf7-95ad-4ff0-a72d-9232137735b6-operator-scripts\") pod \"openstack-galera-0\" (UID: \"b2b2baf7-95ad-4ff0-a72d-9232137735b6\") " pod="openstack/openstack-galera-0"
Dec 06 05:57:55 crc kubenswrapper[4733]: I1206 05:57:55.130629 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b2b2baf7-95ad-4ff0-a72d-9232137735b6-config-data-generated\") pod \"openstack-galera-0\" (UID: \"b2b2baf7-95ad-4ff0-a72d-9232137735b6\") " pod="openstack/openstack-galera-0"
Dec 06 05:57:55 crc kubenswrapper[4733]: I1206 05:57:55.130672 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b2b2baf7-95ad-4ff0-a72d-9232137735b6-kolla-config\") pod \"openstack-galera-0\" (UID: \"b2b2baf7-95ad-4ff0-a72d-9232137735b6\") " pod="openstack/openstack-galera-0"
Dec 06 05:57:55 crc kubenswrapper[4733]: I1206 05:57:55.130695 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b2b2baf7-95ad-4ff0-a72d-9232137735b6-config-data-default\") pod \"openstack-galera-0\" (UID: \"b2b2baf7-95ad-4ff0-a72d-9232137735b6\") " pod="openstack/openstack-galera-0"
Dec 06 05:57:55 crc kubenswrapper[4733]: I1206 05:57:55.130715 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"b2b2baf7-95ad-4ff0-a72d-9232137735b6\") " pod="openstack/openstack-galera-0"
Dec 06 05:57:55 crc kubenswrapper[4733]: I1206 05:57:55.130729 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2b2baf7-95ad-4ff0-a72d-9232137735b6-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"b2b2baf7-95ad-4ff0-a72d-9232137735b6\") " pod="openstack/openstack-galera-0"
Dec 06 05:57:55 crc kubenswrapper[4733]: I1206 05:57:55.130749 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4c68\" (UniqueName: \"kubernetes.io/projected/b2b2baf7-95ad-4ff0-a72d-9232137735b6-kube-api-access-c4c68\") pod \"openstack-galera-0\" (UID: \"b2b2baf7-95ad-4ff0-a72d-9232137735b6\") " pod="openstack/openstack-galera-0"
Dec 06 05:57:55 crc kubenswrapper[4733]: I1206 05:57:55.131950 4733 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"b2b2baf7-95ad-4ff0-a72d-9232137735b6\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/openstack-galera-0"
Dec 06 05:57:55 crc kubenswrapper[4733]: I1206 05:57:55.132278 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b2b2baf7-95ad-4ff0-a72d-9232137735b6-config-data-generated\") pod \"openstack-galera-0\" (UID: \"b2b2baf7-95ad-4ff0-a72d-9232137735b6\") " pod="openstack/openstack-galera-0"
Dec 06 05:57:55 crc kubenswrapper[4733]: I1206 05:57:55.133035 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b2b2baf7-95ad-4ff0-a72d-9232137735b6-kolla-config\") pod \"openstack-galera-0\" (UID: \"b2b2baf7-95ad-4ff0-a72d-9232137735b6\") " pod="openstack/openstack-galera-0"
Dec 06 05:57:55 crc kubenswrapper[4733]: I1206 05:57:55.134875 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2b2baf7-95ad-4ff0-a72d-9232137735b6-operator-scripts\") pod \"openstack-galera-0\" (UID: \"b2b2baf7-95ad-4ff0-a72d-9232137735b6\") " pod="openstack/openstack-galera-0"
Dec 06 05:57:55 crc kubenswrapper[4733]: I1206 05:57:55.136380 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b2b2baf7-95ad-4ff0-a72d-9232137735b6-config-data-default\") pod \"openstack-galera-0\" (UID: \"b2b2baf7-95ad-4ff0-a72d-9232137735b6\") " pod="openstack/openstack-galera-0"
Dec 06 05:57:55 crc kubenswrapper[4733]: I1206 05:57:55.145634 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2b2baf7-95ad-4ff0-a72d-9232137735b6-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"b2b2baf7-95ad-4ff0-a72d-9232137735b6\") " pod="openstack/openstack-galera-0"
Dec 06 05:57:55 crc kubenswrapper[4733]: I1206 05:57:55.147810 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4c68\" (UniqueName: \"kubernetes.io/projected/b2b2baf7-95ad-4ff0-a72d-9232137735b6-kube-api-access-c4c68\") pod \"openstack-galera-0\" (UID: \"b2b2baf7-95ad-4ff0-a72d-9232137735b6\") " pod="openstack/openstack-galera-0"
Dec 06 05:57:55 crc kubenswrapper[4733]: I1206 05:57:55.148939 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2b2baf7-95ad-4ff0-a72d-9232137735b6-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"b2b2baf7-95ad-4ff0-a72d-9232137735b6\") " pod="openstack/openstack-galera-0"
Dec 06 05:57:55 crc kubenswrapper[4733]: I1206 05:57:55.192227 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"b2b2baf7-95ad-4ff0-a72d-9232137735b6\") " pod="openstack/openstack-galera-0"
Dec 06 05:57:55 crc kubenswrapper[4733]: I1206 05:57:55.491830 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Dec 06 05:57:55 crc kubenswrapper[4733]: I1206 05:57:55.910261 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Dec 06 05:57:55 crc kubenswrapper[4733]: W1206 05:57:55.937459 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb2b2baf7_95ad_4ff0_a72d_9232137735b6.slice/crio-b67aa2dab7f5f4d027864fba5a5a2159755082306e52edec1058d87134fd7f08 WatchSource:0}: Error finding container b67aa2dab7f5f4d027864fba5a5a2159755082306e52edec1058d87134fd7f08: Status 404 returned error can't find the container with id b67aa2dab7f5f4d027864fba5a5a2159755082306e52edec1058d87134fd7f08
Dec 06 05:57:56 crc kubenswrapper[4733]: I1206 05:57:56.205479 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"]
Dec 06 05:57:56 crc kubenswrapper[4733]: I1206 05:57:56.209133 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Dec 06 05:57:56 crc kubenswrapper[4733]: I1206 05:57:56.210556 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data"
Dec 06 05:57:56 crc kubenswrapper[4733]: I1206 05:57:56.210840 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc"
Dec 06 05:57:56 crc kubenswrapper[4733]: I1206 05:57:56.211514 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-c6gc2"
Dec 06 05:57:56 crc kubenswrapper[4733]: I1206 05:57:56.214252 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts"
Dec 06 05:57:56 crc kubenswrapper[4733]: I1206 05:57:56.216946 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Dec 06 05:57:56 crc kubenswrapper[4733]: I1206 05:57:56.356315 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3de44369-4819-44c5-a1e5-3ea10b61cf0c-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"3de44369-4819-44c5-a1e5-3ea10b61cf0c\") " pod="openstack/openstack-cell1-galera-0"
Dec 06 05:57:56 crc kubenswrapper[4733]: I1206 05:57:56.356444 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3de44369-4819-44c5-a1e5-3ea10b61cf0c-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"3de44369-4819-44c5-a1e5-3ea10b61cf0c\") " pod="openstack/openstack-cell1-galera-0"
Dec 06 05:57:56 crc kubenswrapper[4733]: I1206 05:57:56.356516 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3de44369-4819-44c5-a1e5-3ea10b61cf0c-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"3de44369-4819-44c5-a1e5-3ea10b61cf0c\") " pod="openstack/openstack-cell1-galera-0"
Dec 06 05:57:56 crc kubenswrapper[4733]: I1206 05:57:56.356544 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"3de44369-4819-44c5-a1e5-3ea10b61cf0c\") " pod="openstack/openstack-cell1-galera-0"
Dec 06 05:57:56 crc kubenswrapper[4733]: I1206 05:57:56.356573 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3de44369-4819-44c5-a1e5-3ea10b61cf0c-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"3de44369-4819-44c5-a1e5-3ea10b61cf0c\") " pod="openstack/openstack-cell1-galera-0"
Dec 06 05:57:56 crc kubenswrapper[4733]: I1206 05:57:56.356643 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3de44369-4819-44c5-a1e5-3ea10b61cf0c-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"3de44369-4819-44c5-a1e5-3ea10b61cf0c\") " pod="openstack/openstack-cell1-galera-0"
Dec 06 05:57:56 crc kubenswrapper[4733]: I1206 05:57:56.356757 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phs7b\" (UniqueName: \"kubernetes.io/projected/3de44369-4819-44c5-a1e5-3ea10b61cf0c-kube-api-access-phs7b\") pod \"openstack-cell1-galera-0\" (UID: \"3de44369-4819-44c5-a1e5-3ea10b61cf0c\") " pod="openstack/openstack-cell1-galera-0"
Dec 06 05:57:56 crc kubenswrapper[4733]: I1206 05:57:56.356872 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3de44369-4819-44c5-a1e5-3ea10b61cf0c-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"3de44369-4819-44c5-a1e5-3ea10b61cf0c\") " pod="openstack/openstack-cell1-galera-0"
Dec 06 05:57:56 crc kubenswrapper[4733]: I1206 05:57:56.464929 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3de44369-4819-44c5-a1e5-3ea10b61cf0c-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"3de44369-4819-44c5-a1e5-3ea10b61cf0c\") " pod="openstack/openstack-cell1-galera-0"
Dec 06 05:57:56 crc kubenswrapper[4733]: I1206 05:57:56.465004 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3de44369-4819-44c5-a1e5-3ea10b61cf0c-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"3de44369-4819-44c5-a1e5-3ea10b61cf0c\") " pod="openstack/openstack-cell1-galera-0"
Dec 06 05:57:56 crc kubenswrapper[4733]: I1206 05:57:56.465041 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3de44369-4819-44c5-a1e5-3ea10b61cf0c-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"3de44369-4819-44c5-a1e5-3ea10b61cf0c\") " pod="openstack/openstack-cell1-galera-0"
Dec 06 05:57:56 crc kubenswrapper[4733]: I1206 05:57:56.465080 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3de44369-4819-44c5-a1e5-3ea10b61cf0c-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"3de44369-4819-44c5-a1e5-3ea10b61cf0c\") " pod="openstack/openstack-cell1-galera-0"
Dec 06 05:57:56 crc kubenswrapper[4733]: I1206 05:57:56.465103 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"3de44369-4819-44c5-a1e5-3ea10b61cf0c\") " pod="openstack/openstack-cell1-galera-0"
Dec 06 05:57:56 crc kubenswrapper[4733]: I1206 05:57:56.465132 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3de44369-4819-44c5-a1e5-3ea10b61cf0c-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"3de44369-4819-44c5-a1e5-3ea10b61cf0c\") " pod="openstack/openstack-cell1-galera-0"
Dec 06 05:57:56 crc kubenswrapper[4733]: I1206 05:57:56.465177 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3de44369-4819-44c5-a1e5-3ea10b61cf0c-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"3de44369-4819-44c5-a1e5-3ea10b61cf0c\") " pod="openstack/openstack-cell1-galera-0"
Dec 06 05:57:56 crc kubenswrapper[4733]: I1206 05:57:56.465248 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phs7b\" (UniqueName: \"kubernetes.io/projected/3de44369-4819-44c5-a1e5-3ea10b61cf0c-kube-api-access-phs7b\") pod \"openstack-cell1-galera-0\" (UID: \"3de44369-4819-44c5-a1e5-3ea10b61cf0c\") " pod="openstack/openstack-cell1-galera-0"
Dec 06 05:57:56 crc kubenswrapper[4733]: I1206 05:57:56.465431 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3de44369-4819-44c5-a1e5-3ea10b61cf0c-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"3de44369-4819-44c5-a1e5-3ea10b61cf0c\") " pod="openstack/openstack-cell1-galera-0"
Dec 06 05:57:56 crc kubenswrapper[4733]: I1206 05:57:56.467670 4733 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"3de44369-4819-44c5-a1e5-3ea10b61cf0c\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/openstack-cell1-galera-0"
Dec 06 05:57:56 crc kubenswrapper[4733]: I1206 05:57:56.468179 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3de44369-4819-44c5-a1e5-3ea10b61cf0c-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"3de44369-4819-44c5-a1e5-3ea10b61cf0c\") " pod="openstack/openstack-cell1-galera-0"
Dec 06 05:57:56 crc kubenswrapper[4733]: I1206 05:57:56.469094 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3de44369-4819-44c5-a1e5-3ea10b61cf0c-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"3de44369-4819-44c5-a1e5-3ea10b61cf0c\") " pod="openstack/openstack-cell1-galera-0"
Dec 06 05:57:56 crc kubenswrapper[4733]: I1206 05:57:56.472137 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3de44369-4819-44c5-a1e5-3ea10b61cf0c-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"3de44369-4819-44c5-a1e5-3ea10b61cf0c\") " pod="openstack/openstack-cell1-galera-0"
Dec 06 05:57:56 crc kubenswrapper[4733]: I1206 05:57:56.484806 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"]
Dec 06 05:57:56 crc kubenswrapper[4733]: I1206 05:57:56.495894 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3de44369-4819-44c5-a1e5-3ea10b61cf0c-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"3de44369-4819-44c5-a1e5-3ea10b61cf0c\") " pod="openstack/openstack-cell1-galera-0"
Dec 06 05:57:56 crc kubenswrapper[4733]: I1206 05:57:56.496946 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3de44369-4819-44c5-a1e5-3ea10b61cf0c-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"3de44369-4819-44c5-a1e5-3ea10b61cf0c\") " pod="openstack/openstack-cell1-galera-0"
Dec 06 05:57:56 crc kubenswrapper[4733]: I1206 05:57:56.499189 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"3de44369-4819-44c5-a1e5-3ea10b61cf0c\") " pod="openstack/openstack-cell1-galera-0"
Dec 06 05:57:56 crc kubenswrapper[4733]: I1206 05:57:56.504506 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Dec 06 05:57:56 crc kubenswrapper[4733]: I1206 05:57:56.505207 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phs7b\" (UniqueName: \"kubernetes.io/projected/3de44369-4819-44c5-a1e5-3ea10b61cf0c-kube-api-access-phs7b\") pod \"openstack-cell1-galera-0\" (UID: \"3de44369-4819-44c5-a1e5-3ea10b61cf0c\") " pod="openstack/openstack-cell1-galera-0"
Dec 06 05:57:56 crc kubenswrapper[4733]: I1206 05:57:56.509264 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc"
Dec 06 05:57:56 crc kubenswrapper[4733]: I1206 05:57:56.509285 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-6p82k"
Dec 06 05:57:56 crc kubenswrapper[4733]: I1206 05:57:56.509413 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data"
Dec 06 05:57:56 crc kubenswrapper[4733]: I1206 05:57:56.534371 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Dec 06 05:57:56 crc kubenswrapper[4733]: I1206 05:57:56.553214 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Dec 06 05:57:56 crc kubenswrapper[4733]: I1206 05:57:56.679490 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/56e3883e-d7a5-4735-aee1-9dbb5423c0fe-memcached-tls-certs\") pod \"memcached-0\" (UID: \"56e3883e-d7a5-4735-aee1-9dbb5423c0fe\") " pod="openstack/memcached-0"
Dec 06 05:57:56 crc kubenswrapper[4733]: I1206 05:57:56.679553 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56e3883e-d7a5-4735-aee1-9dbb5423c0fe-config-data\") pod \"memcached-0\" (UID: \"56e3883e-d7a5-4735-aee1-9dbb5423c0fe\") " pod="openstack/memcached-0"
Dec 06 05:57:56 crc kubenswrapper[4733]: I1206 05:57:56.679632 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/56e3883e-d7a5-4735-aee1-9dbb5423c0fe-kolla-config\") pod \"memcached-0\" (UID: \"56e3883e-d7a5-4735-aee1-9dbb5423c0fe\") " pod="openstack/memcached-0"
Dec 06 05:57:56 crc kubenswrapper[4733]: I1206 05:57:56.679648 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56e3883e-d7a5-4735-aee1-9dbb5423c0fe-combined-ca-bundle\") pod \"memcached-0\" (UID: \"56e3883e-d7a5-4735-aee1-9dbb5423c0fe\") " pod="openstack/memcached-0"
Dec 06 05:57:56 crc kubenswrapper[4733]: I1206 05:57:56.679816 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxp2d\" (UniqueName: \"kubernetes.io/projected/56e3883e-d7a5-4735-aee1-9dbb5423c0fe-kube-api-access-wxp2d\") pod \"memcached-0\" (UID: \"56e3883e-d7a5-4735-aee1-9dbb5423c0fe\") " pod="openstack/memcached-0"
Dec 06 05:57:56 crc kubenswrapper[4733]: I1206 05:57:56.745675 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b2b2baf7-95ad-4ff0-a72d-9232137735b6","Type":"ContainerStarted","Data":"b67aa2dab7f5f4d027864fba5a5a2159755082306e52edec1058d87134fd7f08"}
Dec 06 05:57:56 crc kubenswrapper[4733]: I1206 05:57:56.781085 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/56e3883e-d7a5-4735-aee1-9dbb5423c0fe-kolla-config\") pod \"memcached-0\" (UID: \"56e3883e-d7a5-4735-aee1-9dbb5423c0fe\") " pod="openstack/memcached-0"
Dec 06 05:57:56 crc kubenswrapper[4733]: I1206 05:57:56.781119 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56e3883e-d7a5-4735-aee1-9dbb5423c0fe-combined-ca-bundle\") pod \"memcached-0\" (UID: \"56e3883e-d7a5-4735-aee1-9dbb5423c0fe\") " pod="openstack/memcached-0"
Dec 06 05:57:56 crc kubenswrapper[4733]: I1206 05:57:56.781184 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxp2d\" (UniqueName: \"kubernetes.io/projected/56e3883e-d7a5-4735-aee1-9dbb5423c0fe-kube-api-access-wxp2d\") pod \"memcached-0\" (UID: \"56e3883e-d7a5-4735-aee1-9dbb5423c0fe\") " pod="openstack/memcached-0"
Dec 06 05:57:56 crc kubenswrapper[4733]: I1206 05:57:56.781220 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/56e3883e-d7a5-4735-aee1-9dbb5423c0fe-memcached-tls-certs\") pod \"memcached-0\" (UID: \"56e3883e-d7a5-4735-aee1-9dbb5423c0fe\") " pod="openstack/memcached-0"
Dec 06 05:57:56 crc kubenswrapper[4733]: I1206 05:57:56.781248 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56e3883e-d7a5-4735-aee1-9dbb5423c0fe-config-data\") pod \"memcached-0\" (UID: \"56e3883e-d7a5-4735-aee1-9dbb5423c0fe\") " pod="openstack/memcached-0"
Dec 06 05:57:56 crc kubenswrapper[4733]: I1206 05:57:56.781978 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/56e3883e-d7a5-4735-aee1-9dbb5423c0fe-kolla-config\") pod \"memcached-0\" (UID: \"56e3883e-d7a5-4735-aee1-9dbb5423c0fe\") " pod="openstack/memcached-0"
Dec 06 05:57:56 crc kubenswrapper[4733]: I1206 05:57:56.781992 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56e3883e-d7a5-4735-aee1-9dbb5423c0fe-config-data\") pod \"memcached-0\" (UID: \"56e3883e-d7a5-4735-aee1-9dbb5423c0fe\") " pod="openstack/memcached-0"
Dec 06 05:57:56 crc kubenswrapper[4733]: I1206 05:57:56.790454 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56e3883e-d7a5-4735-aee1-9dbb5423c0fe-combined-ca-bundle\") pod \"memcached-0\" (UID: \"56e3883e-d7a5-4735-aee1-9dbb5423c0fe\") " pod="openstack/memcached-0"
Dec 06 05:57:56 crc kubenswrapper[4733]: I1206 05:57:56.790452 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/56e3883e-d7a5-4735-aee1-9dbb5423c0fe-memcached-tls-certs\") pod \"memcached-0\" (UID: \"56e3883e-d7a5-4735-aee1-9dbb5423c0fe\") " pod="openstack/memcached-0"
Dec 06 05:57:56 crc kubenswrapper[4733]: I1206 05:57:56.797100 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxp2d\" (UniqueName: \"kubernetes.io/projected/56e3883e-d7a5-4735-aee1-9dbb5423c0fe-kube-api-access-wxp2d\") pod \"memcached-0\" (UID: \"56e3883e-d7a5-4735-aee1-9dbb5423c0fe\") " pod="openstack/memcached-0"
Dec 06 05:57:56 crc kubenswrapper[4733]: I1206 05:57:56.873930 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Dec 06 05:57:56 crc kubenswrapper[4733]: I1206 05:57:56.915904 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Dec 06 05:57:56 crc kubenswrapper[4733]: W1206 05:57:56.933927 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3de44369_4819_44c5_a1e5_3ea10b61cf0c.slice/crio-7ff9a1d1f815cf6949166abe9ce6ca32e45695d2ff01a52e321552318a40f3b5 WatchSource:0}: Error finding container 7ff9a1d1f815cf6949166abe9ce6ca32e45695d2ff01a52e321552318a40f3b5: Status 404 returned error can't find the container with id 7ff9a1d1f815cf6949166abe9ce6ca32e45695d2ff01a52e321552318a40f3b5
Dec 06 05:57:57 crc kubenswrapper[4733]: I1206 05:57:57.345565 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Dec 06 05:57:57 crc kubenswrapper[4733]: I1206 05:57:57.756456 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"56e3883e-d7a5-4735-aee1-9dbb5423c0fe","Type":"ContainerStarted","Data":"2efbf871671f1859ba33fee7099d88a33bfcd61080d08c628b481a91bb95d794"}
Dec 06 05:57:57 crc kubenswrapper[4733]: I1206 05:57:57.758565 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"3de44369-4819-44c5-a1e5-3ea10b61cf0c","Type":"ContainerStarted","Data":"7ff9a1d1f815cf6949166abe9ce6ca32e45695d2ff01a52e321552318a40f3b5"}
Dec 06 05:57:58 crc kubenswrapper[4733]: I1206 05:57:58.151717 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Dec 06 05:57:58 crc kubenswrapper[4733]: I1206 05:57:58.153009 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 06 05:57:58 crc kubenswrapper[4733]: I1206 05:57:58.157221 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 06 05:57:58 crc kubenswrapper[4733]: I1206 05:57:58.157527 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-vcspf" Dec 06 05:57:58 crc kubenswrapper[4733]: I1206 05:57:58.212184 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2c9c\" (UniqueName: \"kubernetes.io/projected/c00912c1-c2a3-44a7-a71e-e1e123680351-kube-api-access-c2c9c\") pod \"kube-state-metrics-0\" (UID: \"c00912c1-c2a3-44a7-a71e-e1e123680351\") " pod="openstack/kube-state-metrics-0" Dec 06 05:57:58 crc kubenswrapper[4733]: I1206 05:57:58.315051 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2c9c\" (UniqueName: \"kubernetes.io/projected/c00912c1-c2a3-44a7-a71e-e1e123680351-kube-api-access-c2c9c\") pod \"kube-state-metrics-0\" (UID: \"c00912c1-c2a3-44a7-a71e-e1e123680351\") " pod="openstack/kube-state-metrics-0" Dec 06 05:57:58 crc kubenswrapper[4733]: I1206 05:57:58.352131 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2c9c\" (UniqueName: \"kubernetes.io/projected/c00912c1-c2a3-44a7-a71e-e1e123680351-kube-api-access-c2c9c\") pod \"kube-state-metrics-0\" (UID: \"c00912c1-c2a3-44a7-a71e-e1e123680351\") " pod="openstack/kube-state-metrics-0" Dec 06 05:57:58 crc kubenswrapper[4733]: I1206 05:57:58.480582 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 06 05:57:58 crc kubenswrapper[4733]: I1206 05:57:58.943511 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 06 05:57:59 crc kubenswrapper[4733]: I1206 05:57:59.780993 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c00912c1-c2a3-44a7-a71e-e1e123680351","Type":"ContainerStarted","Data":"ab2e1643e64ed15266e7eb48172bb1e5544d017fd7343fd850584cd5c17e4ef8"} Dec 06 05:58:01 crc kubenswrapper[4733]: I1206 05:58:01.800987 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c00912c1-c2a3-44a7-a71e-e1e123680351","Type":"ContainerStarted","Data":"ee46518bb06c1465e25ddbb74b813525900c3989fbe6edbdf6f8ab0f4104f3de"} Dec 06 05:58:01 crc kubenswrapper[4733]: I1206 05:58:01.801662 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 06 05:58:01 crc kubenswrapper[4733]: I1206 05:58:01.817696 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-2ztw7"] Dec 06 05:58:01 crc kubenswrapper[4733]: I1206 05:58:01.823210 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-2ztw7" Dec 06 05:58:01 crc kubenswrapper[4733]: I1206 05:58:01.827902 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-jdbnf" Dec 06 05:58:01 crc kubenswrapper[4733]: I1206 05:58:01.828091 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Dec 06 05:58:01 crc kubenswrapper[4733]: I1206 05:58:01.829427 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Dec 06 05:58:01 crc kubenswrapper[4733]: I1206 05:58:01.839537 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-2ztw7"] Dec 06 05:58:01 crc kubenswrapper[4733]: I1206 05:58:01.844109 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.764412685 podStartE2EDuration="3.844086461s" podCreationTimestamp="2025-12-06 05:57:58 +0000 UTC" firstStartedPulling="2025-12-06 05:57:59.001624001 +0000 UTC m=+862.866835112" lastFinishedPulling="2025-12-06 05:58:01.081297777 +0000 UTC m=+864.946508888" observedRunningTime="2025-12-06 05:58:01.824808791 +0000 UTC m=+865.690019903" watchObservedRunningTime="2025-12-06 05:58:01.844086461 +0000 UTC m=+865.709297572" Dec 06 05:58:01 crc kubenswrapper[4733]: I1206 05:58:01.853251 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-4wzzg"] Dec 06 05:58:01 crc kubenswrapper[4733]: I1206 05:58:01.856104 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-4wzzg" Dec 06 05:58:01 crc kubenswrapper[4733]: I1206 05:58:01.863135 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-4wzzg"] Dec 06 05:58:01 crc kubenswrapper[4733]: I1206 05:58:01.986171 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/008ba5cf-a311-414d-9d06-a8ad4c038088-var-lib\") pod \"ovn-controller-ovs-4wzzg\" (UID: \"008ba5cf-a311-414d-9d06-a8ad4c038088\") " pod="openstack/ovn-controller-ovs-4wzzg" Dec 06 05:58:01 crc kubenswrapper[4733]: I1206 05:58:01.986234 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/5589595d-741e-424a-955a-6fc8b83c18c1-ovn-controller-tls-certs\") pod \"ovn-controller-2ztw7\" (UID: \"5589595d-741e-424a-955a-6fc8b83c18c1\") " pod="openstack/ovn-controller-2ztw7" Dec 06 05:58:01 crc kubenswrapper[4733]: I1206 05:58:01.986272 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5589595d-741e-424a-955a-6fc8b83c18c1-combined-ca-bundle\") pod \"ovn-controller-2ztw7\" (UID: \"5589595d-741e-424a-955a-6fc8b83c18c1\") " pod="openstack/ovn-controller-2ztw7" Dec 06 05:58:01 crc kubenswrapper[4733]: I1206 05:58:01.986449 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5589595d-741e-424a-955a-6fc8b83c18c1-var-log-ovn\") pod \"ovn-controller-2ztw7\" (UID: \"5589595d-741e-424a-955a-6fc8b83c18c1\") " pod="openstack/ovn-controller-2ztw7" Dec 06 05:58:01 crc kubenswrapper[4733]: I1206 05:58:01.986553 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: 
\"kubernetes.io/host-path/008ba5cf-a311-414d-9d06-a8ad4c038088-etc-ovs\") pod \"ovn-controller-ovs-4wzzg\" (UID: \"008ba5cf-a311-414d-9d06-a8ad4c038088\") " pod="openstack/ovn-controller-ovs-4wzzg" Dec 06 05:58:01 crc kubenswrapper[4733]: I1206 05:58:01.986632 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5589595d-741e-424a-955a-6fc8b83c18c1-var-run-ovn\") pod \"ovn-controller-2ztw7\" (UID: \"5589595d-741e-424a-955a-6fc8b83c18c1\") " pod="openstack/ovn-controller-2ztw7" Dec 06 05:58:01 crc kubenswrapper[4733]: I1206 05:58:01.986848 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5589595d-741e-424a-955a-6fc8b83c18c1-scripts\") pod \"ovn-controller-2ztw7\" (UID: \"5589595d-741e-424a-955a-6fc8b83c18c1\") " pod="openstack/ovn-controller-2ztw7" Dec 06 05:58:01 crc kubenswrapper[4733]: I1206 05:58:01.986895 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gw7mj\" (UniqueName: \"kubernetes.io/projected/008ba5cf-a311-414d-9d06-a8ad4c038088-kube-api-access-gw7mj\") pod \"ovn-controller-ovs-4wzzg\" (UID: \"008ba5cf-a311-414d-9d06-a8ad4c038088\") " pod="openstack/ovn-controller-ovs-4wzzg" Dec 06 05:58:01 crc kubenswrapper[4733]: I1206 05:58:01.986949 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5589595d-741e-424a-955a-6fc8b83c18c1-var-run\") pod \"ovn-controller-2ztw7\" (UID: \"5589595d-741e-424a-955a-6fc8b83c18c1\") " pod="openstack/ovn-controller-2ztw7" Dec 06 05:58:01 crc kubenswrapper[4733]: I1206 05:58:01.987030 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/008ba5cf-a311-414d-9d06-a8ad4c038088-var-run\") pod \"ovn-controller-ovs-4wzzg\" (UID: \"008ba5cf-a311-414d-9d06-a8ad4c038088\") " pod="openstack/ovn-controller-ovs-4wzzg" Dec 06 05:58:01 crc kubenswrapper[4733]: I1206 05:58:01.987083 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/008ba5cf-a311-414d-9d06-a8ad4c038088-scripts\") pod \"ovn-controller-ovs-4wzzg\" (UID: \"008ba5cf-a311-414d-9d06-a8ad4c038088\") " pod="openstack/ovn-controller-ovs-4wzzg" Dec 06 05:58:01 crc kubenswrapper[4733]: I1206 05:58:01.987167 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/008ba5cf-a311-414d-9d06-a8ad4c038088-var-log\") pod \"ovn-controller-ovs-4wzzg\" (UID: \"008ba5cf-a311-414d-9d06-a8ad4c038088\") " pod="openstack/ovn-controller-ovs-4wzzg" Dec 06 05:58:01 crc kubenswrapper[4733]: I1206 05:58:01.987224 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28t7x\" (UniqueName: \"kubernetes.io/projected/5589595d-741e-424a-955a-6fc8b83c18c1-kube-api-access-28t7x\") pod \"ovn-controller-2ztw7\" (UID: \"5589595d-741e-424a-955a-6fc8b83c18c1\") " pod="openstack/ovn-controller-2ztw7" Dec 06 05:58:02 crc kubenswrapper[4733]: I1206 05:58:02.089495 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/008ba5cf-a311-414d-9d06-a8ad4c038088-etc-ovs\") pod \"ovn-controller-ovs-4wzzg\" (UID: \"008ba5cf-a311-414d-9d06-a8ad4c038088\") " pod="openstack/ovn-controller-ovs-4wzzg" Dec 06 05:58:02 crc kubenswrapper[4733]: I1206 05:58:02.090042 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5589595d-741e-424a-955a-6fc8b83c18c1-var-run-ovn\") pod 
\"ovn-controller-2ztw7\" (UID: \"5589595d-741e-424a-955a-6fc8b83c18c1\") " pod="openstack/ovn-controller-2ztw7" Dec 06 05:58:02 crc kubenswrapper[4733]: I1206 05:58:02.089980 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/008ba5cf-a311-414d-9d06-a8ad4c038088-etc-ovs\") pod \"ovn-controller-ovs-4wzzg\" (UID: \"008ba5cf-a311-414d-9d06-a8ad4c038088\") " pod="openstack/ovn-controller-ovs-4wzzg" Dec 06 05:58:02 crc kubenswrapper[4733]: I1206 05:58:02.090235 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5589595d-741e-424a-955a-6fc8b83c18c1-var-run-ovn\") pod \"ovn-controller-2ztw7\" (UID: \"5589595d-741e-424a-955a-6fc8b83c18c1\") " pod="openstack/ovn-controller-2ztw7" Dec 06 05:58:02 crc kubenswrapper[4733]: I1206 05:58:02.090372 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5589595d-741e-424a-955a-6fc8b83c18c1-scripts\") pod \"ovn-controller-2ztw7\" (UID: \"5589595d-741e-424a-955a-6fc8b83c18c1\") " pod="openstack/ovn-controller-2ztw7" Dec 06 05:58:02 crc kubenswrapper[4733]: I1206 05:58:02.090411 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gw7mj\" (UniqueName: \"kubernetes.io/projected/008ba5cf-a311-414d-9d06-a8ad4c038088-kube-api-access-gw7mj\") pod \"ovn-controller-ovs-4wzzg\" (UID: \"008ba5cf-a311-414d-9d06-a8ad4c038088\") " pod="openstack/ovn-controller-ovs-4wzzg" Dec 06 05:58:02 crc kubenswrapper[4733]: I1206 05:58:02.090430 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5589595d-741e-424a-955a-6fc8b83c18c1-var-run\") pod \"ovn-controller-2ztw7\" (UID: \"5589595d-741e-424a-955a-6fc8b83c18c1\") " pod="openstack/ovn-controller-2ztw7" Dec 06 05:58:02 crc kubenswrapper[4733]: I1206 
05:58:02.090455 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/008ba5cf-a311-414d-9d06-a8ad4c038088-var-run\") pod \"ovn-controller-ovs-4wzzg\" (UID: \"008ba5cf-a311-414d-9d06-a8ad4c038088\") " pod="openstack/ovn-controller-ovs-4wzzg" Dec 06 05:58:02 crc kubenswrapper[4733]: I1206 05:58:02.090558 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5589595d-741e-424a-955a-6fc8b83c18c1-var-run\") pod \"ovn-controller-2ztw7\" (UID: \"5589595d-741e-424a-955a-6fc8b83c18c1\") " pod="openstack/ovn-controller-2ztw7" Dec 06 05:58:02 crc kubenswrapper[4733]: I1206 05:58:02.090592 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/008ba5cf-a311-414d-9d06-a8ad4c038088-scripts\") pod \"ovn-controller-ovs-4wzzg\" (UID: \"008ba5cf-a311-414d-9d06-a8ad4c038088\") " pod="openstack/ovn-controller-ovs-4wzzg" Dec 06 05:58:02 crc kubenswrapper[4733]: I1206 05:58:02.090612 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/008ba5cf-a311-414d-9d06-a8ad4c038088-var-log\") pod \"ovn-controller-ovs-4wzzg\" (UID: \"008ba5cf-a311-414d-9d06-a8ad4c038088\") " pod="openstack/ovn-controller-ovs-4wzzg" Dec 06 05:58:02 crc kubenswrapper[4733]: I1206 05:58:02.090607 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/008ba5cf-a311-414d-9d06-a8ad4c038088-var-run\") pod \"ovn-controller-ovs-4wzzg\" (UID: \"008ba5cf-a311-414d-9d06-a8ad4c038088\") " pod="openstack/ovn-controller-ovs-4wzzg" Dec 06 05:58:02 crc kubenswrapper[4733]: I1206 05:58:02.090762 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/008ba5cf-a311-414d-9d06-a8ad4c038088-var-log\") 
pod \"ovn-controller-ovs-4wzzg\" (UID: \"008ba5cf-a311-414d-9d06-a8ad4c038088\") " pod="openstack/ovn-controller-ovs-4wzzg" Dec 06 05:58:02 crc kubenswrapper[4733]: I1206 05:58:02.090811 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28t7x\" (UniqueName: \"kubernetes.io/projected/5589595d-741e-424a-955a-6fc8b83c18c1-kube-api-access-28t7x\") pod \"ovn-controller-2ztw7\" (UID: \"5589595d-741e-424a-955a-6fc8b83c18c1\") " pod="openstack/ovn-controller-2ztw7" Dec 06 05:58:02 crc kubenswrapper[4733]: I1206 05:58:02.091028 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/008ba5cf-a311-414d-9d06-a8ad4c038088-var-lib\") pod \"ovn-controller-ovs-4wzzg\" (UID: \"008ba5cf-a311-414d-9d06-a8ad4c038088\") " pod="openstack/ovn-controller-ovs-4wzzg" Dec 06 05:58:02 crc kubenswrapper[4733]: I1206 05:58:02.091087 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/5589595d-741e-424a-955a-6fc8b83c18c1-ovn-controller-tls-certs\") pod \"ovn-controller-2ztw7\" (UID: \"5589595d-741e-424a-955a-6fc8b83c18c1\") " pod="openstack/ovn-controller-2ztw7" Dec 06 05:58:02 crc kubenswrapper[4733]: I1206 05:58:02.091111 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5589595d-741e-424a-955a-6fc8b83c18c1-combined-ca-bundle\") pod \"ovn-controller-2ztw7\" (UID: \"5589595d-741e-424a-955a-6fc8b83c18c1\") " pod="openstack/ovn-controller-2ztw7" Dec 06 05:58:02 crc kubenswrapper[4733]: I1206 05:58:02.091192 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/008ba5cf-a311-414d-9d06-a8ad4c038088-var-lib\") pod \"ovn-controller-ovs-4wzzg\" (UID: \"008ba5cf-a311-414d-9d06-a8ad4c038088\") " 
pod="openstack/ovn-controller-ovs-4wzzg" Dec 06 05:58:02 crc kubenswrapper[4733]: I1206 05:58:02.091196 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5589595d-741e-424a-955a-6fc8b83c18c1-var-log-ovn\") pod \"ovn-controller-2ztw7\" (UID: \"5589595d-741e-424a-955a-6fc8b83c18c1\") " pod="openstack/ovn-controller-2ztw7" Dec 06 05:58:02 crc kubenswrapper[4733]: I1206 05:58:02.091277 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5589595d-741e-424a-955a-6fc8b83c18c1-var-log-ovn\") pod \"ovn-controller-2ztw7\" (UID: \"5589595d-741e-424a-955a-6fc8b83c18c1\") " pod="openstack/ovn-controller-2ztw7" Dec 06 05:58:02 crc kubenswrapper[4733]: I1206 05:58:02.093234 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5589595d-741e-424a-955a-6fc8b83c18c1-scripts\") pod \"ovn-controller-2ztw7\" (UID: \"5589595d-741e-424a-955a-6fc8b83c18c1\") " pod="openstack/ovn-controller-2ztw7" Dec 06 05:58:02 crc kubenswrapper[4733]: I1206 05:58:02.094904 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/008ba5cf-a311-414d-9d06-a8ad4c038088-scripts\") pod \"ovn-controller-ovs-4wzzg\" (UID: \"008ba5cf-a311-414d-9d06-a8ad4c038088\") " pod="openstack/ovn-controller-ovs-4wzzg" Dec 06 05:58:02 crc kubenswrapper[4733]: I1206 05:58:02.097975 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/5589595d-741e-424a-955a-6fc8b83c18c1-ovn-controller-tls-certs\") pod \"ovn-controller-2ztw7\" (UID: \"5589595d-741e-424a-955a-6fc8b83c18c1\") " pod="openstack/ovn-controller-2ztw7" Dec 06 05:58:02 crc kubenswrapper[4733]: I1206 05:58:02.098327 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5589595d-741e-424a-955a-6fc8b83c18c1-combined-ca-bundle\") pod \"ovn-controller-2ztw7\" (UID: \"5589595d-741e-424a-955a-6fc8b83c18c1\") " pod="openstack/ovn-controller-2ztw7" Dec 06 05:58:02 crc kubenswrapper[4733]: I1206 05:58:02.104563 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gw7mj\" (UniqueName: \"kubernetes.io/projected/008ba5cf-a311-414d-9d06-a8ad4c038088-kube-api-access-gw7mj\") pod \"ovn-controller-ovs-4wzzg\" (UID: \"008ba5cf-a311-414d-9d06-a8ad4c038088\") " pod="openstack/ovn-controller-ovs-4wzzg" Dec 06 05:58:02 crc kubenswrapper[4733]: I1206 05:58:02.110523 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28t7x\" (UniqueName: \"kubernetes.io/projected/5589595d-741e-424a-955a-6fc8b83c18c1-kube-api-access-28t7x\") pod \"ovn-controller-2ztw7\" (UID: \"5589595d-741e-424a-955a-6fc8b83c18c1\") " pod="openstack/ovn-controller-2ztw7" Dec 06 05:58:02 crc kubenswrapper[4733]: I1206 05:58:02.161324 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-2ztw7" Dec 06 05:58:02 crc kubenswrapper[4733]: I1206 05:58:02.177270 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-4wzzg" Dec 06 05:58:02 crc kubenswrapper[4733]: I1206 05:58:02.718191 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 06 05:58:02 crc kubenswrapper[4733]: I1206 05:58:02.722124 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 06 05:58:02 crc kubenswrapper[4733]: I1206 05:58:02.726035 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 06 05:58:02 crc kubenswrapper[4733]: I1206 05:58:02.726195 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Dec 06 05:58:02 crc kubenswrapper[4733]: I1206 05:58:02.727242 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 06 05:58:02 crc kubenswrapper[4733]: I1206 05:58:02.727765 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Dec 06 05:58:02 crc kubenswrapper[4733]: I1206 05:58:02.728064 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 06 05:58:02 crc kubenswrapper[4733]: I1206 05:58:02.730066 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-nrc8h" Dec 06 05:58:02 crc kubenswrapper[4733]: I1206 05:58:02.909614 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"b87d6517-a2ed-458a-9a0e-0945f837a232\") " pod="openstack/ovsdbserver-nb-0" Dec 06 05:58:02 crc kubenswrapper[4733]: I1206 05:58:02.909725 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b87d6517-a2ed-458a-9a0e-0945f837a232-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b87d6517-a2ed-458a-9a0e-0945f837a232\") " pod="openstack/ovsdbserver-nb-0" Dec 06 05:58:02 crc kubenswrapper[4733]: I1206 05:58:02.909767 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b87d6517-a2ed-458a-9a0e-0945f837a232-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"b87d6517-a2ed-458a-9a0e-0945f837a232\") " pod="openstack/ovsdbserver-nb-0" Dec 06 05:58:02 crc kubenswrapper[4733]: I1206 05:58:02.909795 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b87d6517-a2ed-458a-9a0e-0945f837a232-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"b87d6517-a2ed-458a-9a0e-0945f837a232\") " pod="openstack/ovsdbserver-nb-0" Dec 06 05:58:02 crc kubenswrapper[4733]: I1206 05:58:02.909818 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b87d6517-a2ed-458a-9a0e-0945f837a232-config\") pod \"ovsdbserver-nb-0\" (UID: \"b87d6517-a2ed-458a-9a0e-0945f837a232\") " pod="openstack/ovsdbserver-nb-0" Dec 06 05:58:02 crc kubenswrapper[4733]: I1206 05:58:02.909850 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b87d6517-a2ed-458a-9a0e-0945f837a232-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b87d6517-a2ed-458a-9a0e-0945f837a232\") " pod="openstack/ovsdbserver-nb-0" Dec 06 05:58:02 crc kubenswrapper[4733]: I1206 05:58:02.909995 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6d4x\" (UniqueName: \"kubernetes.io/projected/b87d6517-a2ed-458a-9a0e-0945f837a232-kube-api-access-h6d4x\") pod \"ovsdbserver-nb-0\" (UID: \"b87d6517-a2ed-458a-9a0e-0945f837a232\") " pod="openstack/ovsdbserver-nb-0" Dec 06 05:58:02 crc kubenswrapper[4733]: I1206 05:58:02.910034 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b87d6517-a2ed-458a-9a0e-0945f837a232-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"b87d6517-a2ed-458a-9a0e-0945f837a232\") " pod="openstack/ovsdbserver-nb-0" Dec 06 05:58:03 crc kubenswrapper[4733]: I1206 05:58:03.011880 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b87d6517-a2ed-458a-9a0e-0945f837a232-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"b87d6517-a2ed-458a-9a0e-0945f837a232\") " pod="openstack/ovsdbserver-nb-0" Dec 06 05:58:03 crc kubenswrapper[4733]: I1206 05:58:03.011934 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b87d6517-a2ed-458a-9a0e-0945f837a232-config\") pod \"ovsdbserver-nb-0\" (UID: \"b87d6517-a2ed-458a-9a0e-0945f837a232\") " pod="openstack/ovsdbserver-nb-0" Dec 06 05:58:03 crc kubenswrapper[4733]: I1206 05:58:03.011978 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b87d6517-a2ed-458a-9a0e-0945f837a232-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b87d6517-a2ed-458a-9a0e-0945f837a232\") " pod="openstack/ovsdbserver-nb-0" Dec 06 05:58:03 crc kubenswrapper[4733]: I1206 05:58:03.012006 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6d4x\" (UniqueName: \"kubernetes.io/projected/b87d6517-a2ed-458a-9a0e-0945f837a232-kube-api-access-h6d4x\") pod \"ovsdbserver-nb-0\" (UID: \"b87d6517-a2ed-458a-9a0e-0945f837a232\") " pod="openstack/ovsdbserver-nb-0" Dec 06 05:58:03 crc kubenswrapper[4733]: I1206 05:58:03.012039 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b87d6517-a2ed-458a-9a0e-0945f837a232-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"b87d6517-a2ed-458a-9a0e-0945f837a232\") " 
pod="openstack/ovsdbserver-nb-0" Dec 06 05:58:03 crc kubenswrapper[4733]: I1206 05:58:03.012083 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"b87d6517-a2ed-458a-9a0e-0945f837a232\") " pod="openstack/ovsdbserver-nb-0" Dec 06 05:58:03 crc kubenswrapper[4733]: I1206 05:58:03.012147 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b87d6517-a2ed-458a-9a0e-0945f837a232-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b87d6517-a2ed-458a-9a0e-0945f837a232\") " pod="openstack/ovsdbserver-nb-0" Dec 06 05:58:03 crc kubenswrapper[4733]: I1206 05:58:03.012189 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b87d6517-a2ed-458a-9a0e-0945f837a232-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"b87d6517-a2ed-458a-9a0e-0945f837a232\") " pod="openstack/ovsdbserver-nb-0" Dec 06 05:58:03 crc kubenswrapper[4733]: I1206 05:58:03.012654 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b87d6517-a2ed-458a-9a0e-0945f837a232-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"b87d6517-a2ed-458a-9a0e-0945f837a232\") " pod="openstack/ovsdbserver-nb-0" Dec 06 05:58:03 crc kubenswrapper[4733]: I1206 05:58:03.013566 4733 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"b87d6517-a2ed-458a-9a0e-0945f837a232\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/ovsdbserver-nb-0" Dec 06 05:58:03 crc kubenswrapper[4733]: I1206 05:58:03.014269 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/b87d6517-a2ed-458a-9a0e-0945f837a232-config\") pod \"ovsdbserver-nb-0\" (UID: \"b87d6517-a2ed-458a-9a0e-0945f837a232\") " pod="openstack/ovsdbserver-nb-0" Dec 06 05:58:03 crc kubenswrapper[4733]: I1206 05:58:03.014750 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b87d6517-a2ed-458a-9a0e-0945f837a232-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"b87d6517-a2ed-458a-9a0e-0945f837a232\") " pod="openstack/ovsdbserver-nb-0" Dec 06 05:58:03 crc kubenswrapper[4733]: I1206 05:58:03.017070 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b87d6517-a2ed-458a-9a0e-0945f837a232-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b87d6517-a2ed-458a-9a0e-0945f837a232\") " pod="openstack/ovsdbserver-nb-0" Dec 06 05:58:03 crc kubenswrapper[4733]: I1206 05:58:03.017942 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b87d6517-a2ed-458a-9a0e-0945f837a232-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"b87d6517-a2ed-458a-9a0e-0945f837a232\") " pod="openstack/ovsdbserver-nb-0" Dec 06 05:58:03 crc kubenswrapper[4733]: I1206 05:58:03.027578 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b87d6517-a2ed-458a-9a0e-0945f837a232-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b87d6517-a2ed-458a-9a0e-0945f837a232\") " pod="openstack/ovsdbserver-nb-0" Dec 06 05:58:03 crc kubenswrapper[4733]: I1206 05:58:03.033201 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6d4x\" (UniqueName: \"kubernetes.io/projected/b87d6517-a2ed-458a-9a0e-0945f837a232-kube-api-access-h6d4x\") pod \"ovsdbserver-nb-0\" (UID: \"b87d6517-a2ed-458a-9a0e-0945f837a232\") " 
pod="openstack/ovsdbserver-nb-0" Dec 06 05:58:03 crc kubenswrapper[4733]: I1206 05:58:03.034809 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"b87d6517-a2ed-458a-9a0e-0945f837a232\") " pod="openstack/ovsdbserver-nb-0" Dec 06 05:58:03 crc kubenswrapper[4733]: I1206 05:58:03.041141 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 06 05:58:05 crc kubenswrapper[4733]: I1206 05:58:05.576907 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 06 05:58:05 crc kubenswrapper[4733]: I1206 05:58:05.578783 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 06 05:58:05 crc kubenswrapper[4733]: I1206 05:58:05.580918 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-glxc7" Dec 06 05:58:05 crc kubenswrapper[4733]: I1206 05:58:05.582099 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Dec 06 05:58:05 crc kubenswrapper[4733]: I1206 05:58:05.582196 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 06 05:58:05 crc kubenswrapper[4733]: I1206 05:58:05.582255 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 06 05:58:05 crc kubenswrapper[4733]: I1206 05:58:05.597025 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 06 05:58:05 crc kubenswrapper[4733]: I1206 05:58:05.778948 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d0484be5-bcc0-4b5b-8aef-6c9573545b88-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d0484be5-bcc0-4b5b-8aef-6c9573545b88\") " pod="openstack/ovsdbserver-sb-0" Dec 06 05:58:05 crc kubenswrapper[4733]: I1206 05:58:05.779007 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0484be5-bcc0-4b5b-8aef-6c9573545b88-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"d0484be5-bcc0-4b5b-8aef-6c9573545b88\") " pod="openstack/ovsdbserver-sb-0" Dec 06 05:58:05 crc kubenswrapper[4733]: I1206 05:58:05.779240 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7l4kl\" (UniqueName: \"kubernetes.io/projected/d0484be5-bcc0-4b5b-8aef-6c9573545b88-kube-api-access-7l4kl\") pod \"ovsdbserver-sb-0\" (UID: \"d0484be5-bcc0-4b5b-8aef-6c9573545b88\") " pod="openstack/ovsdbserver-sb-0" Dec 06 05:58:05 crc kubenswrapper[4733]: I1206 05:58:05.779288 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"d0484be5-bcc0-4b5b-8aef-6c9573545b88\") " pod="openstack/ovsdbserver-sb-0" Dec 06 05:58:05 crc kubenswrapper[4733]: I1206 05:58:05.779555 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d0484be5-bcc0-4b5b-8aef-6c9573545b88-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"d0484be5-bcc0-4b5b-8aef-6c9573545b88\") " pod="openstack/ovsdbserver-sb-0" Dec 06 05:58:05 crc kubenswrapper[4733]: I1206 05:58:05.779674 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d0484be5-bcc0-4b5b-8aef-6c9573545b88-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d0484be5-bcc0-4b5b-8aef-6c9573545b88\") " pod="openstack/ovsdbserver-sb-0" Dec 06 05:58:05 crc kubenswrapper[4733]: I1206 05:58:05.780063 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0484be5-bcc0-4b5b-8aef-6c9573545b88-config\") pod \"ovsdbserver-sb-0\" (UID: \"d0484be5-bcc0-4b5b-8aef-6c9573545b88\") " pod="openstack/ovsdbserver-sb-0" Dec 06 05:58:05 crc kubenswrapper[4733]: I1206 05:58:05.780187 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d0484be5-bcc0-4b5b-8aef-6c9573545b88-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"d0484be5-bcc0-4b5b-8aef-6c9573545b88\") " pod="openstack/ovsdbserver-sb-0" Dec 06 05:58:05 crc kubenswrapper[4733]: I1206 05:58:05.881540 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0484be5-bcc0-4b5b-8aef-6c9573545b88-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d0484be5-bcc0-4b5b-8aef-6c9573545b88\") " pod="openstack/ovsdbserver-sb-0" Dec 06 05:58:05 crc kubenswrapper[4733]: I1206 05:58:05.881590 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0484be5-bcc0-4b5b-8aef-6c9573545b88-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"d0484be5-bcc0-4b5b-8aef-6c9573545b88\") " pod="openstack/ovsdbserver-sb-0" Dec 06 05:58:05 crc kubenswrapper[4733]: I1206 05:58:05.881661 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7l4kl\" (UniqueName: \"kubernetes.io/projected/d0484be5-bcc0-4b5b-8aef-6c9573545b88-kube-api-access-7l4kl\") pod \"ovsdbserver-sb-0\" (UID: 
\"d0484be5-bcc0-4b5b-8aef-6c9573545b88\") " pod="openstack/ovsdbserver-sb-0" Dec 06 05:58:05 crc kubenswrapper[4733]: I1206 05:58:05.881686 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"d0484be5-bcc0-4b5b-8aef-6c9573545b88\") " pod="openstack/ovsdbserver-sb-0" Dec 06 05:58:05 crc kubenswrapper[4733]: I1206 05:58:05.881780 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d0484be5-bcc0-4b5b-8aef-6c9573545b88-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"d0484be5-bcc0-4b5b-8aef-6c9573545b88\") " pod="openstack/ovsdbserver-sb-0" Dec 06 05:58:05 crc kubenswrapper[4733]: I1206 05:58:05.881830 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0484be5-bcc0-4b5b-8aef-6c9573545b88-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d0484be5-bcc0-4b5b-8aef-6c9573545b88\") " pod="openstack/ovsdbserver-sb-0" Dec 06 05:58:05 crc kubenswrapper[4733]: I1206 05:58:05.881891 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0484be5-bcc0-4b5b-8aef-6c9573545b88-config\") pod \"ovsdbserver-sb-0\" (UID: \"d0484be5-bcc0-4b5b-8aef-6c9573545b88\") " pod="openstack/ovsdbserver-sb-0" Dec 06 05:58:05 crc kubenswrapper[4733]: I1206 05:58:05.881928 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d0484be5-bcc0-4b5b-8aef-6c9573545b88-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"d0484be5-bcc0-4b5b-8aef-6c9573545b88\") " pod="openstack/ovsdbserver-sb-0" Dec 06 05:58:05 crc kubenswrapper[4733]: I1206 05:58:05.882601 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d0484be5-bcc0-4b5b-8aef-6c9573545b88-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"d0484be5-bcc0-4b5b-8aef-6c9573545b88\") " pod="openstack/ovsdbserver-sb-0" Dec 06 05:58:05 crc kubenswrapper[4733]: I1206 05:58:05.883145 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d0484be5-bcc0-4b5b-8aef-6c9573545b88-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"d0484be5-bcc0-4b5b-8aef-6c9573545b88\") " pod="openstack/ovsdbserver-sb-0" Dec 06 05:58:05 crc kubenswrapper[4733]: I1206 05:58:05.883296 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0484be5-bcc0-4b5b-8aef-6c9573545b88-config\") pod \"ovsdbserver-sb-0\" (UID: \"d0484be5-bcc0-4b5b-8aef-6c9573545b88\") " pod="openstack/ovsdbserver-sb-0" Dec 06 05:58:05 crc kubenswrapper[4733]: I1206 05:58:05.883346 4733 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"d0484be5-bcc0-4b5b-8aef-6c9573545b88\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/ovsdbserver-sb-0" Dec 06 05:58:05 crc kubenswrapper[4733]: I1206 05:58:05.893314 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0484be5-bcc0-4b5b-8aef-6c9573545b88-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d0484be5-bcc0-4b5b-8aef-6c9573545b88\") " pod="openstack/ovsdbserver-sb-0" Dec 06 05:58:05 crc kubenswrapper[4733]: I1206 05:58:05.893691 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0484be5-bcc0-4b5b-8aef-6c9573545b88-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"d0484be5-bcc0-4b5b-8aef-6c9573545b88\") " 
pod="openstack/ovsdbserver-sb-0" Dec 06 05:58:05 crc kubenswrapper[4733]: I1206 05:58:05.895356 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0484be5-bcc0-4b5b-8aef-6c9573545b88-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d0484be5-bcc0-4b5b-8aef-6c9573545b88\") " pod="openstack/ovsdbserver-sb-0" Dec 06 05:58:05 crc kubenswrapper[4733]: I1206 05:58:05.908074 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7l4kl\" (UniqueName: \"kubernetes.io/projected/d0484be5-bcc0-4b5b-8aef-6c9573545b88-kube-api-access-7l4kl\") pod \"ovsdbserver-sb-0\" (UID: \"d0484be5-bcc0-4b5b-8aef-6c9573545b88\") " pod="openstack/ovsdbserver-sb-0" Dec 06 05:58:05 crc kubenswrapper[4733]: I1206 05:58:05.911865 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"d0484be5-bcc0-4b5b-8aef-6c9573545b88\") " pod="openstack/ovsdbserver-sb-0" Dec 06 05:58:06 crc kubenswrapper[4733]: I1206 05:58:06.202408 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 06 05:58:08 crc kubenswrapper[4733]: I1206 05:58:08.497498 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 06 05:58:12 crc kubenswrapper[4733]: I1206 05:58:12.893698 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 06 05:58:13 crc kubenswrapper[4733]: E1206 05:58:13.338540 4733 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos9/openstack-neutron-server:2e38c527ddf6e767040136ecf014e7b9" Dec 06 05:58:13 crc kubenswrapper[4733]: E1206 05:58:13.339034 4733 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos9/openstack-neutron-server:2e38c527ddf6e767040136ecf014e7b9" Dec 06 05:58:13 crc kubenswrapper[4733]: E1206 05:58:13.340389 4733 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.rdoproject.org/podified-master-centos9/openstack-neutron-server:2e38c527ddf6e767040136ecf014e7b9,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mhsz8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-94b4f9f45-frp6w_openstack(2fe174a3-4d0f-423e-8be5-8dd6ce0c0a99): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 06 05:58:13 crc kubenswrapper[4733]: E1206 05:58:13.341724 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-94b4f9f45-frp6w" podUID="2fe174a3-4d0f-423e-8be5-8dd6ce0c0a99" Dec 06 05:58:14 crc kubenswrapper[4733]: I1206 05:58:14.459121 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-94b4f9f45-frp6w" Dec 06 05:58:14 crc kubenswrapper[4733]: I1206 05:58:14.646373 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fe174a3-4d0f-423e-8be5-8dd6ce0c0a99-config\") pod \"2fe174a3-4d0f-423e-8be5-8dd6ce0c0a99\" (UID: \"2fe174a3-4d0f-423e-8be5-8dd6ce0c0a99\") " Dec 06 05:58:14 crc kubenswrapper[4733]: I1206 05:58:14.646549 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhsz8\" (UniqueName: \"kubernetes.io/projected/2fe174a3-4d0f-423e-8be5-8dd6ce0c0a99-kube-api-access-mhsz8\") pod \"2fe174a3-4d0f-423e-8be5-8dd6ce0c0a99\" (UID: \"2fe174a3-4d0f-423e-8be5-8dd6ce0c0a99\") " Dec 06 05:58:14 crc kubenswrapper[4733]: I1206 05:58:14.647110 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fe174a3-4d0f-423e-8be5-8dd6ce0c0a99-config" (OuterVolumeSpecName: "config") pod "2fe174a3-4d0f-423e-8be5-8dd6ce0c0a99" (UID: "2fe174a3-4d0f-423e-8be5-8dd6ce0c0a99"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:58:14 crc kubenswrapper[4733]: I1206 05:58:14.649041 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fe174a3-4d0f-423e-8be5-8dd6ce0c0a99-config\") on node \"crc\" DevicePath \"\"" Dec 06 05:58:14 crc kubenswrapper[4733]: I1206 05:58:14.667525 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fe174a3-4d0f-423e-8be5-8dd6ce0c0a99-kube-api-access-mhsz8" (OuterVolumeSpecName: "kube-api-access-mhsz8") pod "2fe174a3-4d0f-423e-8be5-8dd6ce0c0a99" (UID: "2fe174a3-4d0f-423e-8be5-8dd6ce0c0a99"). InnerVolumeSpecName "kube-api-access-mhsz8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:58:14 crc kubenswrapper[4733]: I1206 05:58:14.750001 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhsz8\" (UniqueName: \"kubernetes.io/projected/2fe174a3-4d0f-423e-8be5-8dd6ce0c0a99-kube-api-access-mhsz8\") on node \"crc\" DevicePath \"\"" Dec 06 05:58:14 crc kubenswrapper[4733]: I1206 05:58:14.799960 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-2ztw7"] Dec 06 05:58:14 crc kubenswrapper[4733]: I1206 05:58:14.825280 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-4wzzg"] Dec 06 05:58:14 crc kubenswrapper[4733]: I1206 05:58:14.871162 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-4bj7k"] Dec 06 05:58:14 crc kubenswrapper[4733]: I1206 05:58:14.873333 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-4bj7k" Dec 06 05:58:14 crc kubenswrapper[4733]: I1206 05:58:14.877227 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 06 05:58:14 crc kubenswrapper[4733]: I1206 05:58:14.877664 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-4bj7k"] Dec 06 05:58:14 crc kubenswrapper[4733]: I1206 05:58:14.940653 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 06 05:58:14 crc kubenswrapper[4733]: I1206 05:58:14.961980 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r52b5\" (UniqueName: \"kubernetes.io/projected/6e9f6fed-9267-40ab-a945-b575dd0abc9a-kube-api-access-r52b5\") pod \"ovn-controller-metrics-4bj7k\" (UID: \"6e9f6fed-9267-40ab-a945-b575dd0abc9a\") " pod="openstack/ovn-controller-metrics-4bj7k" Dec 06 05:58:14 crc kubenswrapper[4733]: I1206 05:58:14.962164 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e9f6fed-9267-40ab-a945-b575dd0abc9a-combined-ca-bundle\") pod \"ovn-controller-metrics-4bj7k\" (UID: \"6e9f6fed-9267-40ab-a945-b575dd0abc9a\") " pod="openstack/ovn-controller-metrics-4bj7k" Dec 06 05:58:14 crc kubenswrapper[4733]: I1206 05:58:14.962214 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/6e9f6fed-9267-40ab-a945-b575dd0abc9a-ovs-rundir\") pod \"ovn-controller-metrics-4bj7k\" (UID: \"6e9f6fed-9267-40ab-a945-b575dd0abc9a\") " pod="openstack/ovn-controller-metrics-4bj7k" Dec 06 05:58:14 crc kubenswrapper[4733]: I1206 05:58:14.962256 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: 
\"kubernetes.io/host-path/6e9f6fed-9267-40ab-a945-b575dd0abc9a-ovn-rundir\") pod \"ovn-controller-metrics-4bj7k\" (UID: \"6e9f6fed-9267-40ab-a945-b575dd0abc9a\") " pod="openstack/ovn-controller-metrics-4bj7k" Dec 06 05:58:14 crc kubenswrapper[4733]: I1206 05:58:14.962323 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e9f6fed-9267-40ab-a945-b575dd0abc9a-config\") pod \"ovn-controller-metrics-4bj7k\" (UID: \"6e9f6fed-9267-40ab-a945-b575dd0abc9a\") " pod="openstack/ovn-controller-metrics-4bj7k" Dec 06 05:58:14 crc kubenswrapper[4733]: I1206 05:58:14.962358 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e9f6fed-9267-40ab-a945-b575dd0abc9a-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-4bj7k\" (UID: \"6e9f6fed-9267-40ab-a945-b575dd0abc9a\") " pod="openstack/ovn-controller-metrics-4bj7k" Dec 06 05:58:14 crc kubenswrapper[4733]: I1206 05:58:14.966125 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-94b4f9f45-frp6w" event={"ID":"2fe174a3-4d0f-423e-8be5-8dd6ce0c0a99","Type":"ContainerDied","Data":"43c4c70d2399307d560e754860dfd9a501df083dd8c537db72428e4ddf403e6c"} Dec 06 05:58:14 crc kubenswrapper[4733]: I1206 05:58:14.966170 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-94b4f9f45-frp6w" Dec 06 05:58:14 crc kubenswrapper[4733]: I1206 05:58:14.969165 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"b87d6517-a2ed-458a-9a0e-0945f837a232","Type":"ContainerStarted","Data":"4f1551803f909c5175395cc632872875f76c12db2c771efcb27214c28b9c0157"} Dec 06 05:58:15 crc kubenswrapper[4733]: I1206 05:58:15.009232 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55dc666865-rp6cx"] Dec 06 05:58:15 crc kubenswrapper[4733]: I1206 05:58:15.024460 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-94b4f9f45-frp6w"] Dec 06 05:58:15 crc kubenswrapper[4733]: I1206 05:58:15.028341 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-94b4f9f45-frp6w"] Dec 06 05:58:15 crc kubenswrapper[4733]: I1206 05:58:15.040582 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-767d7fb4d9-dkbdm"] Dec 06 05:58:15 crc kubenswrapper[4733]: I1206 05:58:15.042149 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-767d7fb4d9-dkbdm" Dec 06 05:58:15 crc kubenswrapper[4733]: I1206 05:58:15.051527 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 06 05:58:15 crc kubenswrapper[4733]: I1206 05:58:15.060954 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-767d7fb4d9-dkbdm"] Dec 06 05:58:15 crc kubenswrapper[4733]: I1206 05:58:15.064493 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r52b5\" (UniqueName: \"kubernetes.io/projected/6e9f6fed-9267-40ab-a945-b575dd0abc9a-kube-api-access-r52b5\") pod \"ovn-controller-metrics-4bj7k\" (UID: \"6e9f6fed-9267-40ab-a945-b575dd0abc9a\") " pod="openstack/ovn-controller-metrics-4bj7k" Dec 06 05:58:15 crc kubenswrapper[4733]: I1206 05:58:15.064572 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d75fz\" (UniqueName: \"kubernetes.io/projected/645e93e6-ca60-43c5-be46-24b1c34fdd7c-kube-api-access-d75fz\") pod \"dnsmasq-dns-767d7fb4d9-dkbdm\" (UID: \"645e93e6-ca60-43c5-be46-24b1c34fdd7c\") " pod="openstack/dnsmasq-dns-767d7fb4d9-dkbdm" Dec 06 05:58:15 crc kubenswrapper[4733]: I1206 05:58:15.064661 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e9f6fed-9267-40ab-a945-b575dd0abc9a-combined-ca-bundle\") pod \"ovn-controller-metrics-4bj7k\" (UID: \"6e9f6fed-9267-40ab-a945-b575dd0abc9a\") " pod="openstack/ovn-controller-metrics-4bj7k" Dec 06 05:58:15 crc kubenswrapper[4733]: I1206 05:58:15.064683 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/645e93e6-ca60-43c5-be46-24b1c34fdd7c-config\") pod \"dnsmasq-dns-767d7fb4d9-dkbdm\" (UID: \"645e93e6-ca60-43c5-be46-24b1c34fdd7c\") " 
pod="openstack/dnsmasq-dns-767d7fb4d9-dkbdm" Dec 06 05:58:15 crc kubenswrapper[4733]: I1206 05:58:15.064711 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/645e93e6-ca60-43c5-be46-24b1c34fdd7c-dns-svc\") pod \"dnsmasq-dns-767d7fb4d9-dkbdm\" (UID: \"645e93e6-ca60-43c5-be46-24b1c34fdd7c\") " pod="openstack/dnsmasq-dns-767d7fb4d9-dkbdm" Dec 06 05:58:15 crc kubenswrapper[4733]: I1206 05:58:15.064732 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/6e9f6fed-9267-40ab-a945-b575dd0abc9a-ovs-rundir\") pod \"ovn-controller-metrics-4bj7k\" (UID: \"6e9f6fed-9267-40ab-a945-b575dd0abc9a\") " pod="openstack/ovn-controller-metrics-4bj7k" Dec 06 05:58:15 crc kubenswrapper[4733]: I1206 05:58:15.064769 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/6e9f6fed-9267-40ab-a945-b575dd0abc9a-ovn-rundir\") pod \"ovn-controller-metrics-4bj7k\" (UID: \"6e9f6fed-9267-40ab-a945-b575dd0abc9a\") " pod="openstack/ovn-controller-metrics-4bj7k" Dec 06 05:58:15 crc kubenswrapper[4733]: I1206 05:58:15.064819 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e9f6fed-9267-40ab-a945-b575dd0abc9a-config\") pod \"ovn-controller-metrics-4bj7k\" (UID: \"6e9f6fed-9267-40ab-a945-b575dd0abc9a\") " pod="openstack/ovn-controller-metrics-4bj7k" Dec 06 05:58:15 crc kubenswrapper[4733]: I1206 05:58:15.064850 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e9f6fed-9267-40ab-a945-b575dd0abc9a-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-4bj7k\" (UID: \"6e9f6fed-9267-40ab-a945-b575dd0abc9a\") " pod="openstack/ovn-controller-metrics-4bj7k" Dec 06 05:58:15 crc 
kubenswrapper[4733]: I1206 05:58:15.064906 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/645e93e6-ca60-43c5-be46-24b1c34fdd7c-ovsdbserver-nb\") pod \"dnsmasq-dns-767d7fb4d9-dkbdm\" (UID: \"645e93e6-ca60-43c5-be46-24b1c34fdd7c\") " pod="openstack/dnsmasq-dns-767d7fb4d9-dkbdm" Dec 06 05:58:15 crc kubenswrapper[4733]: I1206 05:58:15.065224 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/6e9f6fed-9267-40ab-a945-b575dd0abc9a-ovs-rundir\") pod \"ovn-controller-metrics-4bj7k\" (UID: \"6e9f6fed-9267-40ab-a945-b575dd0abc9a\") " pod="openstack/ovn-controller-metrics-4bj7k" Dec 06 05:58:15 crc kubenswrapper[4733]: I1206 05:58:15.065272 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/6e9f6fed-9267-40ab-a945-b575dd0abc9a-ovn-rundir\") pod \"ovn-controller-metrics-4bj7k\" (UID: \"6e9f6fed-9267-40ab-a945-b575dd0abc9a\") " pod="openstack/ovn-controller-metrics-4bj7k" Dec 06 05:58:15 crc kubenswrapper[4733]: I1206 05:58:15.065998 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e9f6fed-9267-40ab-a945-b575dd0abc9a-config\") pod \"ovn-controller-metrics-4bj7k\" (UID: \"6e9f6fed-9267-40ab-a945-b575dd0abc9a\") " pod="openstack/ovn-controller-metrics-4bj7k" Dec 06 05:58:15 crc kubenswrapper[4733]: I1206 05:58:15.071630 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e9f6fed-9267-40ab-a945-b575dd0abc9a-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-4bj7k\" (UID: \"6e9f6fed-9267-40ab-a945-b575dd0abc9a\") " pod="openstack/ovn-controller-metrics-4bj7k" Dec 06 05:58:15 crc kubenswrapper[4733]: I1206 05:58:15.074626 4733 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e9f6fed-9267-40ab-a945-b575dd0abc9a-combined-ca-bundle\") pod \"ovn-controller-metrics-4bj7k\" (UID: \"6e9f6fed-9267-40ab-a945-b575dd0abc9a\") " pod="openstack/ovn-controller-metrics-4bj7k" Dec 06 05:58:15 crc kubenswrapper[4733]: I1206 05:58:15.082577 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r52b5\" (UniqueName: \"kubernetes.io/projected/6e9f6fed-9267-40ab-a945-b575dd0abc9a-kube-api-access-r52b5\") pod \"ovn-controller-metrics-4bj7k\" (UID: \"6e9f6fed-9267-40ab-a945-b575dd0abc9a\") " pod="openstack/ovn-controller-metrics-4bj7k" Dec 06 05:58:15 crc kubenswrapper[4733]: I1206 05:58:15.142733 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d9886d5bf-qnvxm"] Dec 06 05:58:15 crc kubenswrapper[4733]: I1206 05:58:15.159455 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78d59ccb8c-62d2b"] Dec 06 05:58:15 crc kubenswrapper[4733]: I1206 05:58:15.160906 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78d59ccb8c-62d2b" Dec 06 05:58:15 crc kubenswrapper[4733]: I1206 05:58:15.163554 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 06 05:58:15 crc kubenswrapper[4733]: I1206 05:58:15.166152 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/715d93bf-4fc7-4bc5-adb5-4504c9c954ea-ovsdbserver-sb\") pod \"dnsmasq-dns-78d59ccb8c-62d2b\" (UID: \"715d93bf-4fc7-4bc5-adb5-4504c9c954ea\") " pod="openstack/dnsmasq-dns-78d59ccb8c-62d2b" Dec 06 05:58:15 crc kubenswrapper[4733]: I1206 05:58:15.166215 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/645e93e6-ca60-43c5-be46-24b1c34fdd7c-ovsdbserver-nb\") pod \"dnsmasq-dns-767d7fb4d9-dkbdm\" (UID: \"645e93e6-ca60-43c5-be46-24b1c34fdd7c\") " pod="openstack/dnsmasq-dns-767d7fb4d9-dkbdm" Dec 06 05:58:15 crc kubenswrapper[4733]: I1206 05:58:15.166284 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d75fz\" (UniqueName: \"kubernetes.io/projected/645e93e6-ca60-43c5-be46-24b1c34fdd7c-kube-api-access-d75fz\") pod \"dnsmasq-dns-767d7fb4d9-dkbdm\" (UID: \"645e93e6-ca60-43c5-be46-24b1c34fdd7c\") " pod="openstack/dnsmasq-dns-767d7fb4d9-dkbdm" Dec 06 05:58:15 crc kubenswrapper[4733]: I1206 05:58:15.166338 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/715d93bf-4fc7-4bc5-adb5-4504c9c954ea-dns-svc\") pod \"dnsmasq-dns-78d59ccb8c-62d2b\" (UID: \"715d93bf-4fc7-4bc5-adb5-4504c9c954ea\") " pod="openstack/dnsmasq-dns-78d59ccb8c-62d2b" Dec 06 05:58:15 crc kubenswrapper[4733]: I1206 05:58:15.166782 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/715d93bf-4fc7-4bc5-adb5-4504c9c954ea-ovsdbserver-nb\") pod \"dnsmasq-dns-78d59ccb8c-62d2b\" (UID: \"715d93bf-4fc7-4bc5-adb5-4504c9c954ea\") " pod="openstack/dnsmasq-dns-78d59ccb8c-62d2b" Dec 06 05:58:15 crc kubenswrapper[4733]: I1206 05:58:15.166931 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/645e93e6-ca60-43c5-be46-24b1c34fdd7c-config\") pod \"dnsmasq-dns-767d7fb4d9-dkbdm\" (UID: \"645e93e6-ca60-43c5-be46-24b1c34fdd7c\") " pod="openstack/dnsmasq-dns-767d7fb4d9-dkbdm" Dec 06 05:58:15 crc kubenswrapper[4733]: I1206 05:58:15.166986 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/645e93e6-ca60-43c5-be46-24b1c34fdd7c-dns-svc\") pod \"dnsmasq-dns-767d7fb4d9-dkbdm\" (UID: \"645e93e6-ca60-43c5-be46-24b1c34fdd7c\") " pod="openstack/dnsmasq-dns-767d7fb4d9-dkbdm" Dec 06 05:58:15 crc kubenswrapper[4733]: I1206 05:58:15.167018 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/715d93bf-4fc7-4bc5-adb5-4504c9c954ea-config\") pod \"dnsmasq-dns-78d59ccb8c-62d2b\" (UID: \"715d93bf-4fc7-4bc5-adb5-4504c9c954ea\") " pod="openstack/dnsmasq-dns-78d59ccb8c-62d2b" Dec 06 05:58:15 crc kubenswrapper[4733]: I1206 05:58:15.167087 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/645e93e6-ca60-43c5-be46-24b1c34fdd7c-ovsdbserver-nb\") pod \"dnsmasq-dns-767d7fb4d9-dkbdm\" (UID: \"645e93e6-ca60-43c5-be46-24b1c34fdd7c\") " pod="openstack/dnsmasq-dns-767d7fb4d9-dkbdm" Dec 06 05:58:15 crc kubenswrapper[4733]: I1206 05:58:15.167179 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lf9t6\" (UniqueName: 
\"kubernetes.io/projected/715d93bf-4fc7-4bc5-adb5-4504c9c954ea-kube-api-access-lf9t6\") pod \"dnsmasq-dns-78d59ccb8c-62d2b\" (UID: \"715d93bf-4fc7-4bc5-adb5-4504c9c954ea\") " pod="openstack/dnsmasq-dns-78d59ccb8c-62d2b" Dec 06 05:58:15 crc kubenswrapper[4733]: I1206 05:58:15.167847 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/645e93e6-ca60-43c5-be46-24b1c34fdd7c-dns-svc\") pod \"dnsmasq-dns-767d7fb4d9-dkbdm\" (UID: \"645e93e6-ca60-43c5-be46-24b1c34fdd7c\") " pod="openstack/dnsmasq-dns-767d7fb4d9-dkbdm" Dec 06 05:58:15 crc kubenswrapper[4733]: I1206 05:58:15.167872 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/645e93e6-ca60-43c5-be46-24b1c34fdd7c-config\") pod \"dnsmasq-dns-767d7fb4d9-dkbdm\" (UID: \"645e93e6-ca60-43c5-be46-24b1c34fdd7c\") " pod="openstack/dnsmasq-dns-767d7fb4d9-dkbdm" Dec 06 05:58:15 crc kubenswrapper[4733]: I1206 05:58:15.170376 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78d59ccb8c-62d2b"] Dec 06 05:58:15 crc kubenswrapper[4733]: I1206 05:58:15.190651 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d75fz\" (UniqueName: \"kubernetes.io/projected/645e93e6-ca60-43c5-be46-24b1c34fdd7c-kube-api-access-d75fz\") pod \"dnsmasq-dns-767d7fb4d9-dkbdm\" (UID: \"645e93e6-ca60-43c5-be46-24b1c34fdd7c\") " pod="openstack/dnsmasq-dns-767d7fb4d9-dkbdm" Dec 06 05:58:15 crc kubenswrapper[4733]: I1206 05:58:15.201721 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-4bj7k" Dec 06 05:58:15 crc kubenswrapper[4733]: I1206 05:58:15.269794 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/715d93bf-4fc7-4bc5-adb5-4504c9c954ea-dns-svc\") pod \"dnsmasq-dns-78d59ccb8c-62d2b\" (UID: \"715d93bf-4fc7-4bc5-adb5-4504c9c954ea\") " pod="openstack/dnsmasq-dns-78d59ccb8c-62d2b" Dec 06 05:58:15 crc kubenswrapper[4733]: I1206 05:58:15.269961 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/715d93bf-4fc7-4bc5-adb5-4504c9c954ea-ovsdbserver-nb\") pod \"dnsmasq-dns-78d59ccb8c-62d2b\" (UID: \"715d93bf-4fc7-4bc5-adb5-4504c9c954ea\") " pod="openstack/dnsmasq-dns-78d59ccb8c-62d2b" Dec 06 05:58:15 crc kubenswrapper[4733]: I1206 05:58:15.270103 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/715d93bf-4fc7-4bc5-adb5-4504c9c954ea-config\") pod \"dnsmasq-dns-78d59ccb8c-62d2b\" (UID: \"715d93bf-4fc7-4bc5-adb5-4504c9c954ea\") " pod="openstack/dnsmasq-dns-78d59ccb8c-62d2b" Dec 06 05:58:15 crc kubenswrapper[4733]: I1206 05:58:15.270241 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lf9t6\" (UniqueName: \"kubernetes.io/projected/715d93bf-4fc7-4bc5-adb5-4504c9c954ea-kube-api-access-lf9t6\") pod \"dnsmasq-dns-78d59ccb8c-62d2b\" (UID: \"715d93bf-4fc7-4bc5-adb5-4504c9c954ea\") " pod="openstack/dnsmasq-dns-78d59ccb8c-62d2b" Dec 06 05:58:15 crc kubenswrapper[4733]: I1206 05:58:15.270319 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/715d93bf-4fc7-4bc5-adb5-4504c9c954ea-ovsdbserver-sb\") pod \"dnsmasq-dns-78d59ccb8c-62d2b\" (UID: \"715d93bf-4fc7-4bc5-adb5-4504c9c954ea\") " pod="openstack/dnsmasq-dns-78d59ccb8c-62d2b" 
Dec 06 05:58:15 crc kubenswrapper[4733]: I1206 05:58:15.270817 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/715d93bf-4fc7-4bc5-adb5-4504c9c954ea-ovsdbserver-nb\") pod \"dnsmasq-dns-78d59ccb8c-62d2b\" (UID: \"715d93bf-4fc7-4bc5-adb5-4504c9c954ea\") " pod="openstack/dnsmasq-dns-78d59ccb8c-62d2b" Dec 06 05:58:15 crc kubenswrapper[4733]: I1206 05:58:15.270842 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/715d93bf-4fc7-4bc5-adb5-4504c9c954ea-config\") pod \"dnsmasq-dns-78d59ccb8c-62d2b\" (UID: \"715d93bf-4fc7-4bc5-adb5-4504c9c954ea\") " pod="openstack/dnsmasq-dns-78d59ccb8c-62d2b" Dec 06 05:58:15 crc kubenswrapper[4733]: I1206 05:58:15.271479 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/715d93bf-4fc7-4bc5-adb5-4504c9c954ea-ovsdbserver-sb\") pod \"dnsmasq-dns-78d59ccb8c-62d2b\" (UID: \"715d93bf-4fc7-4bc5-adb5-4504c9c954ea\") " pod="openstack/dnsmasq-dns-78d59ccb8c-62d2b" Dec 06 05:58:15 crc kubenswrapper[4733]: I1206 05:58:15.271603 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/715d93bf-4fc7-4bc5-adb5-4504c9c954ea-dns-svc\") pod \"dnsmasq-dns-78d59ccb8c-62d2b\" (UID: \"715d93bf-4fc7-4bc5-adb5-4504c9c954ea\") " pod="openstack/dnsmasq-dns-78d59ccb8c-62d2b" Dec 06 05:58:15 crc kubenswrapper[4733]: I1206 05:58:15.285998 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lf9t6\" (UniqueName: \"kubernetes.io/projected/715d93bf-4fc7-4bc5-adb5-4504c9c954ea-kube-api-access-lf9t6\") pod \"dnsmasq-dns-78d59ccb8c-62d2b\" (UID: \"715d93bf-4fc7-4bc5-adb5-4504c9c954ea\") " pod="openstack/dnsmasq-dns-78d59ccb8c-62d2b" Dec 06 05:58:15 crc kubenswrapper[4733]: I1206 05:58:15.372698 4733 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-767d7fb4d9-dkbdm" Dec 06 05:58:15 crc kubenswrapper[4733]: I1206 05:58:15.484830 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78d59ccb8c-62d2b" Dec 06 05:58:16 crc kubenswrapper[4733]: I1206 05:58:16.496908 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fe174a3-4d0f-423e-8be5-8dd6ce0c0a99" path="/var/lib/kubelet/pods/2fe174a3-4d0f-423e-8be5-8dd6ce0c0a99/volumes" Dec 06 05:58:18 crc kubenswrapper[4733]: W1206 05:58:18.441342 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0484be5_bcc0_4b5b_8aef_6c9573545b88.slice/crio-66a6bc435d7799451ade3f44de4e3e3be1b09da55fb7a11975357e37d204ceda WatchSource:0}: Error finding container 66a6bc435d7799451ade3f44de4e3e3be1b09da55fb7a11975357e37d204ceda: Status 404 returned error can't find the container with id 66a6bc435d7799451ade3f44de4e3e3be1b09da55fb7a11975357e37d204ceda Dec 06 05:58:18 crc kubenswrapper[4733]: W1206 05:58:18.442967 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5589595d_741e_424a_955a_6fc8b83c18c1.slice/crio-0f75e61e5eedbe10de8fac563cbf9d5ed53b1b23794bdd0d78584e7ab929c63e WatchSource:0}: Error finding container 0f75e61e5eedbe10de8fac563cbf9d5ed53b1b23794bdd0d78584e7ab929c63e: Status 404 returned error can't find the container with id 0f75e61e5eedbe10de8fac563cbf9d5ed53b1b23794bdd0d78584e7ab929c63e Dec 06 05:58:18 crc kubenswrapper[4733]: W1206 05:58:18.446115 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod008ba5cf_a311_414d_9d06_a8ad4c038088.slice/crio-1c8dea88082d3673cffe0f3c9a993e4164ab1cb016aa99aee3cdf3beace51619 WatchSource:0}: Error finding container 
1c8dea88082d3673cffe0f3c9a993e4164ab1cb016aa99aee3cdf3beace51619: Status 404 returned error can't find the container with id 1c8dea88082d3673cffe0f3c9a993e4164ab1cb016aa99aee3cdf3beace51619 Dec 06 05:58:19 crc kubenswrapper[4733]: I1206 05:58:19.000984 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-4wzzg" event={"ID":"008ba5cf-a311-414d-9d06-a8ad4c038088","Type":"ContainerStarted","Data":"1c8dea88082d3673cffe0f3c9a993e4164ab1cb016aa99aee3cdf3beace51619"} Dec 06 05:58:19 crc kubenswrapper[4733]: I1206 05:58:19.002982 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-2ztw7" event={"ID":"5589595d-741e-424a-955a-6fc8b83c18c1","Type":"ContainerStarted","Data":"0f75e61e5eedbe10de8fac563cbf9d5ed53b1b23794bdd0d78584e7ab929c63e"} Dec 06 05:58:19 crc kubenswrapper[4733]: I1206 05:58:19.004547 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"d0484be5-bcc0-4b5b-8aef-6c9573545b88","Type":"ContainerStarted","Data":"66a6bc435d7799451ade3f44de4e3e3be1b09da55fb7a11975357e37d204ceda"} Dec 06 05:58:19 crc kubenswrapper[4733]: I1206 05:58:19.587653 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-4bj7k"] Dec 06 05:58:19 crc kubenswrapper[4733]: I1206 05:58:19.660734 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-767d7fb4d9-dkbdm"] Dec 06 05:58:19 crc kubenswrapper[4733]: I1206 05:58:19.667067 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78d59ccb8c-62d2b"] Dec 06 05:58:19 crc kubenswrapper[4733]: W1206 05:58:19.695088 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod715d93bf_4fc7_4bc5_adb5_4504c9c954ea.slice/crio-2473ad330c062a4cc840408168ba60d2f00daa78847d9d158c92924e0a6b49d2 WatchSource:0}: Error finding container 
2473ad330c062a4cc840408168ba60d2f00daa78847d9d158c92924e0a6b49d2: Status 404 returned error can't find the container with id 2473ad330c062a4cc840408168ba60d2f00daa78847d9d158c92924e0a6b49d2 Dec 06 05:58:19 crc kubenswrapper[4733]: W1206 05:58:19.707706 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod645e93e6_ca60_43c5_be46_24b1c34fdd7c.slice/crio-a556e645a23412986abbc0c3313f4835e6d40845933c718dc89636688cc8bafc WatchSource:0}: Error finding container a556e645a23412986abbc0c3313f4835e6d40845933c718dc89636688cc8bafc: Status 404 returned error can't find the container with id a556e645a23412986abbc0c3313f4835e6d40845933c718dc89636688cc8bafc Dec 06 05:58:20 crc kubenswrapper[4733]: I1206 05:58:20.016181 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b2b2baf7-95ad-4ff0-a72d-9232137735b6","Type":"ContainerStarted","Data":"e80beb4c192ded4b5b4916b9efc21103ffa34eaf326d539cf842ff1acbfb222a"} Dec 06 05:58:20 crc kubenswrapper[4733]: I1206 05:58:20.018199 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-4bj7k" event={"ID":"6e9f6fed-9267-40ab-a945-b575dd0abc9a","Type":"ContainerStarted","Data":"ef51fd2521206884ca6e94ea796951b7c769b6b8765eddcc7dde62b988c99018"} Dec 06 05:58:20 crc kubenswrapper[4733]: I1206 05:58:20.021170 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78d59ccb8c-62d2b" event={"ID":"715d93bf-4fc7-4bc5-adb5-4504c9c954ea","Type":"ContainerStarted","Data":"2473ad330c062a4cc840408168ba60d2f00daa78847d9d158c92924e0a6b49d2"} Dec 06 05:58:20 crc kubenswrapper[4733]: I1206 05:58:20.024346 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"3de44369-4819-44c5-a1e5-3ea10b61cf0c","Type":"ContainerStarted","Data":"0c828df1acc79d99d5e136bcbe14cabf7157ddb6a9fdc2ac994f3174ae8914ce"} Dec 06 05:58:20 crc 
kubenswrapper[4733]: I1206 05:58:20.026587 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"56e3883e-d7a5-4735-aee1-9dbb5423c0fe","Type":"ContainerStarted","Data":"a2407a216d1bd45c6625990f9f197866b6ae794ba159ec622e12a5acfd712c7d"} Dec 06 05:58:20 crc kubenswrapper[4733]: I1206 05:58:20.026695 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 06 05:58:20 crc kubenswrapper[4733]: I1206 05:58:20.028152 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-767d7fb4d9-dkbdm" event={"ID":"645e93e6-ca60-43c5-be46-24b1c34fdd7c","Type":"ContainerStarted","Data":"a556e645a23412986abbc0c3313f4835e6d40845933c718dc89636688cc8bafc"} Dec 06 05:58:20 crc kubenswrapper[4733]: I1206 05:58:20.030987 4733 generic.go:334] "Generic (PLEG): container finished" podID="218fc499-054d-4805-b28c-6096d75e836d" containerID="70a7a40245553cca94a7dbb086d028350d39109b537828b7f2f6e0830f68f6ed" exitCode=0 Dec 06 05:58:20 crc kubenswrapper[4733]: I1206 05:58:20.031057 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55dc666865-rp6cx" event={"ID":"218fc499-054d-4805-b28c-6096d75e836d","Type":"ContainerDied","Data":"70a7a40245553cca94a7dbb086d028350d39109b537828b7f2f6e0830f68f6ed"} Dec 06 05:58:20 crc kubenswrapper[4733]: I1206 05:58:20.043564 4733 generic.go:334] "Generic (PLEG): container finished" podID="cbab357f-c31f-4dff-9255-f19667d52997" containerID="14d37c36a60c09cf24b7bcfe5d7b51a13465e93c1ade02dae93e6f5618b810b5" exitCode=0 Dec 06 05:58:20 crc kubenswrapper[4733]: I1206 05:58:20.043663 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6947456757-nnjdx" event={"ID":"cbab357f-c31f-4dff-9255-f19667d52997","Type":"ContainerDied","Data":"14d37c36a60c09cf24b7bcfe5d7b51a13465e93c1ade02dae93e6f5618b810b5"} Dec 06 05:58:20 crc kubenswrapper[4733]: I1206 05:58:20.046893 4733 generic.go:334] "Generic (PLEG): container 
finished" podID="97d7986b-ebf4-441a-87f5-4dc655e13234" containerID="4a3a72e9b08868ec4493f5fac4a43feeb143a9a74fdc4d06e22099b6bde0abc8" exitCode=0 Dec 06 05:58:20 crc kubenswrapper[4733]: I1206 05:58:20.046950 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d9886d5bf-qnvxm" event={"ID":"97d7986b-ebf4-441a-87f5-4dc655e13234","Type":"ContainerDied","Data":"4a3a72e9b08868ec4493f5fac4a43feeb143a9a74fdc4d06e22099b6bde0abc8"} Dec 06 05:58:20 crc kubenswrapper[4733]: I1206 05:58:20.065419 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.264737874 podStartE2EDuration="24.065389886s" podCreationTimestamp="2025-12-06 05:57:56 +0000 UTC" firstStartedPulling="2025-12-06 05:57:57.361005738 +0000 UTC m=+861.226216849" lastFinishedPulling="2025-12-06 05:58:19.16165775 +0000 UTC m=+883.026868861" observedRunningTime="2025-12-06 05:58:20.055625635 +0000 UTC m=+883.920836745" watchObservedRunningTime="2025-12-06 05:58:20.065389886 +0000 UTC m=+883.930600997" Dec 06 05:58:20 crc kubenswrapper[4733]: I1206 05:58:20.554634 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d9886d5bf-qnvxm" Dec 06 05:58:20 crc kubenswrapper[4733]: I1206 05:58:20.563998 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6947456757-nnjdx" Dec 06 05:58:20 crc kubenswrapper[4733]: I1206 05:58:20.565959 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55dc666865-rp6cx" Dec 06 05:58:20 crc kubenswrapper[4733]: I1206 05:58:20.673124 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/97d7986b-ebf4-441a-87f5-4dc655e13234-dns-svc\") pod \"97d7986b-ebf4-441a-87f5-4dc655e13234\" (UID: \"97d7986b-ebf4-441a-87f5-4dc655e13234\") " Dec 06 05:58:20 crc kubenswrapper[4733]: I1206 05:58:20.673236 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9t6c\" (UniqueName: \"kubernetes.io/projected/cbab357f-c31f-4dff-9255-f19667d52997-kube-api-access-k9t6c\") pod \"cbab357f-c31f-4dff-9255-f19667d52997\" (UID: \"cbab357f-c31f-4dff-9255-f19667d52997\") " Dec 06 05:58:20 crc kubenswrapper[4733]: I1206 05:58:20.673266 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/218fc499-054d-4805-b28c-6096d75e836d-dns-svc\") pod \"218fc499-054d-4805-b28c-6096d75e836d\" (UID: \"218fc499-054d-4805-b28c-6096d75e836d\") " Dec 06 05:58:20 crc kubenswrapper[4733]: I1206 05:58:20.673326 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cbab357f-c31f-4dff-9255-f19667d52997-dns-svc\") pod \"cbab357f-c31f-4dff-9255-f19667d52997\" (UID: \"cbab357f-c31f-4dff-9255-f19667d52997\") " Dec 06 05:58:20 crc kubenswrapper[4733]: I1206 05:58:20.673359 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97d7986b-ebf4-441a-87f5-4dc655e13234-config\") pod \"97d7986b-ebf4-441a-87f5-4dc655e13234\" (UID: \"97d7986b-ebf4-441a-87f5-4dc655e13234\") " Dec 06 05:58:20 crc kubenswrapper[4733]: I1206 05:58:20.673408 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/218fc499-054d-4805-b28c-6096d75e836d-config\") pod \"218fc499-054d-4805-b28c-6096d75e836d\" (UID: \"218fc499-054d-4805-b28c-6096d75e836d\") " Dec 06 05:58:20 crc kubenswrapper[4733]: I1206 05:58:20.673506 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sz2rw\" (UniqueName: \"kubernetes.io/projected/218fc499-054d-4805-b28c-6096d75e836d-kube-api-access-sz2rw\") pod \"218fc499-054d-4805-b28c-6096d75e836d\" (UID: \"218fc499-054d-4805-b28c-6096d75e836d\") " Dec 06 05:58:20 crc kubenswrapper[4733]: I1206 05:58:20.673534 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v964g\" (UniqueName: \"kubernetes.io/projected/97d7986b-ebf4-441a-87f5-4dc655e13234-kube-api-access-v964g\") pod \"97d7986b-ebf4-441a-87f5-4dc655e13234\" (UID: \"97d7986b-ebf4-441a-87f5-4dc655e13234\") " Dec 06 05:58:20 crc kubenswrapper[4733]: I1206 05:58:20.673561 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbab357f-c31f-4dff-9255-f19667d52997-config\") pod \"cbab357f-c31f-4dff-9255-f19667d52997\" (UID: \"cbab357f-c31f-4dff-9255-f19667d52997\") " Dec 06 05:58:20 crc kubenswrapper[4733]: I1206 05:58:20.679274 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbab357f-c31f-4dff-9255-f19667d52997-kube-api-access-k9t6c" (OuterVolumeSpecName: "kube-api-access-k9t6c") pod "cbab357f-c31f-4dff-9255-f19667d52997" (UID: "cbab357f-c31f-4dff-9255-f19667d52997"). InnerVolumeSpecName "kube-api-access-k9t6c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:58:20 crc kubenswrapper[4733]: I1206 05:58:20.680158 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/218fc499-054d-4805-b28c-6096d75e836d-kube-api-access-sz2rw" (OuterVolumeSpecName: "kube-api-access-sz2rw") pod "218fc499-054d-4805-b28c-6096d75e836d" (UID: "218fc499-054d-4805-b28c-6096d75e836d"). InnerVolumeSpecName "kube-api-access-sz2rw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:58:20 crc kubenswrapper[4733]: I1206 05:58:20.680781 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97d7986b-ebf4-441a-87f5-4dc655e13234-kube-api-access-v964g" (OuterVolumeSpecName: "kube-api-access-v964g") pod "97d7986b-ebf4-441a-87f5-4dc655e13234" (UID: "97d7986b-ebf4-441a-87f5-4dc655e13234"). InnerVolumeSpecName "kube-api-access-v964g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:58:20 crc kubenswrapper[4733]: I1206 05:58:20.692413 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbab357f-c31f-4dff-9255-f19667d52997-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cbab357f-c31f-4dff-9255-f19667d52997" (UID: "cbab357f-c31f-4dff-9255-f19667d52997"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:58:20 crc kubenswrapper[4733]: I1206 05:58:20.693437 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/218fc499-054d-4805-b28c-6096d75e836d-config" (OuterVolumeSpecName: "config") pod "218fc499-054d-4805-b28c-6096d75e836d" (UID: "218fc499-054d-4805-b28c-6096d75e836d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:58:20 crc kubenswrapper[4733]: I1206 05:58:20.694347 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/218fc499-054d-4805-b28c-6096d75e836d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "218fc499-054d-4805-b28c-6096d75e836d" (UID: "218fc499-054d-4805-b28c-6096d75e836d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:58:20 crc kubenswrapper[4733]: I1206 05:58:20.698379 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97d7986b-ebf4-441a-87f5-4dc655e13234-config" (OuterVolumeSpecName: "config") pod "97d7986b-ebf4-441a-87f5-4dc655e13234" (UID: "97d7986b-ebf4-441a-87f5-4dc655e13234"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:58:20 crc kubenswrapper[4733]: I1206 05:58:20.700050 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97d7986b-ebf4-441a-87f5-4dc655e13234-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "97d7986b-ebf4-441a-87f5-4dc655e13234" (UID: "97d7986b-ebf4-441a-87f5-4dc655e13234"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:58:20 crc kubenswrapper[4733]: I1206 05:58:20.701830 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbab357f-c31f-4dff-9255-f19667d52997-config" (OuterVolumeSpecName: "config") pod "cbab357f-c31f-4dff-9255-f19667d52997" (UID: "cbab357f-c31f-4dff-9255-f19667d52997"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:58:20 crc kubenswrapper[4733]: I1206 05:58:20.776111 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sz2rw\" (UniqueName: \"kubernetes.io/projected/218fc499-054d-4805-b28c-6096d75e836d-kube-api-access-sz2rw\") on node \"crc\" DevicePath \"\"" Dec 06 05:58:20 crc kubenswrapper[4733]: I1206 05:58:20.776146 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v964g\" (UniqueName: \"kubernetes.io/projected/97d7986b-ebf4-441a-87f5-4dc655e13234-kube-api-access-v964g\") on node \"crc\" DevicePath \"\"" Dec 06 05:58:20 crc kubenswrapper[4733]: I1206 05:58:20.776158 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbab357f-c31f-4dff-9255-f19667d52997-config\") on node \"crc\" DevicePath \"\"" Dec 06 05:58:20 crc kubenswrapper[4733]: I1206 05:58:20.776168 4733 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/97d7986b-ebf4-441a-87f5-4dc655e13234-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 05:58:20 crc kubenswrapper[4733]: I1206 05:58:20.776179 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9t6c\" (UniqueName: \"kubernetes.io/projected/cbab357f-c31f-4dff-9255-f19667d52997-kube-api-access-k9t6c\") on node \"crc\" DevicePath \"\"" Dec 06 05:58:20 crc kubenswrapper[4733]: I1206 05:58:20.776190 4733 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/218fc499-054d-4805-b28c-6096d75e836d-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 05:58:20 crc kubenswrapper[4733]: I1206 05:58:20.776201 4733 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cbab357f-c31f-4dff-9255-f19667d52997-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 05:58:20 crc kubenswrapper[4733]: I1206 05:58:20.776210 4733 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97d7986b-ebf4-441a-87f5-4dc655e13234-config\") on node \"crc\" DevicePath \"\"" Dec 06 05:58:20 crc kubenswrapper[4733]: I1206 05:58:20.776219 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/218fc499-054d-4805-b28c-6096d75e836d-config\") on node \"crc\" DevicePath \"\"" Dec 06 05:58:21 crc kubenswrapper[4733]: I1206 05:58:21.055608 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0d8769e1-2981-471a-bef8-ac4d193563cc","Type":"ContainerStarted","Data":"3414692aed66b4eeb2d86e147784525b258dc75c775f3e1178bcdad27a734b53"} Dec 06 05:58:21 crc kubenswrapper[4733]: I1206 05:58:21.059404 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ba773bb2-77c5-4562-b8ba-53428904d503","Type":"ContainerStarted","Data":"28d656946022d0e45d8ae7cd9d5210bbeea6770c3efa37b73f252df6528fed96"} Dec 06 05:58:21 crc kubenswrapper[4733]: I1206 05:58:21.062079 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d9886d5bf-qnvxm" event={"ID":"97d7986b-ebf4-441a-87f5-4dc655e13234","Type":"ContainerDied","Data":"7efd3f694369cc30e6f036dca78ed16d660f8627d2cd1ac455ce244059b7b2f1"} Dec 06 05:58:21 crc kubenswrapper[4733]: I1206 05:58:21.062092 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d9886d5bf-qnvxm" Dec 06 05:58:21 crc kubenswrapper[4733]: I1206 05:58:21.062126 4733 scope.go:117] "RemoveContainer" containerID="4a3a72e9b08868ec4493f5fac4a43feeb143a9a74fdc4d06e22099b6bde0abc8" Dec 06 05:58:21 crc kubenswrapper[4733]: I1206 05:58:21.064619 4733 generic.go:334] "Generic (PLEG): container finished" podID="645e93e6-ca60-43c5-be46-24b1c34fdd7c" containerID="c3faa7f083138de0de0efc79c436ea9ef4115d63c49c18cf6f940c1dd04f35bd" exitCode=0 Dec 06 05:58:21 crc kubenswrapper[4733]: I1206 05:58:21.064668 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-767d7fb4d9-dkbdm" event={"ID":"645e93e6-ca60-43c5-be46-24b1c34fdd7c","Type":"ContainerDied","Data":"c3faa7f083138de0de0efc79c436ea9ef4115d63c49c18cf6f940c1dd04f35bd"} Dec 06 05:58:21 crc kubenswrapper[4733]: I1206 05:58:21.066853 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55dc666865-rp6cx" event={"ID":"218fc499-054d-4805-b28c-6096d75e836d","Type":"ContainerDied","Data":"82a3530cbbd633cd15fce821b707b9f6de417afde828f60620f557ca10ee8fea"} Dec 06 05:58:21 crc kubenswrapper[4733]: I1206 05:58:21.066908 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55dc666865-rp6cx" Dec 06 05:58:21 crc kubenswrapper[4733]: I1206 05:58:21.077688 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6947456757-nnjdx" event={"ID":"cbab357f-c31f-4dff-9255-f19667d52997","Type":"ContainerDied","Data":"7f1b8e70dbe1b25a967b3f50c913a1e43ca90c0f87e09ac215ed399c38d1e731"} Dec 06 05:58:21 crc kubenswrapper[4733]: I1206 05:58:21.077963 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6947456757-nnjdx" Dec 06 05:58:21 crc kubenswrapper[4733]: I1206 05:58:21.083631 4733 generic.go:334] "Generic (PLEG): container finished" podID="715d93bf-4fc7-4bc5-adb5-4504c9c954ea" containerID="53a209c10286b5bce883bc68720942acad435aab65cafd65765d491cf0372a0f" exitCode=0 Dec 06 05:58:21 crc kubenswrapper[4733]: I1206 05:58:21.083736 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78d59ccb8c-62d2b" event={"ID":"715d93bf-4fc7-4bc5-adb5-4504c9c954ea","Type":"ContainerDied","Data":"53a209c10286b5bce883bc68720942acad435aab65cafd65765d491cf0372a0f"} Dec 06 05:58:21 crc kubenswrapper[4733]: I1206 05:58:21.111511 4733 scope.go:117] "RemoveContainer" containerID="70a7a40245553cca94a7dbb086d028350d39109b537828b7f2f6e0830f68f6ed" Dec 06 05:58:21 crc kubenswrapper[4733]: I1206 05:58:21.157685 4733 scope.go:117] "RemoveContainer" containerID="14d37c36a60c09cf24b7bcfe5d7b51a13465e93c1ade02dae93e6f5618b810b5" Dec 06 05:58:21 crc kubenswrapper[4733]: I1206 05:58:21.221813 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55dc666865-rp6cx"] Dec 06 05:58:21 crc kubenswrapper[4733]: I1206 05:58:21.226742 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55dc666865-rp6cx"] Dec 06 05:58:21 crc kubenswrapper[4733]: I1206 05:58:21.235065 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6947456757-nnjdx"] Dec 06 05:58:21 crc kubenswrapper[4733]: I1206 05:58:21.242368 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6947456757-nnjdx"] Dec 06 05:58:21 crc kubenswrapper[4733]: I1206 05:58:21.253293 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d9886d5bf-qnvxm"] Dec 06 05:58:21 crc kubenswrapper[4733]: I1206 05:58:21.257570 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d9886d5bf-qnvxm"] Dec 06 05:58:22 crc 
kubenswrapper[4733]: I1206 05:58:22.110977 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78d59ccb8c-62d2b" event={"ID":"715d93bf-4fc7-4bc5-adb5-4504c9c954ea","Type":"ContainerStarted","Data":"584b18d9c567fe1af4e3ee60da7d09566da925988b32edb712fc81cac55df6a2"} Dec 06 05:58:22 crc kubenswrapper[4733]: I1206 05:58:22.111527 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-78d59ccb8c-62d2b" Dec 06 05:58:22 crc kubenswrapper[4733]: I1206 05:58:22.120353 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-767d7fb4d9-dkbdm" event={"ID":"645e93e6-ca60-43c5-be46-24b1c34fdd7c","Type":"ContainerStarted","Data":"7ad1c8354e87106f4d9b7f163decc6869123f95537445b26b9b6b18e14ec3430"} Dec 06 05:58:22 crc kubenswrapper[4733]: I1206 05:58:22.120449 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-767d7fb4d9-dkbdm" Dec 06 05:58:22 crc kubenswrapper[4733]: I1206 05:58:22.149007 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-78d59ccb8c-62d2b" podStartSLOduration=7.148706923 podStartE2EDuration="7.148706923s" podCreationTimestamp="2025-12-06 05:58:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:58:22.1306025 +0000 UTC m=+885.995813611" watchObservedRunningTime="2025-12-06 05:58:22.148706923 +0000 UTC m=+886.013918024" Dec 06 05:58:22 crc kubenswrapper[4733]: I1206 05:58:22.152623 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-767d7fb4d9-dkbdm" podStartSLOduration=7.152605474 podStartE2EDuration="7.152605474s" podCreationTimestamp="2025-12-06 05:58:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:58:22.150051601 +0000 UTC 
m=+886.015262713" watchObservedRunningTime="2025-12-06 05:58:22.152605474 +0000 UTC m=+886.017816586" Dec 06 05:58:22 crc kubenswrapper[4733]: I1206 05:58:22.498919 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="218fc499-054d-4805-b28c-6096d75e836d" path="/var/lib/kubelet/pods/218fc499-054d-4805-b28c-6096d75e836d/volumes" Dec 06 05:58:22 crc kubenswrapper[4733]: I1206 05:58:22.499587 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97d7986b-ebf4-441a-87f5-4dc655e13234" path="/var/lib/kubelet/pods/97d7986b-ebf4-441a-87f5-4dc655e13234/volumes" Dec 06 05:58:22 crc kubenswrapper[4733]: I1206 05:58:22.500146 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbab357f-c31f-4dff-9255-f19667d52997" path="/var/lib/kubelet/pods/cbab357f-c31f-4dff-9255-f19667d52997/volumes" Dec 06 05:58:23 crc kubenswrapper[4733]: I1206 05:58:23.135957 4733 generic.go:334] "Generic (PLEG): container finished" podID="3de44369-4819-44c5-a1e5-3ea10b61cf0c" containerID="0c828df1acc79d99d5e136bcbe14cabf7157ddb6a9fdc2ac994f3174ae8914ce" exitCode=0 Dec 06 05:58:23 crc kubenswrapper[4733]: I1206 05:58:23.136050 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"3de44369-4819-44c5-a1e5-3ea10b61cf0c","Type":"ContainerDied","Data":"0c828df1acc79d99d5e136bcbe14cabf7157ddb6a9fdc2ac994f3174ae8914ce"} Dec 06 05:58:23 crc kubenswrapper[4733]: I1206 05:58:23.141605 4733 generic.go:334] "Generic (PLEG): container finished" podID="b2b2baf7-95ad-4ff0-a72d-9232137735b6" containerID="e80beb4c192ded4b5b4916b9efc21103ffa34eaf326d539cf842ff1acbfb222a" exitCode=0 Dec 06 05:58:23 crc kubenswrapper[4733]: I1206 05:58:23.141674 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b2b2baf7-95ad-4ff0-a72d-9232137735b6","Type":"ContainerDied","Data":"e80beb4c192ded4b5b4916b9efc21103ffa34eaf326d539cf842ff1acbfb222a"} Dec 06 05:58:25 crc 
kubenswrapper[4733]: I1206 05:58:25.177437 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"3de44369-4819-44c5-a1e5-3ea10b61cf0c","Type":"ContainerStarted","Data":"28709586181a7960bc4e946890e515ee45419b7a83d612657d736a9116e7f098"} Dec 06 05:58:25 crc kubenswrapper[4733]: I1206 05:58:25.179860 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b2b2baf7-95ad-4ff0-a72d-9232137735b6","Type":"ContainerStarted","Data":"3575af0dcdc66075a15bd9f44a76f6a3d10e16e18141df0cefd319c323b97276"} Dec 06 05:58:25 crc kubenswrapper[4733]: I1206 05:58:25.182743 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-4bj7k" event={"ID":"6e9f6fed-9267-40ab-a945-b575dd0abc9a","Type":"ContainerStarted","Data":"cd806faf41eee9da8c4fc6470ac258aa6e9a3e97752fd48b0f8337fc56e492ae"} Dec 06 05:58:25 crc kubenswrapper[4733]: I1206 05:58:25.187787 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"b87d6517-a2ed-458a-9a0e-0945f837a232","Type":"ContainerStarted","Data":"2e6dde2336d756653d7d0233a0e4ca931309a9e77aab4803c037c1322190c1cf"} Dec 06 05:58:25 crc kubenswrapper[4733]: I1206 05:58:25.187818 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"b87d6517-a2ed-458a-9a0e-0945f837a232","Type":"ContainerStarted","Data":"62d4b7be85b41d202da13ec99cb5628e907434a1ce0eb9993fd0add3c82ca85f"} Dec 06 05:58:25 crc kubenswrapper[4733]: I1206 05:58:25.237152 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=8.040659789 podStartE2EDuration="30.23710839s" podCreationTimestamp="2025-12-06 05:57:55 +0000 UTC" firstStartedPulling="2025-12-06 05:57:56.943418072 +0000 UTC m=+860.808629184" lastFinishedPulling="2025-12-06 05:58:19.139866674 +0000 UTC m=+883.005077785" 
observedRunningTime="2025-12-06 05:58:25.212682078 +0000 UTC m=+889.077893189" watchObservedRunningTime="2025-12-06 05:58:25.23710839 +0000 UTC m=+889.102319501" Dec 06 05:58:25 crc kubenswrapper[4733]: I1206 05:58:25.250511 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=8.900819283 podStartE2EDuration="32.250494351s" podCreationTimestamp="2025-12-06 05:57:53 +0000 UTC" firstStartedPulling="2025-12-06 05:57:55.943279564 +0000 UTC m=+859.808490675" lastFinishedPulling="2025-12-06 05:58:19.292954632 +0000 UTC m=+883.158165743" observedRunningTime="2025-12-06 05:58:25.23852845 +0000 UTC m=+889.103739561" watchObservedRunningTime="2025-12-06 05:58:25.250494351 +0000 UTC m=+889.115705462" Dec 06 05:58:25 crc kubenswrapper[4733]: I1206 05:58:25.268796 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=14.456321925 podStartE2EDuration="24.26877201s" podCreationTimestamp="2025-12-06 05:58:01 +0000 UTC" firstStartedPulling="2025-12-06 05:58:14.394819096 +0000 UTC m=+878.260030207" lastFinishedPulling="2025-12-06 05:58:24.207269181 +0000 UTC m=+888.072480292" observedRunningTime="2025-12-06 05:58:25.264961775 +0000 UTC m=+889.130172887" watchObservedRunningTime="2025-12-06 05:58:25.26877201 +0000 UTC m=+889.133983110" Dec 06 05:58:25 crc kubenswrapper[4733]: I1206 05:58:25.285654 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-4bj7k" podStartSLOduration=6.684407221 podStartE2EDuration="11.285639177s" podCreationTimestamp="2025-12-06 05:58:14 +0000 UTC" firstStartedPulling="2025-12-06 05:58:19.639826012 +0000 UTC m=+883.505037123" lastFinishedPulling="2025-12-06 05:58:24.241057968 +0000 UTC m=+888.106269079" observedRunningTime="2025-12-06 05:58:25.279601593 +0000 UTC m=+889.144812704" watchObservedRunningTime="2025-12-06 05:58:25.285639177 +0000 UTC 
m=+889.150850288" Dec 06 05:58:25 crc kubenswrapper[4733]: I1206 05:58:25.492266 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 06 05:58:25 crc kubenswrapper[4733]: I1206 05:58:25.492326 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 06 05:58:26 crc kubenswrapper[4733]: I1206 05:58:26.197655 4733 generic.go:334] "Generic (PLEG): container finished" podID="008ba5cf-a311-414d-9d06-a8ad4c038088" containerID="0d10ef3bfb88b3d653bc92d25af323a71e3fb582a47d59e8b7ae9598282ee8b5" exitCode=0 Dec 06 05:58:26 crc kubenswrapper[4733]: I1206 05:58:26.197765 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-4wzzg" event={"ID":"008ba5cf-a311-414d-9d06-a8ad4c038088","Type":"ContainerDied","Data":"0d10ef3bfb88b3d653bc92d25af323a71e3fb582a47d59e8b7ae9598282ee8b5"} Dec 06 05:58:26 crc kubenswrapper[4733]: I1206 05:58:26.201196 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-2ztw7" event={"ID":"5589595d-741e-424a-955a-6fc8b83c18c1","Type":"ContainerStarted","Data":"178f6c2654f8b9ce16888dc56f65de1ac54b438d02a981271c0bd7dbe75092fa"} Dec 06 05:58:26 crc kubenswrapper[4733]: I1206 05:58:26.202235 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-2ztw7" Dec 06 05:58:26 crc kubenswrapper[4733]: I1206 05:58:26.231911 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"d0484be5-bcc0-4b5b-8aef-6c9573545b88","Type":"ContainerStarted","Data":"e11cd074f933267168b605e2543d29dbf067de9cc0c45801db38bda248631c59"} Dec 06 05:58:26 crc kubenswrapper[4733]: I1206 05:58:26.231986 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"d0484be5-bcc0-4b5b-8aef-6c9573545b88","Type":"ContainerStarted","Data":"f216eb24d7b2df6f5c005f824aef66d93a50f685411170b8d5fd7a9723e47ada"} Dec 06 05:58:26 crc kubenswrapper[4733]: I1206 05:58:26.236898 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-2ztw7" podStartSLOduration=18.635748222 podStartE2EDuration="25.236877864s" podCreationTimestamp="2025-12-06 05:58:01 +0000 UTC" firstStartedPulling="2025-12-06 05:58:18.449985164 +0000 UTC m=+882.315196275" lastFinishedPulling="2025-12-06 05:58:25.051114805 +0000 UTC m=+888.916325917" observedRunningTime="2025-12-06 05:58:26.231268988 +0000 UTC m=+890.096480099" watchObservedRunningTime="2025-12-06 05:58:26.236877864 +0000 UTC m=+890.102088975" Dec 06 05:58:26 crc kubenswrapper[4733]: I1206 05:58:26.251334 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=15.63964383 podStartE2EDuration="22.251320513s" podCreationTimestamp="2025-12-06 05:58:04 +0000 UTC" firstStartedPulling="2025-12-06 05:58:18.449009701 +0000 UTC m=+882.314220802" lastFinishedPulling="2025-12-06 05:58:25.060686374 +0000 UTC m=+888.925897485" observedRunningTime="2025-12-06 05:58:26.250600469 +0000 UTC m=+890.115811580" watchObservedRunningTime="2025-12-06 05:58:26.251320513 +0000 UTC m=+890.116531623" Dec 06 05:58:26 crc kubenswrapper[4733]: I1206 05:58:26.553666 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 06 05:58:26 crc kubenswrapper[4733]: I1206 05:58:26.554065 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 06 05:58:26 crc kubenswrapper[4733]: I1206 05:58:26.875620 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 06 05:58:27 crc kubenswrapper[4733]: I1206 05:58:27.041929 4733 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 06 05:58:27 crc kubenswrapper[4733]: I1206 05:58:27.071973 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Dec 06 05:58:27 crc kubenswrapper[4733]: I1206 05:58:27.203261 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Dec 06 05:58:27 crc kubenswrapper[4733]: I1206 05:58:27.233248 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 06 05:58:27 crc kubenswrapper[4733]: I1206 05:58:27.246258 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-4wzzg" event={"ID":"008ba5cf-a311-414d-9d06-a8ad4c038088","Type":"ContainerStarted","Data":"5ede1990a299f9cc1223065ecf7fd24f9ae38f508566975db1244a258b2e9383"} Dec 06 05:58:27 crc kubenswrapper[4733]: I1206 05:58:27.246330 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-4wzzg" event={"ID":"008ba5cf-a311-414d-9d06-a8ad4c038088","Type":"ContainerStarted","Data":"669adb2efc6e0c4e2041a08a46c42e45eca004d28f2cdee094c29cfb96ac0c2b"} Dec 06 05:58:27 crc kubenswrapper[4733]: I1206 05:58:27.246715 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 06 05:58:27 crc kubenswrapper[4733]: I1206 05:58:27.279037 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-4wzzg" podStartSLOduration=19.666324328 podStartE2EDuration="26.279017553s" podCreationTimestamp="2025-12-06 05:58:01 +0000 UTC" firstStartedPulling="2025-12-06 05:58:18.449791821 +0000 UTC m=+882.315002932" lastFinishedPulling="2025-12-06 05:58:25.062485046 +0000 UTC m=+888.927696157" observedRunningTime="2025-12-06 05:58:27.2746553 +0000 UTC m=+891.139866401" watchObservedRunningTime="2025-12-06 05:58:27.279017553 +0000 UTC 
m=+891.144228664" Dec 06 05:58:27 crc kubenswrapper[4733]: I1206 05:58:27.319643 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 06 05:58:28 crc kubenswrapper[4733]: I1206 05:58:28.254597 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-4wzzg" Dec 06 05:58:28 crc kubenswrapper[4733]: I1206 05:58:28.255169 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-4wzzg" Dec 06 05:58:28 crc kubenswrapper[4733]: I1206 05:58:28.368786 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-767d7fb4d9-dkbdm"] Dec 06 05:58:28 crc kubenswrapper[4733]: I1206 05:58:28.369059 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-767d7fb4d9-dkbdm" podUID="645e93e6-ca60-43c5-be46-24b1c34fdd7c" containerName="dnsmasq-dns" containerID="cri-o://7ad1c8354e87106f4d9b7f163decc6869123f95537445b26b9b6b18e14ec3430" gracePeriod=10 Dec 06 05:58:28 crc kubenswrapper[4733]: I1206 05:58:28.373685 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-767d7fb4d9-dkbdm" Dec 06 05:58:28 crc kubenswrapper[4733]: I1206 05:58:28.438355 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67fb8b8965-bmdkf"] Dec 06 05:58:28 crc kubenswrapper[4733]: E1206 05:58:28.439052 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="218fc499-054d-4805-b28c-6096d75e836d" containerName="init" Dec 06 05:58:28 crc kubenswrapper[4733]: I1206 05:58:28.439084 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="218fc499-054d-4805-b28c-6096d75e836d" containerName="init" Dec 06 05:58:28 crc kubenswrapper[4733]: E1206 05:58:28.439127 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97d7986b-ebf4-441a-87f5-4dc655e13234" containerName="init" Dec 06 05:58:28 crc 
kubenswrapper[4733]: I1206 05:58:28.439133 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="97d7986b-ebf4-441a-87f5-4dc655e13234" containerName="init" Dec 06 05:58:28 crc kubenswrapper[4733]: E1206 05:58:28.439151 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbab357f-c31f-4dff-9255-f19667d52997" containerName="init" Dec 06 05:58:28 crc kubenswrapper[4733]: I1206 05:58:28.439159 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbab357f-c31f-4dff-9255-f19667d52997" containerName="init" Dec 06 05:58:28 crc kubenswrapper[4733]: I1206 05:58:28.439517 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="218fc499-054d-4805-b28c-6096d75e836d" containerName="init" Dec 06 05:58:28 crc kubenswrapper[4733]: I1206 05:58:28.439546 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbab357f-c31f-4dff-9255-f19667d52997" containerName="init" Dec 06 05:58:28 crc kubenswrapper[4733]: I1206 05:58:28.439563 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="97d7986b-ebf4-441a-87f5-4dc655e13234" containerName="init" Dec 06 05:58:28 crc kubenswrapper[4733]: I1206 05:58:28.443588 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67fb8b8965-bmdkf" Dec 06 05:58:28 crc kubenswrapper[4733]: I1206 05:58:28.460951 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67fb8b8965-bmdkf"] Dec 06 05:58:28 crc kubenswrapper[4733]: I1206 05:58:28.516912 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e99bfc04-55e0-4ff7-be22-22c6e9cc8100-ovsdbserver-sb\") pod \"dnsmasq-dns-67fb8b8965-bmdkf\" (UID: \"e99bfc04-55e0-4ff7-be22-22c6e9cc8100\") " pod="openstack/dnsmasq-dns-67fb8b8965-bmdkf" Dec 06 05:58:28 crc kubenswrapper[4733]: I1206 05:58:28.516970 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e99bfc04-55e0-4ff7-be22-22c6e9cc8100-dns-svc\") pod \"dnsmasq-dns-67fb8b8965-bmdkf\" (UID: \"e99bfc04-55e0-4ff7-be22-22c6e9cc8100\") " pod="openstack/dnsmasq-dns-67fb8b8965-bmdkf" Dec 06 05:58:28 crc kubenswrapper[4733]: I1206 05:58:28.517107 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7xsv\" (UniqueName: \"kubernetes.io/projected/e99bfc04-55e0-4ff7-be22-22c6e9cc8100-kube-api-access-b7xsv\") pod \"dnsmasq-dns-67fb8b8965-bmdkf\" (UID: \"e99bfc04-55e0-4ff7-be22-22c6e9cc8100\") " pod="openstack/dnsmasq-dns-67fb8b8965-bmdkf" Dec 06 05:58:28 crc kubenswrapper[4733]: I1206 05:58:28.517195 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e99bfc04-55e0-4ff7-be22-22c6e9cc8100-config\") pod \"dnsmasq-dns-67fb8b8965-bmdkf\" (UID: \"e99bfc04-55e0-4ff7-be22-22c6e9cc8100\") " pod="openstack/dnsmasq-dns-67fb8b8965-bmdkf" Dec 06 05:58:28 crc kubenswrapper[4733]: I1206 05:58:28.517250 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e99bfc04-55e0-4ff7-be22-22c6e9cc8100-ovsdbserver-nb\") pod \"dnsmasq-dns-67fb8b8965-bmdkf\" (UID: \"e99bfc04-55e0-4ff7-be22-22c6e9cc8100\") " pod="openstack/dnsmasq-dns-67fb8b8965-bmdkf" Dec 06 05:58:28 crc kubenswrapper[4733]: I1206 05:58:28.618691 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e99bfc04-55e0-4ff7-be22-22c6e9cc8100-ovsdbserver-sb\") pod \"dnsmasq-dns-67fb8b8965-bmdkf\" (UID: \"e99bfc04-55e0-4ff7-be22-22c6e9cc8100\") " pod="openstack/dnsmasq-dns-67fb8b8965-bmdkf" Dec 06 05:58:28 crc kubenswrapper[4733]: I1206 05:58:28.618732 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e99bfc04-55e0-4ff7-be22-22c6e9cc8100-dns-svc\") pod \"dnsmasq-dns-67fb8b8965-bmdkf\" (UID: \"e99bfc04-55e0-4ff7-be22-22c6e9cc8100\") " pod="openstack/dnsmasq-dns-67fb8b8965-bmdkf" Dec 06 05:58:28 crc kubenswrapper[4733]: I1206 05:58:28.618781 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7xsv\" (UniqueName: \"kubernetes.io/projected/e99bfc04-55e0-4ff7-be22-22c6e9cc8100-kube-api-access-b7xsv\") pod \"dnsmasq-dns-67fb8b8965-bmdkf\" (UID: \"e99bfc04-55e0-4ff7-be22-22c6e9cc8100\") " pod="openstack/dnsmasq-dns-67fb8b8965-bmdkf" Dec 06 05:58:28 crc kubenswrapper[4733]: I1206 05:58:28.619112 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e99bfc04-55e0-4ff7-be22-22c6e9cc8100-config\") pod \"dnsmasq-dns-67fb8b8965-bmdkf\" (UID: \"e99bfc04-55e0-4ff7-be22-22c6e9cc8100\") " pod="openstack/dnsmasq-dns-67fb8b8965-bmdkf" Dec 06 05:58:28 crc kubenswrapper[4733]: I1206 05:58:28.620229 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/e99bfc04-55e0-4ff7-be22-22c6e9cc8100-ovsdbserver-nb\") pod \"dnsmasq-dns-67fb8b8965-bmdkf\" (UID: \"e99bfc04-55e0-4ff7-be22-22c6e9cc8100\") " pod="openstack/dnsmasq-dns-67fb8b8965-bmdkf" Dec 06 05:58:28 crc kubenswrapper[4733]: I1206 05:58:28.619818 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e99bfc04-55e0-4ff7-be22-22c6e9cc8100-dns-svc\") pod \"dnsmasq-dns-67fb8b8965-bmdkf\" (UID: \"e99bfc04-55e0-4ff7-be22-22c6e9cc8100\") " pod="openstack/dnsmasq-dns-67fb8b8965-bmdkf" Dec 06 05:58:28 crc kubenswrapper[4733]: I1206 05:58:28.620149 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e99bfc04-55e0-4ff7-be22-22c6e9cc8100-config\") pod \"dnsmasq-dns-67fb8b8965-bmdkf\" (UID: \"e99bfc04-55e0-4ff7-be22-22c6e9cc8100\") " pod="openstack/dnsmasq-dns-67fb8b8965-bmdkf" Dec 06 05:58:28 crc kubenswrapper[4733]: I1206 05:58:28.619673 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e99bfc04-55e0-4ff7-be22-22c6e9cc8100-ovsdbserver-sb\") pod \"dnsmasq-dns-67fb8b8965-bmdkf\" (UID: \"e99bfc04-55e0-4ff7-be22-22c6e9cc8100\") " pod="openstack/dnsmasq-dns-67fb8b8965-bmdkf" Dec 06 05:58:28 crc kubenswrapper[4733]: I1206 05:58:28.621012 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e99bfc04-55e0-4ff7-be22-22c6e9cc8100-ovsdbserver-nb\") pod \"dnsmasq-dns-67fb8b8965-bmdkf\" (UID: \"e99bfc04-55e0-4ff7-be22-22c6e9cc8100\") " pod="openstack/dnsmasq-dns-67fb8b8965-bmdkf" Dec 06 05:58:28 crc kubenswrapper[4733]: I1206 05:58:28.648531 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7xsv\" (UniqueName: \"kubernetes.io/projected/e99bfc04-55e0-4ff7-be22-22c6e9cc8100-kube-api-access-b7xsv\") pod 
\"dnsmasq-dns-67fb8b8965-bmdkf\" (UID: \"e99bfc04-55e0-4ff7-be22-22c6e9cc8100\") " pod="openstack/dnsmasq-dns-67fb8b8965-bmdkf" Dec 06 05:58:28 crc kubenswrapper[4733]: I1206 05:58:28.759821 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67fb8b8965-bmdkf" Dec 06 05:58:28 crc kubenswrapper[4733]: I1206 05:58:28.825563 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-767d7fb4d9-dkbdm" Dec 06 05:58:28 crc kubenswrapper[4733]: I1206 05:58:28.925168 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d75fz\" (UniqueName: \"kubernetes.io/projected/645e93e6-ca60-43c5-be46-24b1c34fdd7c-kube-api-access-d75fz\") pod \"645e93e6-ca60-43c5-be46-24b1c34fdd7c\" (UID: \"645e93e6-ca60-43c5-be46-24b1c34fdd7c\") " Dec 06 05:58:28 crc kubenswrapper[4733]: I1206 05:58:28.925525 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/645e93e6-ca60-43c5-be46-24b1c34fdd7c-dns-svc\") pod \"645e93e6-ca60-43c5-be46-24b1c34fdd7c\" (UID: \"645e93e6-ca60-43c5-be46-24b1c34fdd7c\") " Dec 06 05:58:28 crc kubenswrapper[4733]: I1206 05:58:28.925720 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/645e93e6-ca60-43c5-be46-24b1c34fdd7c-ovsdbserver-nb\") pod \"645e93e6-ca60-43c5-be46-24b1c34fdd7c\" (UID: \"645e93e6-ca60-43c5-be46-24b1c34fdd7c\") " Dec 06 05:58:28 crc kubenswrapper[4733]: I1206 05:58:28.925741 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/645e93e6-ca60-43c5-be46-24b1c34fdd7c-config\") pod \"645e93e6-ca60-43c5-be46-24b1c34fdd7c\" (UID: \"645e93e6-ca60-43c5-be46-24b1c34fdd7c\") " Dec 06 05:58:28 crc kubenswrapper[4733]: I1206 05:58:28.930548 4733 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/645e93e6-ca60-43c5-be46-24b1c34fdd7c-kube-api-access-d75fz" (OuterVolumeSpecName: "kube-api-access-d75fz") pod "645e93e6-ca60-43c5-be46-24b1c34fdd7c" (UID: "645e93e6-ca60-43c5-be46-24b1c34fdd7c"). InnerVolumeSpecName "kube-api-access-d75fz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:58:28 crc kubenswrapper[4733]: I1206 05:58:28.961037 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/645e93e6-ca60-43c5-be46-24b1c34fdd7c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "645e93e6-ca60-43c5-be46-24b1c34fdd7c" (UID: "645e93e6-ca60-43c5-be46-24b1c34fdd7c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:58:28 crc kubenswrapper[4733]: I1206 05:58:28.966720 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/645e93e6-ca60-43c5-be46-24b1c34fdd7c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "645e93e6-ca60-43c5-be46-24b1c34fdd7c" (UID: "645e93e6-ca60-43c5-be46-24b1c34fdd7c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:58:28 crc kubenswrapper[4733]: I1206 05:58:28.977384 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/645e93e6-ca60-43c5-be46-24b1c34fdd7c-config" (OuterVolumeSpecName: "config") pod "645e93e6-ca60-43c5-be46-24b1c34fdd7c" (UID: "645e93e6-ca60-43c5-be46-24b1c34fdd7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:58:29 crc kubenswrapper[4733]: I1206 05:58:29.028027 4733 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/645e93e6-ca60-43c5-be46-24b1c34fdd7c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 06 05:58:29 crc kubenswrapper[4733]: I1206 05:58:29.028083 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/645e93e6-ca60-43c5-be46-24b1c34fdd7c-config\") on node \"crc\" DevicePath \"\"" Dec 06 05:58:29 crc kubenswrapper[4733]: I1206 05:58:29.028094 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d75fz\" (UniqueName: \"kubernetes.io/projected/645e93e6-ca60-43c5-be46-24b1c34fdd7c-kube-api-access-d75fz\") on node \"crc\" DevicePath \"\"" Dec 06 05:58:29 crc kubenswrapper[4733]: I1206 05:58:29.028108 4733 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/645e93e6-ca60-43c5-be46-24b1c34fdd7c-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 05:58:29 crc kubenswrapper[4733]: I1206 05:58:29.158280 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67fb8b8965-bmdkf"] Dec 06 05:58:29 crc kubenswrapper[4733]: I1206 05:58:29.266346 4733 generic.go:334] "Generic (PLEG): container finished" podID="645e93e6-ca60-43c5-be46-24b1c34fdd7c" containerID="7ad1c8354e87106f4d9b7f163decc6869123f95537445b26b9b6b18e14ec3430" exitCode=0 Dec 06 05:58:29 crc kubenswrapper[4733]: I1206 05:58:29.266408 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-767d7fb4d9-dkbdm" event={"ID":"645e93e6-ca60-43c5-be46-24b1c34fdd7c","Type":"ContainerDied","Data":"7ad1c8354e87106f4d9b7f163decc6869123f95537445b26b9b6b18e14ec3430"} Dec 06 05:58:29 crc kubenswrapper[4733]: I1206 05:58:29.266477 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-767d7fb4d9-dkbdm" event={"ID":"645e93e6-ca60-43c5-be46-24b1c34fdd7c","Type":"ContainerDied","Data":"a556e645a23412986abbc0c3313f4835e6d40845933c718dc89636688cc8bafc"} Dec 06 05:58:29 crc kubenswrapper[4733]: I1206 05:58:29.266476 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-767d7fb4d9-dkbdm" Dec 06 05:58:29 crc kubenswrapper[4733]: I1206 05:58:29.266502 4733 scope.go:117] "RemoveContainer" containerID="7ad1c8354e87106f4d9b7f163decc6869123f95537445b26b9b6b18e14ec3430" Dec 06 05:58:29 crc kubenswrapper[4733]: I1206 05:58:29.267869 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67fb8b8965-bmdkf" event={"ID":"e99bfc04-55e0-4ff7-be22-22c6e9cc8100","Type":"ContainerStarted","Data":"3f430cc3eabcb1377646beaf6293a33e0dc0d9f133d57b0c218dab7c60a65836"} Dec 06 05:58:29 crc kubenswrapper[4733]: I1206 05:58:29.287830 4733 scope.go:117] "RemoveContainer" containerID="c3faa7f083138de0de0efc79c436ea9ef4115d63c49c18cf6f940c1dd04f35bd" Dec 06 05:58:29 crc kubenswrapper[4733]: I1206 05:58:29.298217 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-767d7fb4d9-dkbdm"] Dec 06 05:58:29 crc kubenswrapper[4733]: I1206 05:58:29.303549 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-767d7fb4d9-dkbdm"] Dec 06 05:58:29 crc kubenswrapper[4733]: I1206 05:58:29.338208 4733 scope.go:117] "RemoveContainer" containerID="7ad1c8354e87106f4d9b7f163decc6869123f95537445b26b9b6b18e14ec3430" Dec 06 05:58:29 crc kubenswrapper[4733]: E1206 05:58:29.340517 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ad1c8354e87106f4d9b7f163decc6869123f95537445b26b9b6b18e14ec3430\": container with ID starting with 7ad1c8354e87106f4d9b7f163decc6869123f95537445b26b9b6b18e14ec3430 not found: ID does not exist" 
containerID="7ad1c8354e87106f4d9b7f163decc6869123f95537445b26b9b6b18e14ec3430" Dec 06 05:58:29 crc kubenswrapper[4733]: I1206 05:58:29.340587 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ad1c8354e87106f4d9b7f163decc6869123f95537445b26b9b6b18e14ec3430"} err="failed to get container status \"7ad1c8354e87106f4d9b7f163decc6869123f95537445b26b9b6b18e14ec3430\": rpc error: code = NotFound desc = could not find container \"7ad1c8354e87106f4d9b7f163decc6869123f95537445b26b9b6b18e14ec3430\": container with ID starting with 7ad1c8354e87106f4d9b7f163decc6869123f95537445b26b9b6b18e14ec3430 not found: ID does not exist" Dec 06 05:58:29 crc kubenswrapper[4733]: I1206 05:58:29.340623 4733 scope.go:117] "RemoveContainer" containerID="c3faa7f083138de0de0efc79c436ea9ef4115d63c49c18cf6f940c1dd04f35bd" Dec 06 05:58:29 crc kubenswrapper[4733]: E1206 05:58:29.343674 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3faa7f083138de0de0efc79c436ea9ef4115d63c49c18cf6f940c1dd04f35bd\": container with ID starting with c3faa7f083138de0de0efc79c436ea9ef4115d63c49c18cf6f940c1dd04f35bd not found: ID does not exist" containerID="c3faa7f083138de0de0efc79c436ea9ef4115d63c49c18cf6f940c1dd04f35bd" Dec 06 05:58:29 crc kubenswrapper[4733]: I1206 05:58:29.343720 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3faa7f083138de0de0efc79c436ea9ef4115d63c49c18cf6f940c1dd04f35bd"} err="failed to get container status \"c3faa7f083138de0de0efc79c436ea9ef4115d63c49c18cf6f940c1dd04f35bd\": rpc error: code = NotFound desc = could not find container \"c3faa7f083138de0de0efc79c436ea9ef4115d63c49c18cf6f940c1dd04f35bd\": container with ID starting with c3faa7f083138de0de0efc79c436ea9ef4115d63c49c18cf6f940c1dd04f35bd not found: ID does not exist" Dec 06 05:58:29 crc kubenswrapper[4733]: I1206 05:58:29.523064 4733 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Dec 06 05:58:29 crc kubenswrapper[4733]: E1206 05:58:29.523409 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="645e93e6-ca60-43c5-be46-24b1c34fdd7c" containerName="init" Dec 06 05:58:29 crc kubenswrapper[4733]: I1206 05:58:29.523427 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="645e93e6-ca60-43c5-be46-24b1c34fdd7c" containerName="init" Dec 06 05:58:29 crc kubenswrapper[4733]: E1206 05:58:29.523441 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="645e93e6-ca60-43c5-be46-24b1c34fdd7c" containerName="dnsmasq-dns" Dec 06 05:58:29 crc kubenswrapper[4733]: I1206 05:58:29.523447 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="645e93e6-ca60-43c5-be46-24b1c34fdd7c" containerName="dnsmasq-dns" Dec 06 05:58:29 crc kubenswrapper[4733]: I1206 05:58:29.523604 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="645e93e6-ca60-43c5-be46-24b1c34fdd7c" containerName="dnsmasq-dns" Dec 06 05:58:29 crc kubenswrapper[4733]: I1206 05:58:29.527483 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Dec 06 05:58:29 crc kubenswrapper[4733]: I1206 05:58:29.530247 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Dec 06 05:58:29 crc kubenswrapper[4733]: I1206 05:58:29.530653 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Dec 06 05:58:29 crc kubenswrapper[4733]: I1206 05:58:29.530868 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-ll6fz" Dec 06 05:58:29 crc kubenswrapper[4733]: I1206 05:58:29.530886 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Dec 06 05:58:29 crc kubenswrapper[4733]: I1206 05:58:29.555223 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 06 05:58:29 crc kubenswrapper[4733]: I1206 05:58:29.579716 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 06 05:58:29 crc kubenswrapper[4733]: I1206 05:58:29.638486 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/a55915f4-28cf-4343-aefa-e6b145b3ccf1-lock\") pod \"swift-storage-0\" (UID: \"a55915f4-28cf-4343-aefa-e6b145b3ccf1\") " pod="openstack/swift-storage-0" Dec 06 05:58:29 crc kubenswrapper[4733]: I1206 05:58:29.638603 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a55915f4-28cf-4343-aefa-e6b145b3ccf1-etc-swift\") pod \"swift-storage-0\" (UID: \"a55915f4-28cf-4343-aefa-e6b145b3ccf1\") " pod="openstack/swift-storage-0" Dec 06 05:58:29 crc kubenswrapper[4733]: I1206 05:58:29.638641 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"a55915f4-28cf-4343-aefa-e6b145b3ccf1\") " pod="openstack/swift-storage-0" Dec 06 05:58:29 crc kubenswrapper[4733]: I1206 05:58:29.638851 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/a55915f4-28cf-4343-aefa-e6b145b3ccf1-cache\") pod \"swift-storage-0\" (UID: \"a55915f4-28cf-4343-aefa-e6b145b3ccf1\") " pod="openstack/swift-storage-0" Dec 06 05:58:29 crc kubenswrapper[4733]: I1206 05:58:29.638944 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sh5w8\" (UniqueName: \"kubernetes.io/projected/a55915f4-28cf-4343-aefa-e6b145b3ccf1-kube-api-access-sh5w8\") pod \"swift-storage-0\" (UID: \"a55915f4-28cf-4343-aefa-e6b145b3ccf1\") " pod="openstack/swift-storage-0" Dec 06 05:58:29 crc kubenswrapper[4733]: I1206 05:58:29.645867 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 06 05:58:29 crc kubenswrapper[4733]: I1206 05:58:29.740768 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/a55915f4-28cf-4343-aefa-e6b145b3ccf1-cache\") pod \"swift-storage-0\" (UID: \"a55915f4-28cf-4343-aefa-e6b145b3ccf1\") " pod="openstack/swift-storage-0" Dec 06 05:58:29 crc kubenswrapper[4733]: I1206 05:58:29.740818 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sh5w8\" (UniqueName: \"kubernetes.io/projected/a55915f4-28cf-4343-aefa-e6b145b3ccf1-kube-api-access-sh5w8\") pod \"swift-storage-0\" (UID: \"a55915f4-28cf-4343-aefa-e6b145b3ccf1\") " pod="openstack/swift-storage-0" Dec 06 05:58:29 crc kubenswrapper[4733]: I1206 05:58:29.740884 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: 
\"kubernetes.io/empty-dir/a55915f4-28cf-4343-aefa-e6b145b3ccf1-lock\") pod \"swift-storage-0\" (UID: \"a55915f4-28cf-4343-aefa-e6b145b3ccf1\") " pod="openstack/swift-storage-0" Dec 06 05:58:29 crc kubenswrapper[4733]: I1206 05:58:29.740986 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a55915f4-28cf-4343-aefa-e6b145b3ccf1-etc-swift\") pod \"swift-storage-0\" (UID: \"a55915f4-28cf-4343-aefa-e6b145b3ccf1\") " pod="openstack/swift-storage-0" Dec 06 05:58:29 crc kubenswrapper[4733]: I1206 05:58:29.741026 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"a55915f4-28cf-4343-aefa-e6b145b3ccf1\") " pod="openstack/swift-storage-0" Dec 06 05:58:29 crc kubenswrapper[4733]: I1206 05:58:29.741254 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/a55915f4-28cf-4343-aefa-e6b145b3ccf1-cache\") pod \"swift-storage-0\" (UID: \"a55915f4-28cf-4343-aefa-e6b145b3ccf1\") " pod="openstack/swift-storage-0" Dec 06 05:58:29 crc kubenswrapper[4733]: E1206 05:58:29.741266 4733 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 06 05:58:29 crc kubenswrapper[4733]: E1206 05:58:29.741337 4733 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 06 05:58:29 crc kubenswrapper[4733]: E1206 05:58:29.741389 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a55915f4-28cf-4343-aefa-e6b145b3ccf1-etc-swift podName:a55915f4-28cf-4343-aefa-e6b145b3ccf1 nodeName:}" failed. No retries permitted until 2025-12-06 05:58:30.241371624 +0000 UTC m=+894.106582735 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/a55915f4-28cf-4343-aefa-e6b145b3ccf1-etc-swift") pod "swift-storage-0" (UID: "a55915f4-28cf-4343-aefa-e6b145b3ccf1") : configmap "swift-ring-files" not found Dec 06 05:58:29 crc kubenswrapper[4733]: I1206 05:58:29.741472 4733 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"a55915f4-28cf-4343-aefa-e6b145b3ccf1\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/swift-storage-0" Dec 06 05:58:29 crc kubenswrapper[4733]: I1206 05:58:29.741670 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/a55915f4-28cf-4343-aefa-e6b145b3ccf1-lock\") pod \"swift-storage-0\" (UID: \"a55915f4-28cf-4343-aefa-e6b145b3ccf1\") " pod="openstack/swift-storage-0" Dec 06 05:58:29 crc kubenswrapper[4733]: I1206 05:58:29.755471 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sh5w8\" (UniqueName: \"kubernetes.io/projected/a55915f4-28cf-4343-aefa-e6b145b3ccf1-kube-api-access-sh5w8\") pod \"swift-storage-0\" (UID: \"a55915f4-28cf-4343-aefa-e6b145b3ccf1\") " pod="openstack/swift-storage-0" Dec 06 05:58:29 crc kubenswrapper[4733]: I1206 05:58:29.759076 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"a55915f4-28cf-4343-aefa-e6b145b3ccf1\") " pod="openstack/swift-storage-0" Dec 06 05:58:30 crc kubenswrapper[4733]: I1206 05:58:30.023144 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-s28f8"] Dec 06 05:58:30 crc kubenswrapper[4733]: I1206 05:58:30.024112 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-s28f8" Dec 06 05:58:30 crc kubenswrapper[4733]: I1206 05:58:30.026152 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 06 05:58:30 crc kubenswrapper[4733]: I1206 05:58:30.026155 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 06 05:58:30 crc kubenswrapper[4733]: I1206 05:58:30.026583 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 06 05:58:30 crc kubenswrapper[4733]: I1206 05:58:30.031675 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-s28f8"] Dec 06 05:58:30 crc kubenswrapper[4733]: I1206 05:58:30.149966 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1d720cd5-bb4e-449f-86a3-c9cff2acfada-etc-swift\") pod \"swift-ring-rebalance-s28f8\" (UID: \"1d720cd5-bb4e-449f-86a3-c9cff2acfada\") " pod="openstack/swift-ring-rebalance-s28f8" Dec 06 05:58:30 crc kubenswrapper[4733]: I1206 05:58:30.150015 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1d720cd5-bb4e-449f-86a3-c9cff2acfada-dispersionconf\") pod \"swift-ring-rebalance-s28f8\" (UID: \"1d720cd5-bb4e-449f-86a3-c9cff2acfada\") " pod="openstack/swift-ring-rebalance-s28f8" Dec 06 05:58:30 crc kubenswrapper[4733]: I1206 05:58:30.150067 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1d720cd5-bb4e-449f-86a3-c9cff2acfada-scripts\") pod \"swift-ring-rebalance-s28f8\" (UID: \"1d720cd5-bb4e-449f-86a3-c9cff2acfada\") " pod="openstack/swift-ring-rebalance-s28f8" Dec 06 05:58:30 crc kubenswrapper[4733]: I1206 05:58:30.150238 4733 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d720cd5-bb4e-449f-86a3-c9cff2acfada-combined-ca-bundle\") pod \"swift-ring-rebalance-s28f8\" (UID: \"1d720cd5-bb4e-449f-86a3-c9cff2acfada\") " pod="openstack/swift-ring-rebalance-s28f8" Dec 06 05:58:30 crc kubenswrapper[4733]: I1206 05:58:30.150467 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1d720cd5-bb4e-449f-86a3-c9cff2acfada-ring-data-devices\") pod \"swift-ring-rebalance-s28f8\" (UID: \"1d720cd5-bb4e-449f-86a3-c9cff2acfada\") " pod="openstack/swift-ring-rebalance-s28f8" Dec 06 05:58:30 crc kubenswrapper[4733]: I1206 05:58:30.150528 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1d720cd5-bb4e-449f-86a3-c9cff2acfada-swiftconf\") pod \"swift-ring-rebalance-s28f8\" (UID: \"1d720cd5-bb4e-449f-86a3-c9cff2acfada\") " pod="openstack/swift-ring-rebalance-s28f8" Dec 06 05:58:30 crc kubenswrapper[4733]: I1206 05:58:30.150610 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbf6t\" (UniqueName: \"kubernetes.io/projected/1d720cd5-bb4e-449f-86a3-c9cff2acfada-kube-api-access-rbf6t\") pod \"swift-ring-rebalance-s28f8\" (UID: \"1d720cd5-bb4e-449f-86a3-c9cff2acfada\") " pod="openstack/swift-ring-rebalance-s28f8" Dec 06 05:58:30 crc kubenswrapper[4733]: I1206 05:58:30.233171 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Dec 06 05:58:30 crc kubenswrapper[4733]: I1206 05:58:30.233619 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Dec 06 05:58:30 crc kubenswrapper[4733]: I1206 05:58:30.252907 4733 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d720cd5-bb4e-449f-86a3-c9cff2acfada-combined-ca-bundle\") pod \"swift-ring-rebalance-s28f8\" (UID: \"1d720cd5-bb4e-449f-86a3-c9cff2acfada\") " pod="openstack/swift-ring-rebalance-s28f8" Dec 06 05:58:30 crc kubenswrapper[4733]: I1206 05:58:30.253050 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1d720cd5-bb4e-449f-86a3-c9cff2acfada-ring-data-devices\") pod \"swift-ring-rebalance-s28f8\" (UID: \"1d720cd5-bb4e-449f-86a3-c9cff2acfada\") " pod="openstack/swift-ring-rebalance-s28f8" Dec 06 05:58:30 crc kubenswrapper[4733]: I1206 05:58:30.253096 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1d720cd5-bb4e-449f-86a3-c9cff2acfada-swiftconf\") pod \"swift-ring-rebalance-s28f8\" (UID: \"1d720cd5-bb4e-449f-86a3-c9cff2acfada\") " pod="openstack/swift-ring-rebalance-s28f8" Dec 06 05:58:30 crc kubenswrapper[4733]: I1206 05:58:30.253129 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbf6t\" (UniqueName: \"kubernetes.io/projected/1d720cd5-bb4e-449f-86a3-c9cff2acfada-kube-api-access-rbf6t\") pod \"swift-ring-rebalance-s28f8\" (UID: \"1d720cd5-bb4e-449f-86a3-c9cff2acfada\") " pod="openstack/swift-ring-rebalance-s28f8" Dec 06 05:58:30 crc kubenswrapper[4733]: I1206 05:58:30.253191 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a55915f4-28cf-4343-aefa-e6b145b3ccf1-etc-swift\") pod \"swift-storage-0\" (UID: \"a55915f4-28cf-4343-aefa-e6b145b3ccf1\") " pod="openstack/swift-storage-0" Dec 06 05:58:30 crc kubenswrapper[4733]: I1206 05:58:30.253233 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/1d720cd5-bb4e-449f-86a3-c9cff2acfada-etc-swift\") pod \"swift-ring-rebalance-s28f8\" (UID: \"1d720cd5-bb4e-449f-86a3-c9cff2acfada\") " pod="openstack/swift-ring-rebalance-s28f8" Dec 06 05:58:30 crc kubenswrapper[4733]: I1206 05:58:30.253263 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1d720cd5-bb4e-449f-86a3-c9cff2acfada-dispersionconf\") pod \"swift-ring-rebalance-s28f8\" (UID: \"1d720cd5-bb4e-449f-86a3-c9cff2acfada\") " pod="openstack/swift-ring-rebalance-s28f8" Dec 06 05:58:30 crc kubenswrapper[4733]: I1206 05:58:30.253343 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1d720cd5-bb4e-449f-86a3-c9cff2acfada-scripts\") pod \"swift-ring-rebalance-s28f8\" (UID: \"1d720cd5-bb4e-449f-86a3-c9cff2acfada\") " pod="openstack/swift-ring-rebalance-s28f8" Dec 06 05:58:30 crc kubenswrapper[4733]: E1206 05:58:30.254151 4733 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 06 05:58:30 crc kubenswrapper[4733]: E1206 05:58:30.254205 4733 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 06 05:58:30 crc kubenswrapper[4733]: E1206 05:58:30.254488 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a55915f4-28cf-4343-aefa-e6b145b3ccf1-etc-swift podName:a55915f4-28cf-4343-aefa-e6b145b3ccf1 nodeName:}" failed. No retries permitted until 2025-12-06 05:58:31.254467644 +0000 UTC m=+895.119678756 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/a55915f4-28cf-4343-aefa-e6b145b3ccf1-etc-swift") pod "swift-storage-0" (UID: "a55915f4-28cf-4343-aefa-e6b145b3ccf1") : configmap "swift-ring-files" not found Dec 06 05:58:30 crc kubenswrapper[4733]: I1206 05:58:30.254652 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1d720cd5-bb4e-449f-86a3-c9cff2acfada-scripts\") pod \"swift-ring-rebalance-s28f8\" (UID: \"1d720cd5-bb4e-449f-86a3-c9cff2acfada\") " pod="openstack/swift-ring-rebalance-s28f8" Dec 06 05:58:30 crc kubenswrapper[4733]: I1206 05:58:30.254770 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1d720cd5-bb4e-449f-86a3-c9cff2acfada-ring-data-devices\") pod \"swift-ring-rebalance-s28f8\" (UID: \"1d720cd5-bb4e-449f-86a3-c9cff2acfada\") " pod="openstack/swift-ring-rebalance-s28f8" Dec 06 05:58:30 crc kubenswrapper[4733]: I1206 05:58:30.254936 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1d720cd5-bb4e-449f-86a3-c9cff2acfada-etc-swift\") pod \"swift-ring-rebalance-s28f8\" (UID: \"1d720cd5-bb4e-449f-86a3-c9cff2acfada\") " pod="openstack/swift-ring-rebalance-s28f8" Dec 06 05:58:30 crc kubenswrapper[4733]: I1206 05:58:30.258915 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1d720cd5-bb4e-449f-86a3-c9cff2acfada-dispersionconf\") pod \"swift-ring-rebalance-s28f8\" (UID: \"1d720cd5-bb4e-449f-86a3-c9cff2acfada\") " pod="openstack/swift-ring-rebalance-s28f8" Dec 06 05:58:30 crc kubenswrapper[4733]: I1206 05:58:30.259131 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d720cd5-bb4e-449f-86a3-c9cff2acfada-combined-ca-bundle\") pod 
\"swift-ring-rebalance-s28f8\" (UID: \"1d720cd5-bb4e-449f-86a3-c9cff2acfada\") " pod="openstack/swift-ring-rebalance-s28f8" Dec 06 05:58:30 crc kubenswrapper[4733]: I1206 05:58:30.259169 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1d720cd5-bb4e-449f-86a3-c9cff2acfada-swiftconf\") pod \"swift-ring-rebalance-s28f8\" (UID: \"1d720cd5-bb4e-449f-86a3-c9cff2acfada\") " pod="openstack/swift-ring-rebalance-s28f8" Dec 06 05:58:30 crc kubenswrapper[4733]: I1206 05:58:30.270289 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbf6t\" (UniqueName: \"kubernetes.io/projected/1d720cd5-bb4e-449f-86a3-c9cff2acfada-kube-api-access-rbf6t\") pod \"swift-ring-rebalance-s28f8\" (UID: \"1d720cd5-bb4e-449f-86a3-c9cff2acfada\") " pod="openstack/swift-ring-rebalance-s28f8" Dec 06 05:58:30 crc kubenswrapper[4733]: I1206 05:58:30.271158 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Dec 06 05:58:30 crc kubenswrapper[4733]: I1206 05:58:30.278739 4733 generic.go:334] "Generic (PLEG): container finished" podID="e99bfc04-55e0-4ff7-be22-22c6e9cc8100" containerID="8430a895840b043724f2787a4d77bd43cddfeb6bdecc52d1e0116db65ec0b949" exitCode=0 Dec 06 05:58:30 crc kubenswrapper[4733]: I1206 05:58:30.278785 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67fb8b8965-bmdkf" event={"ID":"e99bfc04-55e0-4ff7-be22-22c6e9cc8100","Type":"ContainerDied","Data":"8430a895840b043724f2787a4d77bd43cddfeb6bdecc52d1e0116db65ec0b949"} Dec 06 05:58:30 crc kubenswrapper[4733]: I1206 05:58:30.339719 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-s28f8" Dec 06 05:58:30 crc kubenswrapper[4733]: I1206 05:58:30.498686 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="645e93e6-ca60-43c5-be46-24b1c34fdd7c" path="/var/lib/kubelet/pods/645e93e6-ca60-43c5-be46-24b1c34fdd7c/volumes" Dec 06 05:58:30 crc kubenswrapper[4733]: I1206 05:58:30.499848 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-78d59ccb8c-62d2b" Dec 06 05:58:30 crc kubenswrapper[4733]: I1206 05:58:30.755831 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-s28f8"] Dec 06 05:58:30 crc kubenswrapper[4733]: W1206 05:58:30.757387 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d720cd5_bb4e_449f_86a3_c9cff2acfada.slice/crio-398fe7eead7ecc2480018daf0e80c38440c21a9ee5e429ebdc725f1b6a80b2b8 WatchSource:0}: Error finding container 398fe7eead7ecc2480018daf0e80c38440c21a9ee5e429ebdc725f1b6a80b2b8: Status 404 returned error can't find the container with id 398fe7eead7ecc2480018daf0e80c38440c21a9ee5e429ebdc725f1b6a80b2b8 Dec 06 05:58:31 crc kubenswrapper[4733]: I1206 05:58:31.270703 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a55915f4-28cf-4343-aefa-e6b145b3ccf1-etc-swift\") pod \"swift-storage-0\" (UID: \"a55915f4-28cf-4343-aefa-e6b145b3ccf1\") " pod="openstack/swift-storage-0" Dec 06 05:58:31 crc kubenswrapper[4733]: E1206 05:58:31.270953 4733 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 06 05:58:31 crc kubenswrapper[4733]: E1206 05:58:31.270997 4733 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 06 05:58:31 crc kubenswrapper[4733]: E1206 
05:58:31.271088 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a55915f4-28cf-4343-aefa-e6b145b3ccf1-etc-swift podName:a55915f4-28cf-4343-aefa-e6b145b3ccf1 nodeName:}" failed. No retries permitted until 2025-12-06 05:58:33.271069011 +0000 UTC m=+897.136280121 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/a55915f4-28cf-4343-aefa-e6b145b3ccf1-etc-swift") pod "swift-storage-0" (UID: "a55915f4-28cf-4343-aefa-e6b145b3ccf1") : configmap "swift-ring-files" not found Dec 06 05:58:31 crc kubenswrapper[4733]: I1206 05:58:31.304335 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67fb8b8965-bmdkf" event={"ID":"e99bfc04-55e0-4ff7-be22-22c6e9cc8100","Type":"ContainerStarted","Data":"12c65d99bf2519aa1372aa4e393f577ad266a0424ace9155a70da7812c3860f5"} Dec 06 05:58:31 crc kubenswrapper[4733]: I1206 05:58:31.304431 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67fb8b8965-bmdkf" Dec 06 05:58:31 crc kubenswrapper[4733]: I1206 05:58:31.306287 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-s28f8" event={"ID":"1d720cd5-bb4e-449f-86a3-c9cff2acfada","Type":"ContainerStarted","Data":"398fe7eead7ecc2480018daf0e80c38440c21a9ee5e429ebdc725f1b6a80b2b8"} Dec 06 05:58:31 crc kubenswrapper[4733]: I1206 05:58:31.331814 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67fb8b8965-bmdkf" podStartSLOduration=3.331803045 podStartE2EDuration="3.331803045s" podCreationTimestamp="2025-12-06 05:58:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:58:31.32708855 +0000 UTC m=+895.192299661" watchObservedRunningTime="2025-12-06 05:58:31.331803045 +0000 UTC m=+895.197014156" Dec 06 05:58:31 crc kubenswrapper[4733]: I1206 
05:58:31.403809 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-d5pfq"] Dec 06 05:58:31 crc kubenswrapper[4733]: I1206 05:58:31.405248 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d5pfq" Dec 06 05:58:31 crc kubenswrapper[4733]: I1206 05:58:31.418498 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d5pfq"] Dec 06 05:58:31 crc kubenswrapper[4733]: I1206 05:58:31.474365 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6d5166f-3501-4698-8da3-499dad07de3b-catalog-content\") pod \"redhat-marketplace-d5pfq\" (UID: \"d6d5166f-3501-4698-8da3-499dad07de3b\") " pod="openshift-marketplace/redhat-marketplace-d5pfq" Dec 06 05:58:31 crc kubenswrapper[4733]: I1206 05:58:31.474411 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6d5166f-3501-4698-8da3-499dad07de3b-utilities\") pod \"redhat-marketplace-d5pfq\" (UID: \"d6d5166f-3501-4698-8da3-499dad07de3b\") " pod="openshift-marketplace/redhat-marketplace-d5pfq" Dec 06 05:58:31 crc kubenswrapper[4733]: I1206 05:58:31.474469 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5p88\" (UniqueName: \"kubernetes.io/projected/d6d5166f-3501-4698-8da3-499dad07de3b-kube-api-access-j5p88\") pod \"redhat-marketplace-d5pfq\" (UID: \"d6d5166f-3501-4698-8da3-499dad07de3b\") " pod="openshift-marketplace/redhat-marketplace-d5pfq" Dec 06 05:58:31 crc kubenswrapper[4733]: I1206 05:58:31.576698 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6d5166f-3501-4698-8da3-499dad07de3b-catalog-content\") pod 
\"redhat-marketplace-d5pfq\" (UID: \"d6d5166f-3501-4698-8da3-499dad07de3b\") " pod="openshift-marketplace/redhat-marketplace-d5pfq" Dec 06 05:58:31 crc kubenswrapper[4733]: I1206 05:58:31.576746 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6d5166f-3501-4698-8da3-499dad07de3b-catalog-content\") pod \"redhat-marketplace-d5pfq\" (UID: \"d6d5166f-3501-4698-8da3-499dad07de3b\") " pod="openshift-marketplace/redhat-marketplace-d5pfq" Dec 06 05:58:31 crc kubenswrapper[4733]: I1206 05:58:31.576761 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6d5166f-3501-4698-8da3-499dad07de3b-utilities\") pod \"redhat-marketplace-d5pfq\" (UID: \"d6d5166f-3501-4698-8da3-499dad07de3b\") " pod="openshift-marketplace/redhat-marketplace-d5pfq" Dec 06 05:58:31 crc kubenswrapper[4733]: I1206 05:58:31.576976 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5p88\" (UniqueName: \"kubernetes.io/projected/d6d5166f-3501-4698-8da3-499dad07de3b-kube-api-access-j5p88\") pod \"redhat-marketplace-d5pfq\" (UID: \"d6d5166f-3501-4698-8da3-499dad07de3b\") " pod="openshift-marketplace/redhat-marketplace-d5pfq" Dec 06 05:58:31 crc kubenswrapper[4733]: I1206 05:58:31.577026 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6d5166f-3501-4698-8da3-499dad07de3b-utilities\") pod \"redhat-marketplace-d5pfq\" (UID: \"d6d5166f-3501-4698-8da3-499dad07de3b\") " pod="openshift-marketplace/redhat-marketplace-d5pfq" Dec 06 05:58:31 crc kubenswrapper[4733]: I1206 05:58:31.594800 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5p88\" (UniqueName: \"kubernetes.io/projected/d6d5166f-3501-4698-8da3-499dad07de3b-kube-api-access-j5p88\") pod \"redhat-marketplace-d5pfq\" (UID: 
\"d6d5166f-3501-4698-8da3-499dad07de3b\") " pod="openshift-marketplace/redhat-marketplace-d5pfq" Dec 06 05:58:31 crc kubenswrapper[4733]: I1206 05:58:31.723539 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d5pfq" Dec 06 05:58:32 crc kubenswrapper[4733]: I1206 05:58:32.137367 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d5pfq"] Dec 06 05:58:32 crc kubenswrapper[4733]: W1206 05:58:32.169501 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6d5166f_3501_4698_8da3_499dad07de3b.slice/crio-405a31216d6ff3ad5bc90d8b7dbc032a33ebb1e6df82d1fe4e9f5efcaa59330c WatchSource:0}: Error finding container 405a31216d6ff3ad5bc90d8b7dbc032a33ebb1e6df82d1fe4e9f5efcaa59330c: Status 404 returned error can't find the container with id 405a31216d6ff3ad5bc90d8b7dbc032a33ebb1e6df82d1fe4e9f5efcaa59330c Dec 06 05:58:32 crc kubenswrapper[4733]: I1206 05:58:32.335149 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d5pfq" event={"ID":"d6d5166f-3501-4698-8da3-499dad07de3b","Type":"ContainerStarted","Data":"405a31216d6ff3ad5bc90d8b7dbc032a33ebb1e6df82d1fe4e9f5efcaa59330c"} Dec 06 05:58:33 crc kubenswrapper[4733]: I1206 05:58:33.091112 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Dec 06 05:58:33 crc kubenswrapper[4733]: I1206 05:58:33.306202 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 06 05:58:33 crc kubenswrapper[4733]: I1206 05:58:33.308368 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Dec 06 05:58:33 crc kubenswrapper[4733]: I1206 05:58:33.310693 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Dec 06 05:58:33 crc kubenswrapper[4733]: I1206 05:58:33.310736 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 06 05:58:33 crc kubenswrapper[4733]: I1206 05:58:33.311137 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 06 05:58:33 crc kubenswrapper[4733]: I1206 05:58:33.311275 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-pzrn2" Dec 06 05:58:33 crc kubenswrapper[4733]: I1206 05:58:33.313055 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 06 05:58:33 crc kubenswrapper[4733]: I1206 05:58:33.316776 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6z5t4\" (UniqueName: \"kubernetes.io/projected/5ecd64ea-f2f0-4858-8e2e-de61f1d62d26-kube-api-access-6z5t4\") pod \"ovn-northd-0\" (UID: \"5ecd64ea-f2f0-4858-8e2e-de61f1d62d26\") " pod="openstack/ovn-northd-0" Dec 06 05:58:33 crc kubenswrapper[4733]: I1206 05:58:33.316816 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ecd64ea-f2f0-4858-8e2e-de61f1d62d26-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"5ecd64ea-f2f0-4858-8e2e-de61f1d62d26\") " pod="openstack/ovn-northd-0" Dec 06 05:58:33 crc kubenswrapper[4733]: I1206 05:58:33.316838 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ecd64ea-f2f0-4858-8e2e-de61f1d62d26-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: 
\"5ecd64ea-f2f0-4858-8e2e-de61f1d62d26\") " pod="openstack/ovn-northd-0" Dec 06 05:58:33 crc kubenswrapper[4733]: I1206 05:58:33.316897 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a55915f4-28cf-4343-aefa-e6b145b3ccf1-etc-swift\") pod \"swift-storage-0\" (UID: \"a55915f4-28cf-4343-aefa-e6b145b3ccf1\") " pod="openstack/swift-storage-0" Dec 06 05:58:33 crc kubenswrapper[4733]: I1206 05:58:33.316930 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ecd64ea-f2f0-4858-8e2e-de61f1d62d26-config\") pod \"ovn-northd-0\" (UID: \"5ecd64ea-f2f0-4858-8e2e-de61f1d62d26\") " pod="openstack/ovn-northd-0" Dec 06 05:58:33 crc kubenswrapper[4733]: I1206 05:58:33.316955 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5ecd64ea-f2f0-4858-8e2e-de61f1d62d26-scripts\") pod \"ovn-northd-0\" (UID: \"5ecd64ea-f2f0-4858-8e2e-de61f1d62d26\") " pod="openstack/ovn-northd-0" Dec 06 05:58:33 crc kubenswrapper[4733]: I1206 05:58:33.316976 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5ecd64ea-f2f0-4858-8e2e-de61f1d62d26-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"5ecd64ea-f2f0-4858-8e2e-de61f1d62d26\") " pod="openstack/ovn-northd-0" Dec 06 05:58:33 crc kubenswrapper[4733]: I1206 05:58:33.316993 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ecd64ea-f2f0-4858-8e2e-de61f1d62d26-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"5ecd64ea-f2f0-4858-8e2e-de61f1d62d26\") " pod="openstack/ovn-northd-0" Dec 06 05:58:33 crc kubenswrapper[4733]: E1206 05:58:33.317139 4733 projected.go:288] Couldn't get 
configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 06 05:58:33 crc kubenswrapper[4733]: E1206 05:58:33.317157 4733 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 06 05:58:33 crc kubenswrapper[4733]: E1206 05:58:33.317198 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a55915f4-28cf-4343-aefa-e6b145b3ccf1-etc-swift podName:a55915f4-28cf-4343-aefa-e6b145b3ccf1 nodeName:}" failed. No retries permitted until 2025-12-06 05:58:37.317185245 +0000 UTC m=+901.182396356 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/a55915f4-28cf-4343-aefa-e6b145b3ccf1-etc-swift") pod "swift-storage-0" (UID: "a55915f4-28cf-4343-aefa-e6b145b3ccf1") : configmap "swift-ring-files" not found Dec 06 05:58:33 crc kubenswrapper[4733]: I1206 05:58:33.360832 4733 generic.go:334] "Generic (PLEG): container finished" podID="d6d5166f-3501-4698-8da3-499dad07de3b" containerID="11dfd5c6b59fee33b84322bfe7efe7dc366e86ddc61994ade296c8177f08be3c" exitCode=0 Dec 06 05:58:33 crc kubenswrapper[4733]: I1206 05:58:33.360880 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d5pfq" event={"ID":"d6d5166f-3501-4698-8da3-499dad07de3b","Type":"ContainerDied","Data":"11dfd5c6b59fee33b84322bfe7efe7dc366e86ddc61994ade296c8177f08be3c"} Dec 06 05:58:33 crc kubenswrapper[4733]: I1206 05:58:33.419044 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ecd64ea-f2f0-4858-8e2e-de61f1d62d26-config\") pod \"ovn-northd-0\" (UID: \"5ecd64ea-f2f0-4858-8e2e-de61f1d62d26\") " pod="openstack/ovn-northd-0" Dec 06 05:58:33 crc kubenswrapper[4733]: I1206 05:58:33.419127 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/5ecd64ea-f2f0-4858-8e2e-de61f1d62d26-scripts\") pod \"ovn-northd-0\" (UID: \"5ecd64ea-f2f0-4858-8e2e-de61f1d62d26\") " pod="openstack/ovn-northd-0" Dec 06 05:58:33 crc kubenswrapper[4733]: I1206 05:58:33.419171 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5ecd64ea-f2f0-4858-8e2e-de61f1d62d26-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"5ecd64ea-f2f0-4858-8e2e-de61f1d62d26\") " pod="openstack/ovn-northd-0" Dec 06 05:58:33 crc kubenswrapper[4733]: I1206 05:58:33.419202 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ecd64ea-f2f0-4858-8e2e-de61f1d62d26-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"5ecd64ea-f2f0-4858-8e2e-de61f1d62d26\") " pod="openstack/ovn-northd-0" Dec 06 05:58:33 crc kubenswrapper[4733]: I1206 05:58:33.419436 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6z5t4\" (UniqueName: \"kubernetes.io/projected/5ecd64ea-f2f0-4858-8e2e-de61f1d62d26-kube-api-access-6z5t4\") pod \"ovn-northd-0\" (UID: \"5ecd64ea-f2f0-4858-8e2e-de61f1d62d26\") " pod="openstack/ovn-northd-0" Dec 06 05:58:33 crc kubenswrapper[4733]: I1206 05:58:33.419517 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ecd64ea-f2f0-4858-8e2e-de61f1d62d26-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"5ecd64ea-f2f0-4858-8e2e-de61f1d62d26\") " pod="openstack/ovn-northd-0" Dec 06 05:58:33 crc kubenswrapper[4733]: I1206 05:58:33.419553 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ecd64ea-f2f0-4858-8e2e-de61f1d62d26-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"5ecd64ea-f2f0-4858-8e2e-de61f1d62d26\") " pod="openstack/ovn-northd-0" Dec 
06 05:58:33 crc kubenswrapper[4733]: I1206 05:58:33.421566 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5ecd64ea-f2f0-4858-8e2e-de61f1d62d26-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"5ecd64ea-f2f0-4858-8e2e-de61f1d62d26\") " pod="openstack/ovn-northd-0" Dec 06 05:58:33 crc kubenswrapper[4733]: I1206 05:58:33.422624 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ecd64ea-f2f0-4858-8e2e-de61f1d62d26-config\") pod \"ovn-northd-0\" (UID: \"5ecd64ea-f2f0-4858-8e2e-de61f1d62d26\") " pod="openstack/ovn-northd-0" Dec 06 05:58:33 crc kubenswrapper[4733]: I1206 05:58:33.424175 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5ecd64ea-f2f0-4858-8e2e-de61f1d62d26-scripts\") pod \"ovn-northd-0\" (UID: \"5ecd64ea-f2f0-4858-8e2e-de61f1d62d26\") " pod="openstack/ovn-northd-0" Dec 06 05:58:33 crc kubenswrapper[4733]: I1206 05:58:33.429083 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ecd64ea-f2f0-4858-8e2e-de61f1d62d26-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"5ecd64ea-f2f0-4858-8e2e-de61f1d62d26\") " pod="openstack/ovn-northd-0" Dec 06 05:58:33 crc kubenswrapper[4733]: I1206 05:58:33.429883 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ecd64ea-f2f0-4858-8e2e-de61f1d62d26-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"5ecd64ea-f2f0-4858-8e2e-de61f1d62d26\") " pod="openstack/ovn-northd-0" Dec 06 05:58:33 crc kubenswrapper[4733]: I1206 05:58:33.433267 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ecd64ea-f2f0-4858-8e2e-de61f1d62d26-ovn-northd-tls-certs\") pod 
\"ovn-northd-0\" (UID: \"5ecd64ea-f2f0-4858-8e2e-de61f1d62d26\") " pod="openstack/ovn-northd-0" Dec 06 05:58:33 crc kubenswrapper[4733]: I1206 05:58:33.438117 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6z5t4\" (UniqueName: \"kubernetes.io/projected/5ecd64ea-f2f0-4858-8e2e-de61f1d62d26-kube-api-access-6z5t4\") pod \"ovn-northd-0\" (UID: \"5ecd64ea-f2f0-4858-8e2e-de61f1d62d26\") " pod="openstack/ovn-northd-0" Dec 06 05:58:33 crc kubenswrapper[4733]: I1206 05:58:33.629971 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 06 05:58:34 crc kubenswrapper[4733]: I1206 05:58:34.983687 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nrhdv"] Dec 06 05:58:34 crc kubenswrapper[4733]: I1206 05:58:34.989923 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nrhdv" Dec 06 05:58:34 crc kubenswrapper[4733]: I1206 05:58:34.995328 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nrhdv"] Dec 06 05:58:35 crc kubenswrapper[4733]: I1206 05:58:35.156556 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a08a3e05-bf85-4e28-bbe1-9a9675b9efd9-utilities\") pod \"community-operators-nrhdv\" (UID: \"a08a3e05-bf85-4e28-bbe1-9a9675b9efd9\") " pod="openshift-marketplace/community-operators-nrhdv" Dec 06 05:58:35 crc kubenswrapper[4733]: I1206 05:58:35.156823 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5cq7\" (UniqueName: \"kubernetes.io/projected/a08a3e05-bf85-4e28-bbe1-9a9675b9efd9-kube-api-access-k5cq7\") pod \"community-operators-nrhdv\" (UID: \"a08a3e05-bf85-4e28-bbe1-9a9675b9efd9\") " pod="openshift-marketplace/community-operators-nrhdv" Dec 
06 05:58:35 crc kubenswrapper[4733]: I1206 05:58:35.156893 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a08a3e05-bf85-4e28-bbe1-9a9675b9efd9-catalog-content\") pod \"community-operators-nrhdv\" (UID: \"a08a3e05-bf85-4e28-bbe1-9a9675b9efd9\") " pod="openshift-marketplace/community-operators-nrhdv" Dec 06 05:58:35 crc kubenswrapper[4733]: I1206 05:58:35.258139 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5cq7\" (UniqueName: \"kubernetes.io/projected/a08a3e05-bf85-4e28-bbe1-9a9675b9efd9-kube-api-access-k5cq7\") pod \"community-operators-nrhdv\" (UID: \"a08a3e05-bf85-4e28-bbe1-9a9675b9efd9\") " pod="openshift-marketplace/community-operators-nrhdv" Dec 06 05:58:35 crc kubenswrapper[4733]: I1206 05:58:35.258479 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a08a3e05-bf85-4e28-bbe1-9a9675b9efd9-catalog-content\") pod \"community-operators-nrhdv\" (UID: \"a08a3e05-bf85-4e28-bbe1-9a9675b9efd9\") " pod="openshift-marketplace/community-operators-nrhdv" Dec 06 05:58:35 crc kubenswrapper[4733]: I1206 05:58:35.258534 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a08a3e05-bf85-4e28-bbe1-9a9675b9efd9-utilities\") pod \"community-operators-nrhdv\" (UID: \"a08a3e05-bf85-4e28-bbe1-9a9675b9efd9\") " pod="openshift-marketplace/community-operators-nrhdv" Dec 06 05:58:35 crc kubenswrapper[4733]: I1206 05:58:35.259044 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a08a3e05-bf85-4e28-bbe1-9a9675b9efd9-catalog-content\") pod \"community-operators-nrhdv\" (UID: \"a08a3e05-bf85-4e28-bbe1-9a9675b9efd9\") " pod="openshift-marketplace/community-operators-nrhdv" Dec 06 
05:58:35 crc kubenswrapper[4733]: I1206 05:58:35.259069 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a08a3e05-bf85-4e28-bbe1-9a9675b9efd9-utilities\") pod \"community-operators-nrhdv\" (UID: \"a08a3e05-bf85-4e28-bbe1-9a9675b9efd9\") " pod="openshift-marketplace/community-operators-nrhdv" Dec 06 05:58:35 crc kubenswrapper[4733]: I1206 05:58:35.276243 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5cq7\" (UniqueName: \"kubernetes.io/projected/a08a3e05-bf85-4e28-bbe1-9a9675b9efd9-kube-api-access-k5cq7\") pod \"community-operators-nrhdv\" (UID: \"a08a3e05-bf85-4e28-bbe1-9a9675b9efd9\") " pod="openshift-marketplace/community-operators-nrhdv" Dec 06 05:58:35 crc kubenswrapper[4733]: I1206 05:58:35.351456 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 06 05:58:35 crc kubenswrapper[4733]: I1206 05:58:35.368093 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nrhdv" Dec 06 05:58:35 crc kubenswrapper[4733]: I1206 05:58:35.381526 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-s28f8" event={"ID":"1d720cd5-bb4e-449f-86a3-c9cff2acfada","Type":"ContainerStarted","Data":"a852f72e6ead0bb83b4c284aef97baada3aa6118c577cb3e8075c2fd1235f3f4"} Dec 06 05:58:35 crc kubenswrapper[4733]: I1206 05:58:35.383905 4733 generic.go:334] "Generic (PLEG): container finished" podID="d6d5166f-3501-4698-8da3-499dad07de3b" containerID="8a881ff0c509b409332e18805e1419abe48952cc15d34c4d51a91b8870fe8f61" exitCode=0 Dec 06 05:58:35 crc kubenswrapper[4733]: I1206 05:58:35.383983 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d5pfq" event={"ID":"d6d5166f-3501-4698-8da3-499dad07de3b","Type":"ContainerDied","Data":"8a881ff0c509b409332e18805e1419abe48952cc15d34c4d51a91b8870fe8f61"} Dec 06 05:58:35 crc kubenswrapper[4733]: I1206 05:58:35.385634 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"5ecd64ea-f2f0-4858-8e2e-de61f1d62d26","Type":"ContainerStarted","Data":"0f20390318115c2f9839df88c628d1290309a0e7066e7766bfdfb65694562a18"} Dec 06 05:58:35 crc kubenswrapper[4733]: I1206 05:58:35.403133 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-s28f8" podStartSLOduration=1.269727008 podStartE2EDuration="5.403121438s" podCreationTimestamp="2025-12-06 05:58:30 +0000 UTC" firstStartedPulling="2025-12-06 05:58:30.759422346 +0000 UTC m=+894.624633456" lastFinishedPulling="2025-12-06 05:58:34.892816776 +0000 UTC m=+898.758027886" observedRunningTime="2025-12-06 05:58:35.400619923 +0000 UTC m=+899.265831034" watchObservedRunningTime="2025-12-06 05:58:35.403121438 +0000 UTC m=+899.268332548" Dec 06 05:58:35 crc kubenswrapper[4733]: I1206 05:58:35.788544 4733 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-marketplace/community-operators-nrhdv"] Dec 06 05:58:35 crc kubenswrapper[4733]: W1206 05:58:35.795664 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda08a3e05_bf85_4e28_bbe1_9a9675b9efd9.slice/crio-314bba3da68bc5941fc0d0e496cc27e2897d60c104cb0adec74d3071828ea623 WatchSource:0}: Error finding container 314bba3da68bc5941fc0d0e496cc27e2897d60c104cb0adec74d3071828ea623: Status 404 returned error can't find the container with id 314bba3da68bc5941fc0d0e496cc27e2897d60c104cb0adec74d3071828ea623 Dec 06 05:58:36 crc kubenswrapper[4733]: I1206 05:58:36.399448 4733 generic.go:334] "Generic (PLEG): container finished" podID="a08a3e05-bf85-4e28-bbe1-9a9675b9efd9" containerID="46c2c319743b6aaf55df53c2fecda9d125aaf7c6958fe9d93ff5fed7a7809519" exitCode=0 Dec 06 05:58:36 crc kubenswrapper[4733]: I1206 05:58:36.399571 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nrhdv" event={"ID":"a08a3e05-bf85-4e28-bbe1-9a9675b9efd9","Type":"ContainerDied","Data":"46c2c319743b6aaf55df53c2fecda9d125aaf7c6958fe9d93ff5fed7a7809519"} Dec 06 05:58:36 crc kubenswrapper[4733]: I1206 05:58:36.399891 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nrhdv" event={"ID":"a08a3e05-bf85-4e28-bbe1-9a9675b9efd9","Type":"ContainerStarted","Data":"314bba3da68bc5941fc0d0e496cc27e2897d60c104cb0adec74d3071828ea623"} Dec 06 05:58:36 crc kubenswrapper[4733]: I1206 05:58:36.404670 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d5pfq" event={"ID":"d6d5166f-3501-4698-8da3-499dad07de3b","Type":"ContainerStarted","Data":"bc638ab8c7a3727ebde19447a2df8e8382a911997b61db1490c078941f1405df"} Dec 06 05:58:36 crc kubenswrapper[4733]: I1206 05:58:36.441484 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-marketplace-d5pfq" podStartSLOduration=3.039415783 podStartE2EDuration="5.441463274s" podCreationTimestamp="2025-12-06 05:58:31 +0000 UTC" firstStartedPulling="2025-12-06 05:58:33.474644041 +0000 UTC m=+897.339855153" lastFinishedPulling="2025-12-06 05:58:35.876691532 +0000 UTC m=+899.741902644" observedRunningTime="2025-12-06 05:58:36.438776251 +0000 UTC m=+900.303987362" watchObservedRunningTime="2025-12-06 05:58:36.441463274 +0000 UTC m=+900.306674385" Dec 06 05:58:36 crc kubenswrapper[4733]: I1206 05:58:36.526526 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-77pt9"] Dec 06 05:58:36 crc kubenswrapper[4733]: I1206 05:58:36.528035 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-77pt9" Dec 06 05:58:36 crc kubenswrapper[4733]: I1206 05:58:36.543045 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-9d84-account-create-update-zfm9s"] Dec 06 05:58:36 crc kubenswrapper[4733]: I1206 05:58:36.545807 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-9d84-account-create-update-zfm9s" Dec 06 05:58:36 crc kubenswrapper[4733]: I1206 05:58:36.549269 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 06 05:58:36 crc kubenswrapper[4733]: I1206 05:58:36.556713 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-77pt9"] Dec 06 05:58:36 crc kubenswrapper[4733]: I1206 05:58:36.571833 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-9d84-account-create-update-zfm9s"] Dec 06 05:58:36 crc kubenswrapper[4733]: I1206 05:58:36.686890 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws7vm\" (UniqueName: \"kubernetes.io/projected/fd259bcc-2452-4b18-9019-072e545719bf-kube-api-access-ws7vm\") pod \"keystone-9d84-account-create-update-zfm9s\" (UID: \"fd259bcc-2452-4b18-9019-072e545719bf\") " pod="openstack/keystone-9d84-account-create-update-zfm9s" Dec 06 05:58:36 crc kubenswrapper[4733]: I1206 05:58:36.686943 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16b6231d-2829-46b8-b8cf-ce76dc4ad424-operator-scripts\") pod \"keystone-db-create-77pt9\" (UID: \"16b6231d-2829-46b8-b8cf-ce76dc4ad424\") " pod="openstack/keystone-db-create-77pt9" Dec 06 05:58:36 crc kubenswrapper[4733]: I1206 05:58:36.687273 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd259bcc-2452-4b18-9019-072e545719bf-operator-scripts\") pod \"keystone-9d84-account-create-update-zfm9s\" (UID: \"fd259bcc-2452-4b18-9019-072e545719bf\") " pod="openstack/keystone-9d84-account-create-update-zfm9s" Dec 06 05:58:36 crc kubenswrapper[4733]: I1206 05:58:36.687373 4733 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkjx6\" (UniqueName: \"kubernetes.io/projected/16b6231d-2829-46b8-b8cf-ce76dc4ad424-kube-api-access-jkjx6\") pod \"keystone-db-create-77pt9\" (UID: \"16b6231d-2829-46b8-b8cf-ce76dc4ad424\") " pod="openstack/keystone-db-create-77pt9" Dec 06 05:58:36 crc kubenswrapper[4733]: I1206 05:58:36.703171 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-7xlbm"] Dec 06 05:58:36 crc kubenswrapper[4733]: I1206 05:58:36.704275 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-7xlbm" Dec 06 05:58:36 crc kubenswrapper[4733]: I1206 05:58:36.716737 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-7xlbm"] Dec 06 05:58:36 crc kubenswrapper[4733]: I1206 05:58:36.789034 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd259bcc-2452-4b18-9019-072e545719bf-operator-scripts\") pod \"keystone-9d84-account-create-update-zfm9s\" (UID: \"fd259bcc-2452-4b18-9019-072e545719bf\") " pod="openstack/keystone-9d84-account-create-update-zfm9s" Dec 06 05:58:36 crc kubenswrapper[4733]: I1206 05:58:36.789105 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkjx6\" (UniqueName: \"kubernetes.io/projected/16b6231d-2829-46b8-b8cf-ce76dc4ad424-kube-api-access-jkjx6\") pod \"keystone-db-create-77pt9\" (UID: \"16b6231d-2829-46b8-b8cf-ce76dc4ad424\") " pod="openstack/keystone-db-create-77pt9" Dec 06 05:58:36 crc kubenswrapper[4733]: I1206 05:58:36.789212 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ws7vm\" (UniqueName: \"kubernetes.io/projected/fd259bcc-2452-4b18-9019-072e545719bf-kube-api-access-ws7vm\") pod \"keystone-9d84-account-create-update-zfm9s\" (UID: 
\"fd259bcc-2452-4b18-9019-072e545719bf\") " pod="openstack/keystone-9d84-account-create-update-zfm9s" Dec 06 05:58:36 crc kubenswrapper[4733]: I1206 05:58:36.789238 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16b6231d-2829-46b8-b8cf-ce76dc4ad424-operator-scripts\") pod \"keystone-db-create-77pt9\" (UID: \"16b6231d-2829-46b8-b8cf-ce76dc4ad424\") " pod="openstack/keystone-db-create-77pt9" Dec 06 05:58:36 crc kubenswrapper[4733]: I1206 05:58:36.790004 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16b6231d-2829-46b8-b8cf-ce76dc4ad424-operator-scripts\") pod \"keystone-db-create-77pt9\" (UID: \"16b6231d-2829-46b8-b8cf-ce76dc4ad424\") " pod="openstack/keystone-db-create-77pt9" Dec 06 05:58:36 crc kubenswrapper[4733]: I1206 05:58:36.790777 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd259bcc-2452-4b18-9019-072e545719bf-operator-scripts\") pod \"keystone-9d84-account-create-update-zfm9s\" (UID: \"fd259bcc-2452-4b18-9019-072e545719bf\") " pod="openstack/keystone-9d84-account-create-update-zfm9s" Dec 06 05:58:36 crc kubenswrapper[4733]: I1206 05:58:36.807899 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkjx6\" (UniqueName: \"kubernetes.io/projected/16b6231d-2829-46b8-b8cf-ce76dc4ad424-kube-api-access-jkjx6\") pod \"keystone-db-create-77pt9\" (UID: \"16b6231d-2829-46b8-b8cf-ce76dc4ad424\") " pod="openstack/keystone-db-create-77pt9" Dec 06 05:58:36 crc kubenswrapper[4733]: I1206 05:58:36.813481 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ws7vm\" (UniqueName: \"kubernetes.io/projected/fd259bcc-2452-4b18-9019-072e545719bf-kube-api-access-ws7vm\") pod \"keystone-9d84-account-create-update-zfm9s\" (UID: 
\"fd259bcc-2452-4b18-9019-072e545719bf\") " pod="openstack/keystone-9d84-account-create-update-zfm9s" Dec 06 05:58:36 crc kubenswrapper[4733]: I1206 05:58:36.815068 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-153e-account-create-update-lxx5v"] Dec 06 05:58:36 crc kubenswrapper[4733]: I1206 05:58:36.816561 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-153e-account-create-update-lxx5v" Dec 06 05:58:36 crc kubenswrapper[4733]: I1206 05:58:36.819133 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 06 05:58:36 crc kubenswrapper[4733]: I1206 05:58:36.822598 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-153e-account-create-update-lxx5v"] Dec 06 05:58:36 crc kubenswrapper[4733]: I1206 05:58:36.861547 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-77pt9" Dec 06 05:58:36 crc kubenswrapper[4733]: I1206 05:58:36.869220 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-9d84-account-create-update-zfm9s" Dec 06 05:58:36 crc kubenswrapper[4733]: I1206 05:58:36.891780 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sm4gh\" (UniqueName: \"kubernetes.io/projected/b2f6ce07-95b4-4dcc-8cc1-8847d0c70f9d-kube-api-access-sm4gh\") pod \"placement-db-create-7xlbm\" (UID: \"b2f6ce07-95b4-4dcc-8cc1-8847d0c70f9d\") " pod="openstack/placement-db-create-7xlbm" Dec 06 05:58:36 crc kubenswrapper[4733]: I1206 05:58:36.892245 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2f6ce07-95b4-4dcc-8cc1-8847d0c70f9d-operator-scripts\") pod \"placement-db-create-7xlbm\" (UID: \"b2f6ce07-95b4-4dcc-8cc1-8847d0c70f9d\") " pod="openstack/placement-db-create-7xlbm" Dec 06 05:58:36 crc kubenswrapper[4733]: I1206 05:58:36.993802 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkshd\" (UniqueName: \"kubernetes.io/projected/0701e468-3c5a-4812-af28-36d85baf6756-kube-api-access-hkshd\") pod \"placement-153e-account-create-update-lxx5v\" (UID: \"0701e468-3c5a-4812-af28-36d85baf6756\") " pod="openstack/placement-153e-account-create-update-lxx5v" Dec 06 05:58:36 crc kubenswrapper[4733]: I1206 05:58:36.994079 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2f6ce07-95b4-4dcc-8cc1-8847d0c70f9d-operator-scripts\") pod \"placement-db-create-7xlbm\" (UID: \"b2f6ce07-95b4-4dcc-8cc1-8847d0c70f9d\") " pod="openstack/placement-db-create-7xlbm" Dec 06 05:58:36 crc kubenswrapper[4733]: I1206 05:58:36.994122 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/0701e468-3c5a-4812-af28-36d85baf6756-operator-scripts\") pod \"placement-153e-account-create-update-lxx5v\" (UID: \"0701e468-3c5a-4812-af28-36d85baf6756\") " pod="openstack/placement-153e-account-create-update-lxx5v" Dec 06 05:58:36 crc kubenswrapper[4733]: I1206 05:58:36.994142 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sm4gh\" (UniqueName: \"kubernetes.io/projected/b2f6ce07-95b4-4dcc-8cc1-8847d0c70f9d-kube-api-access-sm4gh\") pod \"placement-db-create-7xlbm\" (UID: \"b2f6ce07-95b4-4dcc-8cc1-8847d0c70f9d\") " pod="openstack/placement-db-create-7xlbm" Dec 06 05:58:36 crc kubenswrapper[4733]: I1206 05:58:36.994979 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2f6ce07-95b4-4dcc-8cc1-8847d0c70f9d-operator-scripts\") pod \"placement-db-create-7xlbm\" (UID: \"b2f6ce07-95b4-4dcc-8cc1-8847d0c70f9d\") " pod="openstack/placement-db-create-7xlbm" Dec 06 05:58:37 crc kubenswrapper[4733]: I1206 05:58:37.009128 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-g4cdr"] Dec 06 05:58:37 crc kubenswrapper[4733]: I1206 05:58:37.010926 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sm4gh\" (UniqueName: \"kubernetes.io/projected/b2f6ce07-95b4-4dcc-8cc1-8847d0c70f9d-kube-api-access-sm4gh\") pod \"placement-db-create-7xlbm\" (UID: \"b2f6ce07-95b4-4dcc-8cc1-8847d0c70f9d\") " pod="openstack/placement-db-create-7xlbm" Dec 06 05:58:37 crc kubenswrapper[4733]: I1206 05:58:37.011385 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-g4cdr" Dec 06 05:58:37 crc kubenswrapper[4733]: I1206 05:58:37.021033 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-7xlbm" Dec 06 05:58:37 crc kubenswrapper[4733]: I1206 05:58:37.022105 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-g4cdr"] Dec 06 05:58:37 crc kubenswrapper[4733]: I1206 05:58:37.096991 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkshd\" (UniqueName: \"kubernetes.io/projected/0701e468-3c5a-4812-af28-36d85baf6756-kube-api-access-hkshd\") pod \"placement-153e-account-create-update-lxx5v\" (UID: \"0701e468-3c5a-4812-af28-36d85baf6756\") " pod="openstack/placement-153e-account-create-update-lxx5v" Dec 06 05:58:37 crc kubenswrapper[4733]: I1206 05:58:37.097151 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0701e468-3c5a-4812-af28-36d85baf6756-operator-scripts\") pod \"placement-153e-account-create-update-lxx5v\" (UID: \"0701e468-3c5a-4812-af28-36d85baf6756\") " pod="openstack/placement-153e-account-create-update-lxx5v" Dec 06 05:58:37 crc kubenswrapper[4733]: I1206 05:58:37.097901 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0701e468-3c5a-4812-af28-36d85baf6756-operator-scripts\") pod \"placement-153e-account-create-update-lxx5v\" (UID: \"0701e468-3c5a-4812-af28-36d85baf6756\") " pod="openstack/placement-153e-account-create-update-lxx5v" Dec 06 05:58:37 crc kubenswrapper[4733]: I1206 05:58:37.120843 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-cd13-account-create-update-g68t4"] Dec 06 05:58:37 crc kubenswrapper[4733]: I1206 05:58:37.122133 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-cd13-account-create-update-g68t4" Dec 06 05:58:37 crc kubenswrapper[4733]: I1206 05:58:37.125885 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 06 05:58:37 crc kubenswrapper[4733]: I1206 05:58:37.130120 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkshd\" (UniqueName: \"kubernetes.io/projected/0701e468-3c5a-4812-af28-36d85baf6756-kube-api-access-hkshd\") pod \"placement-153e-account-create-update-lxx5v\" (UID: \"0701e468-3c5a-4812-af28-36d85baf6756\") " pod="openstack/placement-153e-account-create-update-lxx5v" Dec 06 05:58:37 crc kubenswrapper[4733]: I1206 05:58:37.170806 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-153e-account-create-update-lxx5v" Dec 06 05:58:37 crc kubenswrapper[4733]: I1206 05:58:37.171085 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-cd13-account-create-update-g68t4"] Dec 06 05:58:37 crc kubenswrapper[4733]: I1206 05:58:37.198651 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fxbd\" (UniqueName: \"kubernetes.io/projected/68d41a55-121b-462f-8c49-a014cbcd5cd5-kube-api-access-8fxbd\") pod \"glance-db-create-g4cdr\" (UID: \"68d41a55-121b-462f-8c49-a014cbcd5cd5\") " pod="openstack/glance-db-create-g4cdr" Dec 06 05:58:37 crc kubenswrapper[4733]: I1206 05:58:37.199314 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68d41a55-121b-462f-8c49-a014cbcd5cd5-operator-scripts\") pod \"glance-db-create-g4cdr\" (UID: \"68d41a55-121b-462f-8c49-a014cbcd5cd5\") " pod="openstack/glance-db-create-g4cdr" Dec 06 05:58:37 crc kubenswrapper[4733]: I1206 05:58:37.302752 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68d41a55-121b-462f-8c49-a014cbcd5cd5-operator-scripts\") pod \"glance-db-create-g4cdr\" (UID: \"68d41a55-121b-462f-8c49-a014cbcd5cd5\") " pod="openstack/glance-db-create-g4cdr" Dec 06 05:58:37 crc kubenswrapper[4733]: I1206 05:58:37.303006 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7f54\" (UniqueName: \"kubernetes.io/projected/91584c47-2491-4388-9ce4-e76d4ef92afd-kube-api-access-v7f54\") pod \"glance-cd13-account-create-update-g68t4\" (UID: \"91584c47-2491-4388-9ce4-e76d4ef92afd\") " pod="openstack/glance-cd13-account-create-update-g68t4" Dec 06 05:58:37 crc kubenswrapper[4733]: I1206 05:58:37.303389 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fxbd\" (UniqueName: \"kubernetes.io/projected/68d41a55-121b-462f-8c49-a014cbcd5cd5-kube-api-access-8fxbd\") pod \"glance-db-create-g4cdr\" (UID: \"68d41a55-121b-462f-8c49-a014cbcd5cd5\") " pod="openstack/glance-db-create-g4cdr" Dec 06 05:58:37 crc kubenswrapper[4733]: I1206 05:58:37.303457 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91584c47-2491-4388-9ce4-e76d4ef92afd-operator-scripts\") pod \"glance-cd13-account-create-update-g68t4\" (UID: \"91584c47-2491-4388-9ce4-e76d4ef92afd\") " pod="openstack/glance-cd13-account-create-update-g68t4" Dec 06 05:58:37 crc kubenswrapper[4733]: I1206 05:58:37.304204 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68d41a55-121b-462f-8c49-a014cbcd5cd5-operator-scripts\") pod \"glance-db-create-g4cdr\" (UID: \"68d41a55-121b-462f-8c49-a014cbcd5cd5\") " pod="openstack/glance-db-create-g4cdr" Dec 06 05:58:37 crc kubenswrapper[4733]: I1206 05:58:37.323866 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/keystone-db-create-77pt9"] Dec 06 05:58:37 crc kubenswrapper[4733]: I1206 05:58:37.332871 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fxbd\" (UniqueName: \"kubernetes.io/projected/68d41a55-121b-462f-8c49-a014cbcd5cd5-kube-api-access-8fxbd\") pod \"glance-db-create-g4cdr\" (UID: \"68d41a55-121b-462f-8c49-a014cbcd5cd5\") " pod="openstack/glance-db-create-g4cdr" Dec 06 05:58:37 crc kubenswrapper[4733]: I1206 05:58:37.411128 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7f54\" (UniqueName: \"kubernetes.io/projected/91584c47-2491-4388-9ce4-e76d4ef92afd-kube-api-access-v7f54\") pod \"glance-cd13-account-create-update-g68t4\" (UID: \"91584c47-2491-4388-9ce4-e76d4ef92afd\") " pod="openstack/glance-cd13-account-create-update-g68t4" Dec 06 05:58:37 crc kubenswrapper[4733]: I1206 05:58:37.411175 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a55915f4-28cf-4343-aefa-e6b145b3ccf1-etc-swift\") pod \"swift-storage-0\" (UID: \"a55915f4-28cf-4343-aefa-e6b145b3ccf1\") " pod="openstack/swift-storage-0" Dec 06 05:58:37 crc kubenswrapper[4733]: I1206 05:58:37.411286 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91584c47-2491-4388-9ce4-e76d4ef92afd-operator-scripts\") pod \"glance-cd13-account-create-update-g68t4\" (UID: \"91584c47-2491-4388-9ce4-e76d4ef92afd\") " pod="openstack/glance-cd13-account-create-update-g68t4" Dec 06 05:58:37 crc kubenswrapper[4733]: E1206 05:58:37.411623 4733 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 06 05:58:37 crc kubenswrapper[4733]: E1206 05:58:37.411683 4733 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" 
not found Dec 06 05:58:37 crc kubenswrapper[4733]: E1206 05:58:37.411775 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a55915f4-28cf-4343-aefa-e6b145b3ccf1-etc-swift podName:a55915f4-28cf-4343-aefa-e6b145b3ccf1 nodeName:}" failed. No retries permitted until 2025-12-06 05:58:45.411754077 +0000 UTC m=+909.276965188 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/a55915f4-28cf-4343-aefa-e6b145b3ccf1-etc-swift") pod "swift-storage-0" (UID: "a55915f4-28cf-4343-aefa-e6b145b3ccf1") : configmap "swift-ring-files" not found Dec 06 05:58:37 crc kubenswrapper[4733]: I1206 05:58:37.412027 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91584c47-2491-4388-9ce4-e76d4ef92afd-operator-scripts\") pod \"glance-cd13-account-create-update-g68t4\" (UID: \"91584c47-2491-4388-9ce4-e76d4ef92afd\") " pod="openstack/glance-cd13-account-create-update-g68t4" Dec 06 05:58:37 crc kubenswrapper[4733]: I1206 05:58:37.434363 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-g4cdr" Dec 06 05:58:37 crc kubenswrapper[4733]: I1206 05:58:37.435415 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7f54\" (UniqueName: \"kubernetes.io/projected/91584c47-2491-4388-9ce4-e76d4ef92afd-kube-api-access-v7f54\") pod \"glance-cd13-account-create-update-g68t4\" (UID: \"91584c47-2491-4388-9ce4-e76d4ef92afd\") " pod="openstack/glance-cd13-account-create-update-g68t4" Dec 06 05:58:37 crc kubenswrapper[4733]: I1206 05:58:37.452391 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-77pt9" event={"ID":"16b6231d-2829-46b8-b8cf-ce76dc4ad424","Type":"ContainerStarted","Data":"efa3f3b264644d348e6fb4c33ae6ec529b891d157f03074bac2e0aa6fc5456a2"} Dec 06 05:58:37 crc kubenswrapper[4733]: I1206 05:58:37.463239 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"5ecd64ea-f2f0-4858-8e2e-de61f1d62d26","Type":"ContainerStarted","Data":"bdfd174536dece0bd78b1908ac44a5d0e35ab2d14b453e3329f45ee7eb3d4325"} Dec 06 05:58:37 crc kubenswrapper[4733]: I1206 05:58:37.463280 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"5ecd64ea-f2f0-4858-8e2e-de61f1d62d26","Type":"ContainerStarted","Data":"482b9af0128baf329b957e7ef7541da7502dbb40e2fc72938753e3b1d545f69c"} Dec 06 05:58:37 crc kubenswrapper[4733]: I1206 05:58:37.463459 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 06 05:58:37 crc kubenswrapper[4733]: I1206 05:58:37.466036 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-cd13-account-create-update-g68t4" Dec 06 05:58:37 crc kubenswrapper[4733]: I1206 05:58:37.481636 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.86545277 podStartE2EDuration="4.481615776s" podCreationTimestamp="2025-12-06 05:58:33 +0000 UTC" firstStartedPulling="2025-12-06 05:58:35.356674278 +0000 UTC m=+899.221885389" lastFinishedPulling="2025-12-06 05:58:36.972837284 +0000 UTC m=+900.838048395" observedRunningTime="2025-12-06 05:58:37.477905579 +0000 UTC m=+901.343116690" watchObservedRunningTime="2025-12-06 05:58:37.481615776 +0000 UTC m=+901.346826886" Dec 06 05:58:37 crc kubenswrapper[4733]: I1206 05:58:37.591954 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-9d84-account-create-update-zfm9s"] Dec 06 05:58:37 crc kubenswrapper[4733]: I1206 05:58:37.637581 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-7xlbm"] Dec 06 05:58:37 crc kubenswrapper[4733]: I1206 05:58:37.715244 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-153e-account-create-update-lxx5v"] Dec 06 05:58:37 crc kubenswrapper[4733]: I1206 05:58:37.888099 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-g4cdr"] Dec 06 05:58:37 crc kubenswrapper[4733]: I1206 05:58:37.953635 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-cd13-account-create-update-g68t4"] Dec 06 05:58:38 crc kubenswrapper[4733]: I1206 05:58:38.474491 4733 generic.go:334] "Generic (PLEG): container finished" podID="fd259bcc-2452-4b18-9019-072e545719bf" containerID="1a1d9a7f0deb240eeeddb3c50498ef2202f3e7eba15a8889da39b3f5f0e7caf1" exitCode=0 Dec 06 05:58:38 crc kubenswrapper[4733]: I1206 05:58:38.474562 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-9d84-account-create-update-zfm9s" 
event={"ID":"fd259bcc-2452-4b18-9019-072e545719bf","Type":"ContainerDied","Data":"1a1d9a7f0deb240eeeddb3c50498ef2202f3e7eba15a8889da39b3f5f0e7caf1"} Dec 06 05:58:38 crc kubenswrapper[4733]: I1206 05:58:38.474638 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-9d84-account-create-update-zfm9s" event={"ID":"fd259bcc-2452-4b18-9019-072e545719bf","Type":"ContainerStarted","Data":"f15c73ef5354296e09e4b11cb561ce854752445863e2f7c108f7ca788a9e6291"} Dec 06 05:58:38 crc kubenswrapper[4733]: I1206 05:58:38.477467 4733 generic.go:334] "Generic (PLEG): container finished" podID="68d41a55-121b-462f-8c49-a014cbcd5cd5" containerID="3d6523b3c212b9ad17e44a8a34695fdc22830391265037fe6baf77256a0a27a3" exitCode=0 Dec 06 05:58:38 crc kubenswrapper[4733]: I1206 05:58:38.477567 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-g4cdr" event={"ID":"68d41a55-121b-462f-8c49-a014cbcd5cd5","Type":"ContainerDied","Data":"3d6523b3c212b9ad17e44a8a34695fdc22830391265037fe6baf77256a0a27a3"} Dec 06 05:58:38 crc kubenswrapper[4733]: I1206 05:58:38.477612 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-g4cdr" event={"ID":"68d41a55-121b-462f-8c49-a014cbcd5cd5","Type":"ContainerStarted","Data":"931e06ddd020d2528b5867f9c3bc8977519b537fe7f9f3ece51318a7247c4be8"} Dec 06 05:58:38 crc kubenswrapper[4733]: I1206 05:58:38.479685 4733 generic.go:334] "Generic (PLEG): container finished" podID="91584c47-2491-4388-9ce4-e76d4ef92afd" containerID="0ed8e17045aa28eae3f7b4f5e8eb3378c0358f36a4330b15166204d65a506bb4" exitCode=0 Dec 06 05:58:38 crc kubenswrapper[4733]: I1206 05:58:38.479758 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-cd13-account-create-update-g68t4" event={"ID":"91584c47-2491-4388-9ce4-e76d4ef92afd","Type":"ContainerDied","Data":"0ed8e17045aa28eae3f7b4f5e8eb3378c0358f36a4330b15166204d65a506bb4"} Dec 06 05:58:38 crc kubenswrapper[4733]: I1206 05:58:38.479777 4733 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-cd13-account-create-update-g68t4" event={"ID":"91584c47-2491-4388-9ce4-e76d4ef92afd","Type":"ContainerStarted","Data":"4301617f12d91c579b9062d1dc7e004f6a1f25c9b6167f2fe616052fc38448ea"} Dec 06 05:58:38 crc kubenswrapper[4733]: I1206 05:58:38.481139 4733 generic.go:334] "Generic (PLEG): container finished" podID="16b6231d-2829-46b8-b8cf-ce76dc4ad424" containerID="8ce10c54a57e033ee52d5cd20070d37386e094dfadad2b56902a77110616ed37" exitCode=0 Dec 06 05:58:38 crc kubenswrapper[4733]: I1206 05:58:38.481194 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-77pt9" event={"ID":"16b6231d-2829-46b8-b8cf-ce76dc4ad424","Type":"ContainerDied","Data":"8ce10c54a57e033ee52d5cd20070d37386e094dfadad2b56902a77110616ed37"} Dec 06 05:58:38 crc kubenswrapper[4733]: I1206 05:58:38.485888 4733 generic.go:334] "Generic (PLEG): container finished" podID="0701e468-3c5a-4812-af28-36d85baf6756" containerID="3631646b12d0e5b83aca92579f0c2e946bd2012199d2d213583dcfa1bd32f6c5" exitCode=0 Dec 06 05:58:38 crc kubenswrapper[4733]: I1206 05:58:38.496084 4733 generic.go:334] "Generic (PLEG): container finished" podID="b2f6ce07-95b4-4dcc-8cc1-8847d0c70f9d" containerID="876b8c19ec859d03cc4c3f857df5ac051f8f38f1f471c4bc1a45c894a6014705" exitCode=0 Dec 06 05:58:38 crc kubenswrapper[4733]: I1206 05:58:38.509058 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-153e-account-create-update-lxx5v" event={"ID":"0701e468-3c5a-4812-af28-36d85baf6756","Type":"ContainerDied","Data":"3631646b12d0e5b83aca92579f0c2e946bd2012199d2d213583dcfa1bd32f6c5"} Dec 06 05:58:38 crc kubenswrapper[4733]: I1206 05:58:38.509097 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-153e-account-create-update-lxx5v" event={"ID":"0701e468-3c5a-4812-af28-36d85baf6756","Type":"ContainerStarted","Data":"abff2ce61067e81f991e5a46d6c7f204a32dcb4f09f1de85e216a8f113346aee"} Dec 06 
05:58:38 crc kubenswrapper[4733]: I1206 05:58:38.509112 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-7xlbm" event={"ID":"b2f6ce07-95b4-4dcc-8cc1-8847d0c70f9d","Type":"ContainerDied","Data":"876b8c19ec859d03cc4c3f857df5ac051f8f38f1f471c4bc1a45c894a6014705"} Dec 06 05:58:38 crc kubenswrapper[4733]: I1206 05:58:38.509141 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-7xlbm" event={"ID":"b2f6ce07-95b4-4dcc-8cc1-8847d0c70f9d","Type":"ContainerStarted","Data":"3b9aca5a48833864efaa83862c9f4bd497b0dcb0c36d0e2a4572823cf69cd1f9"} Dec 06 05:58:38 crc kubenswrapper[4733]: I1206 05:58:38.762541 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-67fb8b8965-bmdkf" Dec 06 05:58:38 crc kubenswrapper[4733]: I1206 05:58:38.834175 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78d59ccb8c-62d2b"] Dec 06 05:58:38 crc kubenswrapper[4733]: I1206 05:58:38.838543 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-78d59ccb8c-62d2b" podUID="715d93bf-4fc7-4bc5-adb5-4504c9c954ea" containerName="dnsmasq-dns" containerID="cri-o://584b18d9c567fe1af4e3ee60da7d09566da925988b32edb712fc81cac55df6a2" gracePeriod=10 Dec 06 05:58:39 crc kubenswrapper[4733]: I1206 05:58:39.290376 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78d59ccb8c-62d2b" Dec 06 05:58:39 crc kubenswrapper[4733]: I1206 05:58:39.356449 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lf9t6\" (UniqueName: \"kubernetes.io/projected/715d93bf-4fc7-4bc5-adb5-4504c9c954ea-kube-api-access-lf9t6\") pod \"715d93bf-4fc7-4bc5-adb5-4504c9c954ea\" (UID: \"715d93bf-4fc7-4bc5-adb5-4504c9c954ea\") " Dec 06 05:58:39 crc kubenswrapper[4733]: I1206 05:58:39.357377 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/715d93bf-4fc7-4bc5-adb5-4504c9c954ea-ovsdbserver-nb\") pod \"715d93bf-4fc7-4bc5-adb5-4504c9c954ea\" (UID: \"715d93bf-4fc7-4bc5-adb5-4504c9c954ea\") " Dec 06 05:58:39 crc kubenswrapper[4733]: I1206 05:58:39.357537 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/715d93bf-4fc7-4bc5-adb5-4504c9c954ea-dns-svc\") pod \"715d93bf-4fc7-4bc5-adb5-4504c9c954ea\" (UID: \"715d93bf-4fc7-4bc5-adb5-4504c9c954ea\") " Dec 06 05:58:39 crc kubenswrapper[4733]: I1206 05:58:39.357564 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/715d93bf-4fc7-4bc5-adb5-4504c9c954ea-config\") pod \"715d93bf-4fc7-4bc5-adb5-4504c9c954ea\" (UID: \"715d93bf-4fc7-4bc5-adb5-4504c9c954ea\") " Dec 06 05:58:39 crc kubenswrapper[4733]: I1206 05:58:39.357588 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/715d93bf-4fc7-4bc5-adb5-4504c9c954ea-ovsdbserver-sb\") pod \"715d93bf-4fc7-4bc5-adb5-4504c9c954ea\" (UID: \"715d93bf-4fc7-4bc5-adb5-4504c9c954ea\") " Dec 06 05:58:39 crc kubenswrapper[4733]: I1206 05:58:39.379125 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/715d93bf-4fc7-4bc5-adb5-4504c9c954ea-kube-api-access-lf9t6" (OuterVolumeSpecName: "kube-api-access-lf9t6") pod "715d93bf-4fc7-4bc5-adb5-4504c9c954ea" (UID: "715d93bf-4fc7-4bc5-adb5-4504c9c954ea"). InnerVolumeSpecName "kube-api-access-lf9t6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:58:39 crc kubenswrapper[4733]: I1206 05:58:39.389790 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/715d93bf-4fc7-4bc5-adb5-4504c9c954ea-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "715d93bf-4fc7-4bc5-adb5-4504c9c954ea" (UID: "715d93bf-4fc7-4bc5-adb5-4504c9c954ea"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:58:39 crc kubenswrapper[4733]: I1206 05:58:39.398778 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/715d93bf-4fc7-4bc5-adb5-4504c9c954ea-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "715d93bf-4fc7-4bc5-adb5-4504c9c954ea" (UID: "715d93bf-4fc7-4bc5-adb5-4504c9c954ea"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:58:39 crc kubenswrapper[4733]: I1206 05:58:39.405601 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/715d93bf-4fc7-4bc5-adb5-4504c9c954ea-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "715d93bf-4fc7-4bc5-adb5-4504c9c954ea" (UID: "715d93bf-4fc7-4bc5-adb5-4504c9c954ea"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:58:39 crc kubenswrapper[4733]: I1206 05:58:39.406270 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/715d93bf-4fc7-4bc5-adb5-4504c9c954ea-config" (OuterVolumeSpecName: "config") pod "715d93bf-4fc7-4bc5-adb5-4504c9c954ea" (UID: "715d93bf-4fc7-4bc5-adb5-4504c9c954ea"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:58:39 crc kubenswrapper[4733]: I1206 05:58:39.459235 4733 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/715d93bf-4fc7-4bc5-adb5-4504c9c954ea-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 05:58:39 crc kubenswrapper[4733]: I1206 05:58:39.459267 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/715d93bf-4fc7-4bc5-adb5-4504c9c954ea-config\") on node \"crc\" DevicePath \"\"" Dec 06 05:58:39 crc kubenswrapper[4733]: I1206 05:58:39.459280 4733 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/715d93bf-4fc7-4bc5-adb5-4504c9c954ea-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 06 05:58:39 crc kubenswrapper[4733]: I1206 05:58:39.459294 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lf9t6\" (UniqueName: \"kubernetes.io/projected/715d93bf-4fc7-4bc5-adb5-4504c9c954ea-kube-api-access-lf9t6\") on node \"crc\" DevicePath \"\"" Dec 06 05:58:39 crc kubenswrapper[4733]: I1206 05:58:39.459363 4733 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/715d93bf-4fc7-4bc5-adb5-4504c9c954ea-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 06 05:58:39 crc kubenswrapper[4733]: I1206 05:58:39.508030 4733 generic.go:334] "Generic (PLEG): container finished" podID="715d93bf-4fc7-4bc5-adb5-4504c9c954ea" containerID="584b18d9c567fe1af4e3ee60da7d09566da925988b32edb712fc81cac55df6a2" exitCode=0 Dec 06 05:58:39 crc kubenswrapper[4733]: I1206 05:58:39.508203 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78d59ccb8c-62d2b" Dec 06 05:58:39 crc kubenswrapper[4733]: I1206 05:58:39.509339 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78d59ccb8c-62d2b" event={"ID":"715d93bf-4fc7-4bc5-adb5-4504c9c954ea","Type":"ContainerDied","Data":"584b18d9c567fe1af4e3ee60da7d09566da925988b32edb712fc81cac55df6a2"} Dec 06 05:58:39 crc kubenswrapper[4733]: I1206 05:58:39.509412 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78d59ccb8c-62d2b" event={"ID":"715d93bf-4fc7-4bc5-adb5-4504c9c954ea","Type":"ContainerDied","Data":"2473ad330c062a4cc840408168ba60d2f00daa78847d9d158c92924e0a6b49d2"} Dec 06 05:58:39 crc kubenswrapper[4733]: I1206 05:58:39.509446 4733 scope.go:117] "RemoveContainer" containerID="584b18d9c567fe1af4e3ee60da7d09566da925988b32edb712fc81cac55df6a2" Dec 06 05:58:39 crc kubenswrapper[4733]: I1206 05:58:39.529755 4733 scope.go:117] "RemoveContainer" containerID="53a209c10286b5bce883bc68720942acad435aab65cafd65765d491cf0372a0f" Dec 06 05:58:39 crc kubenswrapper[4733]: I1206 05:58:39.552290 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78d59ccb8c-62d2b"] Dec 06 05:58:39 crc kubenswrapper[4733]: I1206 05:58:39.558158 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78d59ccb8c-62d2b"] Dec 06 05:58:39 crc kubenswrapper[4733]: I1206 05:58:39.580381 4733 scope.go:117] "RemoveContainer" containerID="584b18d9c567fe1af4e3ee60da7d09566da925988b32edb712fc81cac55df6a2" Dec 06 05:58:39 crc kubenswrapper[4733]: E1206 05:58:39.580726 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"584b18d9c567fe1af4e3ee60da7d09566da925988b32edb712fc81cac55df6a2\": container with ID starting with 584b18d9c567fe1af4e3ee60da7d09566da925988b32edb712fc81cac55df6a2 not found: ID does not exist" 
containerID="584b18d9c567fe1af4e3ee60da7d09566da925988b32edb712fc81cac55df6a2" Dec 06 05:58:39 crc kubenswrapper[4733]: I1206 05:58:39.580776 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"584b18d9c567fe1af4e3ee60da7d09566da925988b32edb712fc81cac55df6a2"} err="failed to get container status \"584b18d9c567fe1af4e3ee60da7d09566da925988b32edb712fc81cac55df6a2\": rpc error: code = NotFound desc = could not find container \"584b18d9c567fe1af4e3ee60da7d09566da925988b32edb712fc81cac55df6a2\": container with ID starting with 584b18d9c567fe1af4e3ee60da7d09566da925988b32edb712fc81cac55df6a2 not found: ID does not exist" Dec 06 05:58:39 crc kubenswrapper[4733]: I1206 05:58:39.580796 4733 scope.go:117] "RemoveContainer" containerID="53a209c10286b5bce883bc68720942acad435aab65cafd65765d491cf0372a0f" Dec 06 05:58:39 crc kubenswrapper[4733]: E1206 05:58:39.581022 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53a209c10286b5bce883bc68720942acad435aab65cafd65765d491cf0372a0f\": container with ID starting with 53a209c10286b5bce883bc68720942acad435aab65cafd65765d491cf0372a0f not found: ID does not exist" containerID="53a209c10286b5bce883bc68720942acad435aab65cafd65765d491cf0372a0f" Dec 06 05:58:39 crc kubenswrapper[4733]: I1206 05:58:39.581043 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53a209c10286b5bce883bc68720942acad435aab65cafd65765d491cf0372a0f"} err="failed to get container status \"53a209c10286b5bce883bc68720942acad435aab65cafd65765d491cf0372a0f\": rpc error: code = NotFound desc = could not find container \"53a209c10286b5bce883bc68720942acad435aab65cafd65765d491cf0372a0f\": container with ID starting with 53a209c10286b5bce883bc68720942acad435aab65cafd65765d491cf0372a0f not found: ID does not exist" Dec 06 05:58:40 crc kubenswrapper[4733]: I1206 05:58:40.496702 4733 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="715d93bf-4fc7-4bc5-adb5-4504c9c954ea" path="/var/lib/kubelet/pods/715d93bf-4fc7-4bc5-adb5-4504c9c954ea/volumes" Dec 06 05:58:41 crc kubenswrapper[4733]: I1206 05:58:41.723995 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-d5pfq" Dec 06 05:58:41 crc kubenswrapper[4733]: I1206 05:58:41.724061 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-d5pfq" Dec 06 05:58:41 crc kubenswrapper[4733]: I1206 05:58:41.761380 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-d5pfq" Dec 06 05:58:42 crc kubenswrapper[4733]: I1206 05:58:42.543081 4733 generic.go:334] "Generic (PLEG): container finished" podID="1d720cd5-bb4e-449f-86a3-c9cff2acfada" containerID="a852f72e6ead0bb83b4c284aef97baada3aa6118c577cb3e8075c2fd1235f3f4" exitCode=0 Dec 06 05:58:42 crc kubenswrapper[4733]: I1206 05:58:42.543143 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-s28f8" event={"ID":"1d720cd5-bb4e-449f-86a3-c9cff2acfada","Type":"ContainerDied","Data":"a852f72e6ead0bb83b4c284aef97baada3aa6118c577cb3e8075c2fd1235f3f4"} Dec 06 05:58:42 crc kubenswrapper[4733]: I1206 05:58:42.579534 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-d5pfq" Dec 06 05:58:42 crc kubenswrapper[4733]: I1206 05:58:42.707548 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-7xlbm" Dec 06 05:58:42 crc kubenswrapper[4733]: I1206 05:58:42.724887 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2f6ce07-95b4-4dcc-8cc1-8847d0c70f9d-operator-scripts\") pod \"b2f6ce07-95b4-4dcc-8cc1-8847d0c70f9d\" (UID: \"b2f6ce07-95b4-4dcc-8cc1-8847d0c70f9d\") " Dec 06 05:58:42 crc kubenswrapper[4733]: I1206 05:58:42.725152 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sm4gh\" (UniqueName: \"kubernetes.io/projected/b2f6ce07-95b4-4dcc-8cc1-8847d0c70f9d-kube-api-access-sm4gh\") pod \"b2f6ce07-95b4-4dcc-8cc1-8847d0c70f9d\" (UID: \"b2f6ce07-95b4-4dcc-8cc1-8847d0c70f9d\") " Dec 06 05:58:42 crc kubenswrapper[4733]: I1206 05:58:42.726138 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2f6ce07-95b4-4dcc-8cc1-8847d0c70f9d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b2f6ce07-95b4-4dcc-8cc1-8847d0c70f9d" (UID: "b2f6ce07-95b4-4dcc-8cc1-8847d0c70f9d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:58:42 crc kubenswrapper[4733]: I1206 05:58:42.736546 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-g4cdr" Dec 06 05:58:42 crc kubenswrapper[4733]: I1206 05:58:42.737489 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2f6ce07-95b4-4dcc-8cc1-8847d0c70f9d-kube-api-access-sm4gh" (OuterVolumeSpecName: "kube-api-access-sm4gh") pod "b2f6ce07-95b4-4dcc-8cc1-8847d0c70f9d" (UID: "b2f6ce07-95b4-4dcc-8cc1-8847d0c70f9d"). InnerVolumeSpecName "kube-api-access-sm4gh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:58:42 crc kubenswrapper[4733]: I1206 05:58:42.793084 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-cd13-account-create-update-g68t4" Dec 06 05:58:42 crc kubenswrapper[4733]: I1206 05:58:42.797334 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-77pt9" Dec 06 05:58:42 crc kubenswrapper[4733]: I1206 05:58:42.804269 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-153e-account-create-update-lxx5v" Dec 06 05:58:42 crc kubenswrapper[4733]: I1206 05:58:42.809560 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-9d84-account-create-update-zfm9s" Dec 06 05:58:42 crc kubenswrapper[4733]: I1206 05:58:42.827035 4733 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2f6ce07-95b4-4dcc-8cc1-8847d0c70f9d-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 05:58:42 crc kubenswrapper[4733]: I1206 05:58:42.827182 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sm4gh\" (UniqueName: \"kubernetes.io/projected/b2f6ce07-95b4-4dcc-8cc1-8847d0c70f9d-kube-api-access-sm4gh\") on node \"crc\" DevicePath \"\"" Dec 06 05:58:42 crc kubenswrapper[4733]: I1206 05:58:42.928463 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91584c47-2491-4388-9ce4-e76d4ef92afd-operator-scripts\") pod \"91584c47-2491-4388-9ce4-e76d4ef92afd\" (UID: \"91584c47-2491-4388-9ce4-e76d4ef92afd\") " Dec 06 05:58:42 crc kubenswrapper[4733]: I1206 05:58:42.928638 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/0701e468-3c5a-4812-af28-36d85baf6756-operator-scripts\") pod \"0701e468-3c5a-4812-af28-36d85baf6756\" (UID: \"0701e468-3c5a-4812-af28-36d85baf6756\") " Dec 06 05:58:42 crc kubenswrapper[4733]: I1206 05:58:42.928766 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkshd\" (UniqueName: \"kubernetes.io/projected/0701e468-3c5a-4812-af28-36d85baf6756-kube-api-access-hkshd\") pod \"0701e468-3c5a-4812-af28-36d85baf6756\" (UID: \"0701e468-3c5a-4812-af28-36d85baf6756\") " Dec 06 05:58:42 crc kubenswrapper[4733]: I1206 05:58:42.928802 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkjx6\" (UniqueName: \"kubernetes.io/projected/16b6231d-2829-46b8-b8cf-ce76dc4ad424-kube-api-access-jkjx6\") pod \"16b6231d-2829-46b8-b8cf-ce76dc4ad424\" (UID: \"16b6231d-2829-46b8-b8cf-ce76dc4ad424\") " Dec 06 05:58:42 crc kubenswrapper[4733]: I1206 05:58:42.928835 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7f54\" (UniqueName: \"kubernetes.io/projected/91584c47-2491-4388-9ce4-e76d4ef92afd-kube-api-access-v7f54\") pod \"91584c47-2491-4388-9ce4-e76d4ef92afd\" (UID: \"91584c47-2491-4388-9ce4-e76d4ef92afd\") " Dec 06 05:58:42 crc kubenswrapper[4733]: I1206 05:58:42.928856 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16b6231d-2829-46b8-b8cf-ce76dc4ad424-operator-scripts\") pod \"16b6231d-2829-46b8-b8cf-ce76dc4ad424\" (UID: \"16b6231d-2829-46b8-b8cf-ce76dc4ad424\") " Dec 06 05:58:42 crc kubenswrapper[4733]: I1206 05:58:42.928879 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ws7vm\" (UniqueName: \"kubernetes.io/projected/fd259bcc-2452-4b18-9019-072e545719bf-kube-api-access-ws7vm\") pod \"fd259bcc-2452-4b18-9019-072e545719bf\" (UID: 
\"fd259bcc-2452-4b18-9019-072e545719bf\") " Dec 06 05:58:42 crc kubenswrapper[4733]: I1206 05:58:42.928916 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68d41a55-121b-462f-8c49-a014cbcd5cd5-operator-scripts\") pod \"68d41a55-121b-462f-8c49-a014cbcd5cd5\" (UID: \"68d41a55-121b-462f-8c49-a014cbcd5cd5\") " Dec 06 05:58:42 crc kubenswrapper[4733]: I1206 05:58:42.928949 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd259bcc-2452-4b18-9019-072e545719bf-operator-scripts\") pod \"fd259bcc-2452-4b18-9019-072e545719bf\" (UID: \"fd259bcc-2452-4b18-9019-072e545719bf\") " Dec 06 05:58:42 crc kubenswrapper[4733]: I1206 05:58:42.928979 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fxbd\" (UniqueName: \"kubernetes.io/projected/68d41a55-121b-462f-8c49-a014cbcd5cd5-kube-api-access-8fxbd\") pod \"68d41a55-121b-462f-8c49-a014cbcd5cd5\" (UID: \"68d41a55-121b-462f-8c49-a014cbcd5cd5\") " Dec 06 05:58:42 crc kubenswrapper[4733]: I1206 05:58:42.929358 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91584c47-2491-4388-9ce4-e76d4ef92afd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "91584c47-2491-4388-9ce4-e76d4ef92afd" (UID: "91584c47-2491-4388-9ce4-e76d4ef92afd"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:58:42 crc kubenswrapper[4733]: I1206 05:58:42.929473 4733 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91584c47-2491-4388-9ce4-e76d4ef92afd-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 05:58:42 crc kubenswrapper[4733]: I1206 05:58:42.929604 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0701e468-3c5a-4812-af28-36d85baf6756-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0701e468-3c5a-4812-af28-36d85baf6756" (UID: "0701e468-3c5a-4812-af28-36d85baf6756"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:58:42 crc kubenswrapper[4733]: I1206 05:58:42.929742 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16b6231d-2829-46b8-b8cf-ce76dc4ad424-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "16b6231d-2829-46b8-b8cf-ce76dc4ad424" (UID: "16b6231d-2829-46b8-b8cf-ce76dc4ad424"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:58:42 crc kubenswrapper[4733]: I1206 05:58:42.929824 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68d41a55-121b-462f-8c49-a014cbcd5cd5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "68d41a55-121b-462f-8c49-a014cbcd5cd5" (UID: "68d41a55-121b-462f-8c49-a014cbcd5cd5"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:58:42 crc kubenswrapper[4733]: I1206 05:58:42.930194 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd259bcc-2452-4b18-9019-072e545719bf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fd259bcc-2452-4b18-9019-072e545719bf" (UID: "fd259bcc-2452-4b18-9019-072e545719bf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:58:42 crc kubenswrapper[4733]: I1206 05:58:42.933601 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16b6231d-2829-46b8-b8cf-ce76dc4ad424-kube-api-access-jkjx6" (OuterVolumeSpecName: "kube-api-access-jkjx6") pod "16b6231d-2829-46b8-b8cf-ce76dc4ad424" (UID: "16b6231d-2829-46b8-b8cf-ce76dc4ad424"). InnerVolumeSpecName "kube-api-access-jkjx6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:58:42 crc kubenswrapper[4733]: I1206 05:58:42.933627 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91584c47-2491-4388-9ce4-e76d4ef92afd-kube-api-access-v7f54" (OuterVolumeSpecName: "kube-api-access-v7f54") pod "91584c47-2491-4388-9ce4-e76d4ef92afd" (UID: "91584c47-2491-4388-9ce4-e76d4ef92afd"). InnerVolumeSpecName "kube-api-access-v7f54". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:58:42 crc kubenswrapper[4733]: I1206 05:58:42.934144 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0701e468-3c5a-4812-af28-36d85baf6756-kube-api-access-hkshd" (OuterVolumeSpecName: "kube-api-access-hkshd") pod "0701e468-3c5a-4812-af28-36d85baf6756" (UID: "0701e468-3c5a-4812-af28-36d85baf6756"). InnerVolumeSpecName "kube-api-access-hkshd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:58:42 crc kubenswrapper[4733]: I1206 05:58:42.940479 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd259bcc-2452-4b18-9019-072e545719bf-kube-api-access-ws7vm" (OuterVolumeSpecName: "kube-api-access-ws7vm") pod "fd259bcc-2452-4b18-9019-072e545719bf" (UID: "fd259bcc-2452-4b18-9019-072e545719bf"). InnerVolumeSpecName "kube-api-access-ws7vm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:58:42 crc kubenswrapper[4733]: I1206 05:58:42.942471 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68d41a55-121b-462f-8c49-a014cbcd5cd5-kube-api-access-8fxbd" (OuterVolumeSpecName: "kube-api-access-8fxbd") pod "68d41a55-121b-462f-8c49-a014cbcd5cd5" (UID: "68d41a55-121b-462f-8c49-a014cbcd5cd5"). InnerVolumeSpecName "kube-api-access-8fxbd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:58:42 crc kubenswrapper[4733]: I1206 05:58:42.958440 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d5pfq"] Dec 06 05:58:43 crc kubenswrapper[4733]: I1206 05:58:43.031065 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hkshd\" (UniqueName: \"kubernetes.io/projected/0701e468-3c5a-4812-af28-36d85baf6756-kube-api-access-hkshd\") on node \"crc\" DevicePath \"\"" Dec 06 05:58:43 crc kubenswrapper[4733]: I1206 05:58:43.031442 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkjx6\" (UniqueName: \"kubernetes.io/projected/16b6231d-2829-46b8-b8cf-ce76dc4ad424-kube-api-access-jkjx6\") on node \"crc\" DevicePath \"\"" Dec 06 05:58:43 crc kubenswrapper[4733]: I1206 05:58:43.031502 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7f54\" (UniqueName: \"kubernetes.io/projected/91584c47-2491-4388-9ce4-e76d4ef92afd-kube-api-access-v7f54\") on node \"crc\" DevicePath \"\"" 
Dec 06 05:58:43 crc kubenswrapper[4733]: I1206 05:58:43.031560 4733 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16b6231d-2829-46b8-b8cf-ce76dc4ad424-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 05:58:43 crc kubenswrapper[4733]: I1206 05:58:43.031612 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ws7vm\" (UniqueName: \"kubernetes.io/projected/fd259bcc-2452-4b18-9019-072e545719bf-kube-api-access-ws7vm\") on node \"crc\" DevicePath \"\"" Dec 06 05:58:43 crc kubenswrapper[4733]: I1206 05:58:43.031662 4733 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68d41a55-121b-462f-8c49-a014cbcd5cd5-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 05:58:43 crc kubenswrapper[4733]: I1206 05:58:43.031723 4733 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd259bcc-2452-4b18-9019-072e545719bf-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 05:58:43 crc kubenswrapper[4733]: I1206 05:58:43.031786 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fxbd\" (UniqueName: \"kubernetes.io/projected/68d41a55-121b-462f-8c49-a014cbcd5cd5-kube-api-access-8fxbd\") on node \"crc\" DevicePath \"\"" Dec 06 05:58:43 crc kubenswrapper[4733]: I1206 05:58:43.031836 4733 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0701e468-3c5a-4812-af28-36d85baf6756-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 05:58:43 crc kubenswrapper[4733]: I1206 05:58:43.555442 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-cd13-account-create-update-g68t4" event={"ID":"91584c47-2491-4388-9ce4-e76d4ef92afd","Type":"ContainerDied","Data":"4301617f12d91c579b9062d1dc7e004f6a1f25c9b6167f2fe616052fc38448ea"} Dec 06 05:58:43 crc 
kubenswrapper[4733]: I1206 05:58:43.555483 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4301617f12d91c579b9062d1dc7e004f6a1f25c9b6167f2fe616052fc38448ea" Dec 06 05:58:43 crc kubenswrapper[4733]: I1206 05:58:43.555462 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-cd13-account-create-update-g68t4" Dec 06 05:58:43 crc kubenswrapper[4733]: I1206 05:58:43.557662 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-77pt9" event={"ID":"16b6231d-2829-46b8-b8cf-ce76dc4ad424","Type":"ContainerDied","Data":"efa3f3b264644d348e6fb4c33ae6ec529b891d157f03074bac2e0aa6fc5456a2"} Dec 06 05:58:43 crc kubenswrapper[4733]: I1206 05:58:43.557695 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="efa3f3b264644d348e6fb4c33ae6ec529b891d157f03074bac2e0aa6fc5456a2" Dec 06 05:58:43 crc kubenswrapper[4733]: I1206 05:58:43.557765 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-77pt9" Dec 06 05:58:43 crc kubenswrapper[4733]: I1206 05:58:43.560396 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-153e-account-create-update-lxx5v" event={"ID":"0701e468-3c5a-4812-af28-36d85baf6756","Type":"ContainerDied","Data":"abff2ce61067e81f991e5a46d6c7f204a32dcb4f09f1de85e216a8f113346aee"} Dec 06 05:58:43 crc kubenswrapper[4733]: I1206 05:58:43.560436 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="abff2ce61067e81f991e5a46d6c7f204a32dcb4f09f1de85e216a8f113346aee" Dec 06 05:58:43 crc kubenswrapper[4733]: I1206 05:58:43.560475 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-153e-account-create-update-lxx5v" Dec 06 05:58:43 crc kubenswrapper[4733]: I1206 05:58:43.563704 4733 generic.go:334] "Generic (PLEG): container finished" podID="a08a3e05-bf85-4e28-bbe1-9a9675b9efd9" containerID="0dc407f3dc964b48d37c5d8263b8c2c9e184820cf73159919d3f31eb654579d7" exitCode=0 Dec 06 05:58:43 crc kubenswrapper[4733]: I1206 05:58:43.563771 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nrhdv" event={"ID":"a08a3e05-bf85-4e28-bbe1-9a9675b9efd9","Type":"ContainerDied","Data":"0dc407f3dc964b48d37c5d8263b8c2c9e184820cf73159919d3f31eb654579d7"} Dec 06 05:58:43 crc kubenswrapper[4733]: I1206 05:58:43.565700 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-7xlbm" event={"ID":"b2f6ce07-95b4-4dcc-8cc1-8847d0c70f9d","Type":"ContainerDied","Data":"3b9aca5a48833864efaa83862c9f4bd497b0dcb0c36d0e2a4572823cf69cd1f9"} Dec 06 05:58:43 crc kubenswrapper[4733]: I1206 05:58:43.565743 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b9aca5a48833864efaa83862c9f4bd497b0dcb0c36d0e2a4572823cf69cd1f9" Dec 06 05:58:43 crc kubenswrapper[4733]: I1206 05:58:43.565808 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-7xlbm" Dec 06 05:58:43 crc kubenswrapper[4733]: I1206 05:58:43.578810 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-9d84-account-create-update-zfm9s" event={"ID":"fd259bcc-2452-4b18-9019-072e545719bf","Type":"ContainerDied","Data":"f15c73ef5354296e09e4b11cb561ce854752445863e2f7c108f7ca788a9e6291"} Dec 06 05:58:43 crc kubenswrapper[4733]: I1206 05:58:43.578848 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f15c73ef5354296e09e4b11cb561ce854752445863e2f7c108f7ca788a9e6291" Dec 06 05:58:43 crc kubenswrapper[4733]: I1206 05:58:43.578902 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-9d84-account-create-update-zfm9s" Dec 06 05:58:43 crc kubenswrapper[4733]: I1206 05:58:43.586329 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-g4cdr" event={"ID":"68d41a55-121b-462f-8c49-a014cbcd5cd5","Type":"ContainerDied","Data":"931e06ddd020d2528b5867f9c3bc8977519b537fe7f9f3ece51318a7247c4be8"} Dec 06 05:58:43 crc kubenswrapper[4733]: I1206 05:58:43.586357 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="931e06ddd020d2528b5867f9c3bc8977519b537fe7f9f3ece51318a7247c4be8" Dec 06 05:58:43 crc kubenswrapper[4733]: I1206 05:58:43.586373 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-g4cdr" Dec 06 05:58:43 crc kubenswrapper[4733]: I1206 05:58:43.871842 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-s28f8" Dec 06 05:58:44 crc kubenswrapper[4733]: I1206 05:58:44.048112 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d720cd5-bb4e-449f-86a3-c9cff2acfada-combined-ca-bundle\") pod \"1d720cd5-bb4e-449f-86a3-c9cff2acfada\" (UID: \"1d720cd5-bb4e-449f-86a3-c9cff2acfada\") " Dec 06 05:58:44 crc kubenswrapper[4733]: I1206 05:58:44.048216 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1d720cd5-bb4e-449f-86a3-c9cff2acfada-scripts\") pod \"1d720cd5-bb4e-449f-86a3-c9cff2acfada\" (UID: \"1d720cd5-bb4e-449f-86a3-c9cff2acfada\") " Dec 06 05:58:44 crc kubenswrapper[4733]: I1206 05:58:44.048324 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1d720cd5-bb4e-449f-86a3-c9cff2acfada-etc-swift\") pod \"1d720cd5-bb4e-449f-86a3-c9cff2acfada\" (UID: \"1d720cd5-bb4e-449f-86a3-c9cff2acfada\") " Dec 06 05:58:44 crc kubenswrapper[4733]: I1206 05:58:44.048463 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1d720cd5-bb4e-449f-86a3-c9cff2acfada-dispersionconf\") pod \"1d720cd5-bb4e-449f-86a3-c9cff2acfada\" (UID: \"1d720cd5-bb4e-449f-86a3-c9cff2acfada\") " Dec 06 05:58:44 crc kubenswrapper[4733]: I1206 05:58:44.048554 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbf6t\" (UniqueName: \"kubernetes.io/projected/1d720cd5-bb4e-449f-86a3-c9cff2acfada-kube-api-access-rbf6t\") pod \"1d720cd5-bb4e-449f-86a3-c9cff2acfada\" (UID: \"1d720cd5-bb4e-449f-86a3-c9cff2acfada\") " Dec 06 05:58:44 crc kubenswrapper[4733]: I1206 05:58:44.049556 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" 
(UniqueName: \"kubernetes.io/secret/1d720cd5-bb4e-449f-86a3-c9cff2acfada-swiftconf\") pod \"1d720cd5-bb4e-449f-86a3-c9cff2acfada\" (UID: \"1d720cd5-bb4e-449f-86a3-c9cff2acfada\") " Dec 06 05:58:44 crc kubenswrapper[4733]: I1206 05:58:44.049550 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d720cd5-bb4e-449f-86a3-c9cff2acfada-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "1d720cd5-bb4e-449f-86a3-c9cff2acfada" (UID: "1d720cd5-bb4e-449f-86a3-c9cff2acfada"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 05:58:44 crc kubenswrapper[4733]: I1206 05:58:44.049694 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1d720cd5-bb4e-449f-86a3-c9cff2acfada-ring-data-devices\") pod \"1d720cd5-bb4e-449f-86a3-c9cff2acfada\" (UID: \"1d720cd5-bb4e-449f-86a3-c9cff2acfada\") " Dec 06 05:58:44 crc kubenswrapper[4733]: I1206 05:58:44.050106 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d720cd5-bb4e-449f-86a3-c9cff2acfada-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "1d720cd5-bb4e-449f-86a3-c9cff2acfada" (UID: "1d720cd5-bb4e-449f-86a3-c9cff2acfada"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:58:44 crc kubenswrapper[4733]: I1206 05:58:44.051058 4733 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1d720cd5-bb4e-449f-86a3-c9cff2acfada-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 06 05:58:44 crc kubenswrapper[4733]: I1206 05:58:44.051088 4733 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1d720cd5-bb4e-449f-86a3-c9cff2acfada-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 06 05:58:44 crc kubenswrapper[4733]: I1206 05:58:44.054362 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d720cd5-bb4e-449f-86a3-c9cff2acfada-kube-api-access-rbf6t" (OuterVolumeSpecName: "kube-api-access-rbf6t") pod "1d720cd5-bb4e-449f-86a3-c9cff2acfada" (UID: "1d720cd5-bb4e-449f-86a3-c9cff2acfada"). InnerVolumeSpecName "kube-api-access-rbf6t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:58:44 crc kubenswrapper[4733]: I1206 05:58:44.059702 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d720cd5-bb4e-449f-86a3-c9cff2acfada-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "1d720cd5-bb4e-449f-86a3-c9cff2acfada" (UID: "1d720cd5-bb4e-449f-86a3-c9cff2acfada"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:58:44 crc kubenswrapper[4733]: I1206 05:58:44.069380 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d720cd5-bb4e-449f-86a3-c9cff2acfada-scripts" (OuterVolumeSpecName: "scripts") pod "1d720cd5-bb4e-449f-86a3-c9cff2acfada" (UID: "1d720cd5-bb4e-449f-86a3-c9cff2acfada"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:58:44 crc kubenswrapper[4733]: I1206 05:58:44.071623 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d720cd5-bb4e-449f-86a3-c9cff2acfada-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "1d720cd5-bb4e-449f-86a3-c9cff2acfada" (UID: "1d720cd5-bb4e-449f-86a3-c9cff2acfada"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:58:44 crc kubenswrapper[4733]: I1206 05:58:44.072811 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d720cd5-bb4e-449f-86a3-c9cff2acfada-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1d720cd5-bb4e-449f-86a3-c9cff2acfada" (UID: "1d720cd5-bb4e-449f-86a3-c9cff2acfada"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:58:44 crc kubenswrapper[4733]: I1206 05:58:44.154349 4733 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1d720cd5-bb4e-449f-86a3-c9cff2acfada-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 05:58:44 crc kubenswrapper[4733]: I1206 05:58:44.154407 4733 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1d720cd5-bb4e-449f-86a3-c9cff2acfada-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 06 05:58:44 crc kubenswrapper[4733]: I1206 05:58:44.154427 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbf6t\" (UniqueName: \"kubernetes.io/projected/1d720cd5-bb4e-449f-86a3-c9cff2acfada-kube-api-access-rbf6t\") on node \"crc\" DevicePath \"\"" Dec 06 05:58:44 crc kubenswrapper[4733]: I1206 05:58:44.154437 4733 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1d720cd5-bb4e-449f-86a3-c9cff2acfada-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 06 05:58:44 
crc kubenswrapper[4733]: I1206 05:58:44.154449 4733 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d720cd5-bb4e-449f-86a3-c9cff2acfada-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 05:58:44 crc kubenswrapper[4733]: I1206 05:58:44.599485 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nrhdv" event={"ID":"a08a3e05-bf85-4e28-bbe1-9a9675b9efd9","Type":"ContainerStarted","Data":"80be68972eaf6baf773bdb705dcb5e3783a2358bad3a8c4fc392d7a8f1c2cb5d"} Dec 06 05:58:44 crc kubenswrapper[4733]: I1206 05:58:44.601499 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-s28f8" Dec 06 05:58:44 crc kubenswrapper[4733]: I1206 05:58:44.601555 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-s28f8" event={"ID":"1d720cd5-bb4e-449f-86a3-c9cff2acfada","Type":"ContainerDied","Data":"398fe7eead7ecc2480018daf0e80c38440c21a9ee5e429ebdc725f1b6a80b2b8"} Dec 06 05:58:44 crc kubenswrapper[4733]: I1206 05:58:44.601593 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="398fe7eead7ecc2480018daf0e80c38440c21a9ee5e429ebdc725f1b6a80b2b8" Dec 06 05:58:44 crc kubenswrapper[4733]: I1206 05:58:44.601693 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-d5pfq" podUID="d6d5166f-3501-4698-8da3-499dad07de3b" containerName="registry-server" containerID="cri-o://bc638ab8c7a3727ebde19447a2df8e8382a911997b61db1490c078941f1405df" gracePeriod=2 Dec 06 05:58:44 crc kubenswrapper[4733]: I1206 05:58:44.622688 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nrhdv" podStartSLOduration=2.967510207 podStartE2EDuration="10.622674513s" podCreationTimestamp="2025-12-06 05:58:34 +0000 UTC" 
firstStartedPulling="2025-12-06 05:58:36.401574968 +0000 UTC m=+900.266786079" lastFinishedPulling="2025-12-06 05:58:44.056739274 +0000 UTC m=+907.921950385" observedRunningTime="2025-12-06 05:58:44.616513628 +0000 UTC m=+908.481724740" watchObservedRunningTime="2025-12-06 05:58:44.622674513 +0000 UTC m=+908.487885624" Dec 06 05:58:45 crc kubenswrapper[4733]: I1206 05:58:45.027510 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d5pfq" Dec 06 05:58:45 crc kubenswrapper[4733]: I1206 05:58:45.177241 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6d5166f-3501-4698-8da3-499dad07de3b-catalog-content\") pod \"d6d5166f-3501-4698-8da3-499dad07de3b\" (UID: \"d6d5166f-3501-4698-8da3-499dad07de3b\") " Dec 06 05:58:45 crc kubenswrapper[4733]: I1206 05:58:45.177645 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5p88\" (UniqueName: \"kubernetes.io/projected/d6d5166f-3501-4698-8da3-499dad07de3b-kube-api-access-j5p88\") pod \"d6d5166f-3501-4698-8da3-499dad07de3b\" (UID: \"d6d5166f-3501-4698-8da3-499dad07de3b\") " Dec 06 05:58:45 crc kubenswrapper[4733]: I1206 05:58:45.177698 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6d5166f-3501-4698-8da3-499dad07de3b-utilities\") pod \"d6d5166f-3501-4698-8da3-499dad07de3b\" (UID: \"d6d5166f-3501-4698-8da3-499dad07de3b\") " Dec 06 05:58:45 crc kubenswrapper[4733]: I1206 05:58:45.178510 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6d5166f-3501-4698-8da3-499dad07de3b-utilities" (OuterVolumeSpecName: "utilities") pod "d6d5166f-3501-4698-8da3-499dad07de3b" (UID: "d6d5166f-3501-4698-8da3-499dad07de3b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 05:58:45 crc kubenswrapper[4733]: I1206 05:58:45.198463 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6d5166f-3501-4698-8da3-499dad07de3b-kube-api-access-j5p88" (OuterVolumeSpecName: "kube-api-access-j5p88") pod "d6d5166f-3501-4698-8da3-499dad07de3b" (UID: "d6d5166f-3501-4698-8da3-499dad07de3b"). InnerVolumeSpecName "kube-api-access-j5p88". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:58:45 crc kubenswrapper[4733]: I1206 05:58:45.201879 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6d5166f-3501-4698-8da3-499dad07de3b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d6d5166f-3501-4698-8da3-499dad07de3b" (UID: "d6d5166f-3501-4698-8da3-499dad07de3b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 05:58:45 crc kubenswrapper[4733]: I1206 05:58:45.280434 4733 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6d5166f-3501-4698-8da3-499dad07de3b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 05:58:45 crc kubenswrapper[4733]: I1206 05:58:45.280476 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5p88\" (UniqueName: \"kubernetes.io/projected/d6d5166f-3501-4698-8da3-499dad07de3b-kube-api-access-j5p88\") on node \"crc\" DevicePath \"\"" Dec 06 05:58:45 crc kubenswrapper[4733]: I1206 05:58:45.280488 4733 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6d5166f-3501-4698-8da3-499dad07de3b-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 05:58:45 crc kubenswrapper[4733]: I1206 05:58:45.368480 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nrhdv" Dec 06 05:58:45 crc 
kubenswrapper[4733]: I1206 05:58:45.368530 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nrhdv" Dec 06 05:58:45 crc kubenswrapper[4733]: I1206 05:58:45.483789 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a55915f4-28cf-4343-aefa-e6b145b3ccf1-etc-swift\") pod \"swift-storage-0\" (UID: \"a55915f4-28cf-4343-aefa-e6b145b3ccf1\") " pod="openstack/swift-storage-0" Dec 06 05:58:45 crc kubenswrapper[4733]: I1206 05:58:45.489988 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a55915f4-28cf-4343-aefa-e6b145b3ccf1-etc-swift\") pod \"swift-storage-0\" (UID: \"a55915f4-28cf-4343-aefa-e6b145b3ccf1\") " pod="openstack/swift-storage-0" Dec 06 05:58:45 crc kubenswrapper[4733]: I1206 05:58:45.610423 4733 generic.go:334] "Generic (PLEG): container finished" podID="d6d5166f-3501-4698-8da3-499dad07de3b" containerID="bc638ab8c7a3727ebde19447a2df8e8382a911997b61db1490c078941f1405df" exitCode=0 Dec 06 05:58:45 crc kubenswrapper[4733]: I1206 05:58:45.610518 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d5pfq" event={"ID":"d6d5166f-3501-4698-8da3-499dad07de3b","Type":"ContainerDied","Data":"bc638ab8c7a3727ebde19447a2df8e8382a911997b61db1490c078941f1405df"} Dec 06 05:58:45 crc kubenswrapper[4733]: I1206 05:58:45.610572 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d5pfq" event={"ID":"d6d5166f-3501-4698-8da3-499dad07de3b","Type":"ContainerDied","Data":"405a31216d6ff3ad5bc90d8b7dbc032a33ebb1e6df82d1fe4e9f5efcaa59330c"} Dec 06 05:58:45 crc kubenswrapper[4733]: I1206 05:58:45.610592 4733 scope.go:117] "RemoveContainer" containerID="bc638ab8c7a3727ebde19447a2df8e8382a911997b61db1490c078941f1405df" Dec 06 05:58:45 crc kubenswrapper[4733]: I1206 
05:58:45.610853 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d5pfq" Dec 06 05:58:45 crc kubenswrapper[4733]: I1206 05:58:45.647062 4733 scope.go:117] "RemoveContainer" containerID="8a881ff0c509b409332e18805e1419abe48952cc15d34c4d51a91b8870fe8f61" Dec 06 05:58:45 crc kubenswrapper[4733]: I1206 05:58:45.653503 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d5pfq"] Dec 06 05:58:45 crc kubenswrapper[4733]: I1206 05:58:45.660542 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-d5pfq"] Dec 06 05:58:45 crc kubenswrapper[4733]: I1206 05:58:45.687258 4733 scope.go:117] "RemoveContainer" containerID="11dfd5c6b59fee33b84322bfe7efe7dc366e86ddc61994ade296c8177f08be3c" Dec 06 05:58:45 crc kubenswrapper[4733]: I1206 05:58:45.701422 4733 scope.go:117] "RemoveContainer" containerID="bc638ab8c7a3727ebde19447a2df8e8382a911997b61db1490c078941f1405df" Dec 06 05:58:45 crc kubenswrapper[4733]: E1206 05:58:45.701816 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc638ab8c7a3727ebde19447a2df8e8382a911997b61db1490c078941f1405df\": container with ID starting with bc638ab8c7a3727ebde19447a2df8e8382a911997b61db1490c078941f1405df not found: ID does not exist" containerID="bc638ab8c7a3727ebde19447a2df8e8382a911997b61db1490c078941f1405df" Dec 06 05:58:45 crc kubenswrapper[4733]: I1206 05:58:45.701855 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc638ab8c7a3727ebde19447a2df8e8382a911997b61db1490c078941f1405df"} err="failed to get container status \"bc638ab8c7a3727ebde19447a2df8e8382a911997b61db1490c078941f1405df\": rpc error: code = NotFound desc = could not find container \"bc638ab8c7a3727ebde19447a2df8e8382a911997b61db1490c078941f1405df\": container with ID starting with 
bc638ab8c7a3727ebde19447a2df8e8382a911997b61db1490c078941f1405df not found: ID does not exist" Dec 06 05:58:45 crc kubenswrapper[4733]: I1206 05:58:45.701881 4733 scope.go:117] "RemoveContainer" containerID="8a881ff0c509b409332e18805e1419abe48952cc15d34c4d51a91b8870fe8f61" Dec 06 05:58:45 crc kubenswrapper[4733]: E1206 05:58:45.702283 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a881ff0c509b409332e18805e1419abe48952cc15d34c4d51a91b8870fe8f61\": container with ID starting with 8a881ff0c509b409332e18805e1419abe48952cc15d34c4d51a91b8870fe8f61 not found: ID does not exist" containerID="8a881ff0c509b409332e18805e1419abe48952cc15d34c4d51a91b8870fe8f61" Dec 06 05:58:45 crc kubenswrapper[4733]: I1206 05:58:45.702341 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a881ff0c509b409332e18805e1419abe48952cc15d34c4d51a91b8870fe8f61"} err="failed to get container status \"8a881ff0c509b409332e18805e1419abe48952cc15d34c4d51a91b8870fe8f61\": rpc error: code = NotFound desc = could not find container \"8a881ff0c509b409332e18805e1419abe48952cc15d34c4d51a91b8870fe8f61\": container with ID starting with 8a881ff0c509b409332e18805e1419abe48952cc15d34c4d51a91b8870fe8f61 not found: ID does not exist" Dec 06 05:58:45 crc kubenswrapper[4733]: I1206 05:58:45.702367 4733 scope.go:117] "RemoveContainer" containerID="11dfd5c6b59fee33b84322bfe7efe7dc366e86ddc61994ade296c8177f08be3c" Dec 06 05:58:45 crc kubenswrapper[4733]: E1206 05:58:45.702638 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11dfd5c6b59fee33b84322bfe7efe7dc366e86ddc61994ade296c8177f08be3c\": container with ID starting with 11dfd5c6b59fee33b84322bfe7efe7dc366e86ddc61994ade296c8177f08be3c not found: ID does not exist" containerID="11dfd5c6b59fee33b84322bfe7efe7dc366e86ddc61994ade296c8177f08be3c" Dec 06 05:58:45 crc 
kubenswrapper[4733]: I1206 05:58:45.702661 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11dfd5c6b59fee33b84322bfe7efe7dc366e86ddc61994ade296c8177f08be3c"} err="failed to get container status \"11dfd5c6b59fee33b84322bfe7efe7dc366e86ddc61994ade296c8177f08be3c\": rpc error: code = NotFound desc = could not find container \"11dfd5c6b59fee33b84322bfe7efe7dc366e86ddc61994ade296c8177f08be3c\": container with ID starting with 11dfd5c6b59fee33b84322bfe7efe7dc366e86ddc61994ade296c8177f08be3c not found: ID does not exist" Dec 06 05:58:45 crc kubenswrapper[4733]: I1206 05:58:45.739997 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 06 05:58:46 crc kubenswrapper[4733]: I1206 05:58:46.201639 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 06 05:58:46 crc kubenswrapper[4733]: W1206 05:58:46.205360 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda55915f4_28cf_4343_aefa_e6b145b3ccf1.slice/crio-fd14a47e90418bc00818b94b7a13adfcc48b33b1a30d13577ddbcdf2a8f7f1ec WatchSource:0}: Error finding container fd14a47e90418bc00818b94b7a13adfcc48b33b1a30d13577ddbcdf2a8f7f1ec: Status 404 returned error can't find the container with id fd14a47e90418bc00818b94b7a13adfcc48b33b1a30d13577ddbcdf2a8f7f1ec Dec 06 05:58:46 crc kubenswrapper[4733]: I1206 05:58:46.413376 4733 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-nrhdv" podUID="a08a3e05-bf85-4e28-bbe1-9a9675b9efd9" containerName="registry-server" probeResult="failure" output=< Dec 06 05:58:46 crc kubenswrapper[4733]: timeout: failed to connect service ":50051" within 1s Dec 06 05:58:46 crc kubenswrapper[4733]: > Dec 06 05:58:46 crc kubenswrapper[4733]: I1206 05:58:46.499772 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="d6d5166f-3501-4698-8da3-499dad07de3b" path="/var/lib/kubelet/pods/d6d5166f-3501-4698-8da3-499dad07de3b/volumes" Dec 06 05:58:46 crc kubenswrapper[4733]: I1206 05:58:46.622674 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a55915f4-28cf-4343-aefa-e6b145b3ccf1","Type":"ContainerStarted","Data":"fd14a47e90418bc00818b94b7a13adfcc48b33b1a30d13577ddbcdf2a8f7f1ec"} Dec 06 05:58:47 crc kubenswrapper[4733]: I1206 05:58:47.338079 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-zz9tl"] Dec 06 05:58:47 crc kubenswrapper[4733]: E1206 05:58:47.338511 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2f6ce07-95b4-4dcc-8cc1-8847d0c70f9d" containerName="mariadb-database-create" Dec 06 05:58:47 crc kubenswrapper[4733]: I1206 05:58:47.338531 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2f6ce07-95b4-4dcc-8cc1-8847d0c70f9d" containerName="mariadb-database-create" Dec 06 05:58:47 crc kubenswrapper[4733]: E1206 05:58:47.338546 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16b6231d-2829-46b8-b8cf-ce76dc4ad424" containerName="mariadb-database-create" Dec 06 05:58:47 crc kubenswrapper[4733]: I1206 05:58:47.338553 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="16b6231d-2829-46b8-b8cf-ce76dc4ad424" containerName="mariadb-database-create" Dec 06 05:58:47 crc kubenswrapper[4733]: E1206 05:58:47.338563 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0701e468-3c5a-4812-af28-36d85baf6756" containerName="mariadb-account-create-update" Dec 06 05:58:47 crc kubenswrapper[4733]: I1206 05:58:47.338571 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="0701e468-3c5a-4812-af28-36d85baf6756" containerName="mariadb-account-create-update" Dec 06 05:58:47 crc kubenswrapper[4733]: E1206 05:58:47.338580 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="715d93bf-4fc7-4bc5-adb5-4504c9c954ea" 
containerName="dnsmasq-dns" Dec 06 05:58:47 crc kubenswrapper[4733]: I1206 05:58:47.338586 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="715d93bf-4fc7-4bc5-adb5-4504c9c954ea" containerName="dnsmasq-dns" Dec 06 05:58:47 crc kubenswrapper[4733]: E1206 05:58:47.338596 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd259bcc-2452-4b18-9019-072e545719bf" containerName="mariadb-account-create-update" Dec 06 05:58:47 crc kubenswrapper[4733]: I1206 05:58:47.338602 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd259bcc-2452-4b18-9019-072e545719bf" containerName="mariadb-account-create-update" Dec 06 05:58:47 crc kubenswrapper[4733]: E1206 05:58:47.338613 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68d41a55-121b-462f-8c49-a014cbcd5cd5" containerName="mariadb-database-create" Dec 06 05:58:47 crc kubenswrapper[4733]: I1206 05:58:47.338619 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="68d41a55-121b-462f-8c49-a014cbcd5cd5" containerName="mariadb-database-create" Dec 06 05:58:47 crc kubenswrapper[4733]: E1206 05:58:47.338630 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91584c47-2491-4388-9ce4-e76d4ef92afd" containerName="mariadb-account-create-update" Dec 06 05:58:47 crc kubenswrapper[4733]: I1206 05:58:47.338635 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="91584c47-2491-4388-9ce4-e76d4ef92afd" containerName="mariadb-account-create-update" Dec 06 05:58:47 crc kubenswrapper[4733]: E1206 05:58:47.338645 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d720cd5-bb4e-449f-86a3-c9cff2acfada" containerName="swift-ring-rebalance" Dec 06 05:58:47 crc kubenswrapper[4733]: I1206 05:58:47.338651 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d720cd5-bb4e-449f-86a3-c9cff2acfada" containerName="swift-ring-rebalance" Dec 06 05:58:47 crc kubenswrapper[4733]: E1206 05:58:47.338664 4733 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="d6d5166f-3501-4698-8da3-499dad07de3b" containerName="extract-utilities" Dec 06 05:58:47 crc kubenswrapper[4733]: I1206 05:58:47.338670 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6d5166f-3501-4698-8da3-499dad07de3b" containerName="extract-utilities" Dec 06 05:58:47 crc kubenswrapper[4733]: E1206 05:58:47.338682 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6d5166f-3501-4698-8da3-499dad07de3b" containerName="extract-content" Dec 06 05:58:47 crc kubenswrapper[4733]: I1206 05:58:47.338688 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6d5166f-3501-4698-8da3-499dad07de3b" containerName="extract-content" Dec 06 05:58:47 crc kubenswrapper[4733]: E1206 05:58:47.338694 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="715d93bf-4fc7-4bc5-adb5-4504c9c954ea" containerName="init" Dec 06 05:58:47 crc kubenswrapper[4733]: I1206 05:58:47.338700 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="715d93bf-4fc7-4bc5-adb5-4504c9c954ea" containerName="init" Dec 06 05:58:47 crc kubenswrapper[4733]: E1206 05:58:47.338710 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6d5166f-3501-4698-8da3-499dad07de3b" containerName="registry-server" Dec 06 05:58:47 crc kubenswrapper[4733]: I1206 05:58:47.338715 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6d5166f-3501-4698-8da3-499dad07de3b" containerName="registry-server" Dec 06 05:58:47 crc kubenswrapper[4733]: I1206 05:58:47.338924 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="91584c47-2491-4388-9ce4-e76d4ef92afd" containerName="mariadb-account-create-update" Dec 06 05:58:47 crc kubenswrapper[4733]: I1206 05:58:47.338937 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="16b6231d-2829-46b8-b8cf-ce76dc4ad424" containerName="mariadb-database-create" Dec 06 05:58:47 crc kubenswrapper[4733]: I1206 05:58:47.338951 4733 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="715d93bf-4fc7-4bc5-adb5-4504c9c954ea" containerName="dnsmasq-dns" Dec 06 05:58:47 crc kubenswrapper[4733]: I1206 05:58:47.338978 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd259bcc-2452-4b18-9019-072e545719bf" containerName="mariadb-account-create-update" Dec 06 05:58:47 crc kubenswrapper[4733]: I1206 05:58:47.338989 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="0701e468-3c5a-4812-af28-36d85baf6756" containerName="mariadb-account-create-update" Dec 06 05:58:47 crc kubenswrapper[4733]: I1206 05:58:47.339001 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="68d41a55-121b-462f-8c49-a014cbcd5cd5" containerName="mariadb-database-create" Dec 06 05:58:47 crc kubenswrapper[4733]: I1206 05:58:47.339013 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d720cd5-bb4e-449f-86a3-c9cff2acfada" containerName="swift-ring-rebalance" Dec 06 05:58:47 crc kubenswrapper[4733]: I1206 05:58:47.339021 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2f6ce07-95b4-4dcc-8cc1-8847d0c70f9d" containerName="mariadb-database-create" Dec 06 05:58:47 crc kubenswrapper[4733]: I1206 05:58:47.339033 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6d5166f-3501-4698-8da3-499dad07de3b" containerName="registry-server" Dec 06 05:58:47 crc kubenswrapper[4733]: I1206 05:58:47.339667 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-zz9tl" Dec 06 05:58:47 crc kubenswrapper[4733]: I1206 05:58:47.342232 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-t2xs9" Dec 06 05:58:47 crc kubenswrapper[4733]: I1206 05:58:47.342975 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 06 05:58:47 crc kubenswrapper[4733]: I1206 05:58:47.348429 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-zz9tl"] Dec 06 05:58:47 crc kubenswrapper[4733]: I1206 05:58:47.422194 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8e93b6-7230-41f1-98f5-18b252d0d724-combined-ca-bundle\") pod \"glance-db-sync-zz9tl\" (UID: \"bc8e93b6-7230-41f1-98f5-18b252d0d724\") " pod="openstack/glance-db-sync-zz9tl" Dec 06 05:58:47 crc kubenswrapper[4733]: I1206 05:58:47.422265 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bc8e93b6-7230-41f1-98f5-18b252d0d724-db-sync-config-data\") pod \"glance-db-sync-zz9tl\" (UID: \"bc8e93b6-7230-41f1-98f5-18b252d0d724\") " pod="openstack/glance-db-sync-zz9tl" Dec 06 05:58:47 crc kubenswrapper[4733]: I1206 05:58:47.422297 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc8e93b6-7230-41f1-98f5-18b252d0d724-config-data\") pod \"glance-db-sync-zz9tl\" (UID: \"bc8e93b6-7230-41f1-98f5-18b252d0d724\") " pod="openstack/glance-db-sync-zz9tl" Dec 06 05:58:47 crc kubenswrapper[4733]: I1206 05:58:47.422541 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vt7t\" (UniqueName: 
\"kubernetes.io/projected/bc8e93b6-7230-41f1-98f5-18b252d0d724-kube-api-access-7vt7t\") pod \"glance-db-sync-zz9tl\" (UID: \"bc8e93b6-7230-41f1-98f5-18b252d0d724\") " pod="openstack/glance-db-sync-zz9tl" Dec 06 05:58:47 crc kubenswrapper[4733]: I1206 05:58:47.524009 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vt7t\" (UniqueName: \"kubernetes.io/projected/bc8e93b6-7230-41f1-98f5-18b252d0d724-kube-api-access-7vt7t\") pod \"glance-db-sync-zz9tl\" (UID: \"bc8e93b6-7230-41f1-98f5-18b252d0d724\") " pod="openstack/glance-db-sync-zz9tl" Dec 06 05:58:47 crc kubenswrapper[4733]: I1206 05:58:47.524553 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8e93b6-7230-41f1-98f5-18b252d0d724-combined-ca-bundle\") pod \"glance-db-sync-zz9tl\" (UID: \"bc8e93b6-7230-41f1-98f5-18b252d0d724\") " pod="openstack/glance-db-sync-zz9tl" Dec 06 05:58:47 crc kubenswrapper[4733]: I1206 05:58:47.524599 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bc8e93b6-7230-41f1-98f5-18b252d0d724-db-sync-config-data\") pod \"glance-db-sync-zz9tl\" (UID: \"bc8e93b6-7230-41f1-98f5-18b252d0d724\") " pod="openstack/glance-db-sync-zz9tl" Dec 06 05:58:47 crc kubenswrapper[4733]: I1206 05:58:47.524622 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc8e93b6-7230-41f1-98f5-18b252d0d724-config-data\") pod \"glance-db-sync-zz9tl\" (UID: \"bc8e93b6-7230-41f1-98f5-18b252d0d724\") " pod="openstack/glance-db-sync-zz9tl" Dec 06 05:58:47 crc kubenswrapper[4733]: I1206 05:58:47.531323 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bc8e93b6-7230-41f1-98f5-18b252d0d724-db-sync-config-data\") pod \"glance-db-sync-zz9tl\" 
(UID: \"bc8e93b6-7230-41f1-98f5-18b252d0d724\") " pod="openstack/glance-db-sync-zz9tl" Dec 06 05:58:47 crc kubenswrapper[4733]: I1206 05:58:47.531443 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc8e93b6-7230-41f1-98f5-18b252d0d724-config-data\") pod \"glance-db-sync-zz9tl\" (UID: \"bc8e93b6-7230-41f1-98f5-18b252d0d724\") " pod="openstack/glance-db-sync-zz9tl" Dec 06 05:58:47 crc kubenswrapper[4733]: I1206 05:58:47.531511 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8e93b6-7230-41f1-98f5-18b252d0d724-combined-ca-bundle\") pod \"glance-db-sync-zz9tl\" (UID: \"bc8e93b6-7230-41f1-98f5-18b252d0d724\") " pod="openstack/glance-db-sync-zz9tl" Dec 06 05:58:47 crc kubenswrapper[4733]: I1206 05:58:47.539479 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vt7t\" (UniqueName: \"kubernetes.io/projected/bc8e93b6-7230-41f1-98f5-18b252d0d724-kube-api-access-7vt7t\") pod \"glance-db-sync-zz9tl\" (UID: \"bc8e93b6-7230-41f1-98f5-18b252d0d724\") " pod="openstack/glance-db-sync-zz9tl" Dec 06 05:58:47 crc kubenswrapper[4733]: I1206 05:58:47.657771 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-zz9tl" Dec 06 05:58:48 crc kubenswrapper[4733]: I1206 05:58:48.159839 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-zz9tl"] Dec 06 05:58:48 crc kubenswrapper[4733]: W1206 05:58:48.177445 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc8e93b6_7230_41f1_98f5_18b252d0d724.slice/crio-7cfca0dcb17c44785b5ff7a5c510d8c81bf2ed4bd5ad2d1702dc4e5d0a69c9af WatchSource:0}: Error finding container 7cfca0dcb17c44785b5ff7a5c510d8c81bf2ed4bd5ad2d1702dc4e5d0a69c9af: Status 404 returned error can't find the container with id 7cfca0dcb17c44785b5ff7a5c510d8c81bf2ed4bd5ad2d1702dc4e5d0a69c9af Dec 06 05:58:48 crc kubenswrapper[4733]: I1206 05:58:48.641995 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-zz9tl" event={"ID":"bc8e93b6-7230-41f1-98f5-18b252d0d724","Type":"ContainerStarted","Data":"7cfca0dcb17c44785b5ff7a5c510d8c81bf2ed4bd5ad2d1702dc4e5d0a69c9af"} Dec 06 05:58:48 crc kubenswrapper[4733]: I1206 05:58:48.683319 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 06 05:58:51 crc kubenswrapper[4733]: I1206 05:58:51.697324 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a55915f4-28cf-4343-aefa-e6b145b3ccf1","Type":"ContainerStarted","Data":"24d0c991aaa74ba519e927d3912959467a80a5dbe5be35c08fe8b43056c33ca3"} Dec 06 05:58:51 crc kubenswrapper[4733]: I1206 05:58:51.697667 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a55915f4-28cf-4343-aefa-e6b145b3ccf1","Type":"ContainerStarted","Data":"a19bb1c294925c0dcba79c6957c69385fb909fece9dde29cbd26d5d06f1dc35c"} Dec 06 05:58:51 crc kubenswrapper[4733]: I1206 05:58:51.697681 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"a55915f4-28cf-4343-aefa-e6b145b3ccf1","Type":"ContainerStarted","Data":"8dbff021a613ab5a58be6f13421c720535c4ad7d01a020ff48a683f096c12b4a"} Dec 06 05:58:51 crc kubenswrapper[4733]: I1206 05:58:51.697691 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a55915f4-28cf-4343-aefa-e6b145b3ccf1","Type":"ContainerStarted","Data":"08bf476d80591ced133a5cafcf89f8e9a04ce9f8317209dac0e47b1522354bf0"} Dec 06 05:58:52 crc kubenswrapper[4733]: I1206 05:58:52.707998 4733 generic.go:334] "Generic (PLEG): container finished" podID="0d8769e1-2981-471a-bef8-ac4d193563cc" containerID="3414692aed66b4eeb2d86e147784525b258dc75c775f3e1178bcdad27a734b53" exitCode=0 Dec 06 05:58:52 crc kubenswrapper[4733]: I1206 05:58:52.708087 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0d8769e1-2981-471a-bef8-ac4d193563cc","Type":"ContainerDied","Data":"3414692aed66b4eeb2d86e147784525b258dc75c775f3e1178bcdad27a734b53"} Dec 06 05:58:52 crc kubenswrapper[4733]: I1206 05:58:52.711043 4733 generic.go:334] "Generic (PLEG): container finished" podID="ba773bb2-77c5-4562-b8ba-53428904d503" containerID="28d656946022d0e45d8ae7cd9d5210bbeea6770c3efa37b73f252df6528fed96" exitCode=0 Dec 06 05:58:52 crc kubenswrapper[4733]: I1206 05:58:52.711075 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ba773bb2-77c5-4562-b8ba-53428904d503","Type":"ContainerDied","Data":"28d656946022d0e45d8ae7cd9d5210bbeea6770c3efa37b73f252df6528fed96"} Dec 06 05:58:53 crc kubenswrapper[4733]: I1206 05:58:53.721846 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ba773bb2-77c5-4562-b8ba-53428904d503","Type":"ContainerStarted","Data":"ac133d55b98ee632f71e0c95233b98dd1f467f581b77c094eb880da4e03f23bd"} Dec 06 05:58:53 crc kubenswrapper[4733]: I1206 05:58:53.722488 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/rabbitmq-server-0" Dec 06 05:58:53 crc kubenswrapper[4733]: I1206 05:58:53.739865 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a55915f4-28cf-4343-aefa-e6b145b3ccf1","Type":"ContainerStarted","Data":"354fc5e7a9daaa208dbd7ac96f2c1d40670d61e43f98eb11cc73b6c5b9db8871"} Dec 06 05:58:53 crc kubenswrapper[4733]: I1206 05:58:53.739912 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a55915f4-28cf-4343-aefa-e6b145b3ccf1","Type":"ContainerStarted","Data":"89e74fce68553a2b2e09bf30332d3e04a7d146e1bcf1f441ed5b7db2ca9e3af2"} Dec 06 05:58:53 crc kubenswrapper[4733]: I1206 05:58:53.739923 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a55915f4-28cf-4343-aefa-e6b145b3ccf1","Type":"ContainerStarted","Data":"f0092b2acf3753081201c8184ff212929fc94d92f708e4d16b162ec4e6910a55"} Dec 06 05:58:53 crc kubenswrapper[4733]: I1206 05:58:53.739933 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a55915f4-28cf-4343-aefa-e6b145b3ccf1","Type":"ContainerStarted","Data":"74703fbf2776c515f9542355451c331fb95090173291fa48cecb128a801d9eb8"} Dec 06 05:58:53 crc kubenswrapper[4733]: I1206 05:58:53.742457 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0d8769e1-2981-471a-bef8-ac4d193563cc","Type":"ContainerStarted","Data":"65f01921231ba7be95baeb61675f464398c91e40ec04d220476a54cbb32b8a55"} Dec 06 05:58:53 crc kubenswrapper[4733]: I1206 05:58:53.742694 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 06 05:58:53 crc kubenswrapper[4733]: I1206 05:58:53.761141 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.508157918 podStartE2EDuration="1m1.761130854s" 
podCreationTimestamp="2025-12-06 05:57:52 +0000 UTC" firstStartedPulling="2025-12-06 05:57:53.844375286 +0000 UTC m=+857.709586397" lastFinishedPulling="2025-12-06 05:58:19.097348221 +0000 UTC m=+882.962559333" observedRunningTime="2025-12-06 05:58:53.756181658 +0000 UTC m=+917.621392770" watchObservedRunningTime="2025-12-06 05:58:53.761130854 +0000 UTC m=+917.626341966" Dec 06 05:58:53 crc kubenswrapper[4733]: I1206 05:58:53.779520 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.067316975 podStartE2EDuration="1m1.779510776s" podCreationTimestamp="2025-12-06 05:57:52 +0000 UTC" firstStartedPulling="2025-12-06 05:57:54.417366955 +0000 UTC m=+858.282578065" lastFinishedPulling="2025-12-06 05:58:19.129560755 +0000 UTC m=+882.994771866" observedRunningTime="2025-12-06 05:58:53.773957944 +0000 UTC m=+917.639169056" watchObservedRunningTime="2025-12-06 05:58:53.779510776 +0000 UTC m=+917.644721887" Dec 06 05:58:55 crc kubenswrapper[4733]: I1206 05:58:55.407940 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nrhdv" Dec 06 05:58:55 crc kubenswrapper[4733]: I1206 05:58:55.447509 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nrhdv" Dec 06 05:58:55 crc kubenswrapper[4733]: I1206 05:58:55.580950 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nrhdv"] Dec 06 05:58:55 crc kubenswrapper[4733]: I1206 05:58:55.660173 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6czlb"] Dec 06 05:58:55 crc kubenswrapper[4733]: I1206 05:58:55.664772 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6czlb" podUID="a462e608-8ce9-449b-897d-c5fd47649a86" containerName="registry-server" 
containerID="cri-o://8e0e2cc0da107b1933851219df4412bddc50b4e5e64e5f668512181c5b619a56" gracePeriod=2 Dec 06 05:58:55 crc kubenswrapper[4733]: I1206 05:58:55.790462 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a55915f4-28cf-4343-aefa-e6b145b3ccf1","Type":"ContainerStarted","Data":"8c25fb29244bbbd25759eefd34f30b0bf9b8fe4b4533f8adb04f5996602852bc"} Dec 06 05:58:55 crc kubenswrapper[4733]: I1206 05:58:55.790497 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a55915f4-28cf-4343-aefa-e6b145b3ccf1","Type":"ContainerStarted","Data":"1e33d186a53a14097f3000f7b3caf7c81e53403c7740d42c72b2dd432df1129a"} Dec 06 05:58:55 crc kubenswrapper[4733]: I1206 05:58:55.790506 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a55915f4-28cf-4343-aefa-e6b145b3ccf1","Type":"ContainerStarted","Data":"44697d2444959d8de8086abbef1bf8efd6f5e7879d108f9111c67149880336c3"} Dec 06 05:58:55 crc kubenswrapper[4733]: I1206 05:58:55.790514 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a55915f4-28cf-4343-aefa-e6b145b3ccf1","Type":"ContainerStarted","Data":"b39f4a235e3a4a62d3d3a78f264174643d97e897690abcf56f3f37aa382b47be"} Dec 06 05:58:55 crc kubenswrapper[4733]: I1206 05:58:55.790522 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a55915f4-28cf-4343-aefa-e6b145b3ccf1","Type":"ContainerStarted","Data":"db289920e1feb6532d973f3fdfb946880367210aecc50510a3e027d1a010f2b8"} Dec 06 05:58:56 crc kubenswrapper[4733]: I1206 05:58:56.184559 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6czlb" Dec 06 05:58:56 crc kubenswrapper[4733]: I1206 05:58:56.287663 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a462e608-8ce9-449b-897d-c5fd47649a86-utilities\") pod \"a462e608-8ce9-449b-897d-c5fd47649a86\" (UID: \"a462e608-8ce9-449b-897d-c5fd47649a86\") " Dec 06 05:58:56 crc kubenswrapper[4733]: I1206 05:58:56.287797 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a462e608-8ce9-449b-897d-c5fd47649a86-catalog-content\") pod \"a462e608-8ce9-449b-897d-c5fd47649a86\" (UID: \"a462e608-8ce9-449b-897d-c5fd47649a86\") " Dec 06 05:58:56 crc kubenswrapper[4733]: I1206 05:58:56.287995 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fp4zx\" (UniqueName: \"kubernetes.io/projected/a462e608-8ce9-449b-897d-c5fd47649a86-kube-api-access-fp4zx\") pod \"a462e608-8ce9-449b-897d-c5fd47649a86\" (UID: \"a462e608-8ce9-449b-897d-c5fd47649a86\") " Dec 06 05:58:56 crc kubenswrapper[4733]: I1206 05:58:56.288456 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a462e608-8ce9-449b-897d-c5fd47649a86-utilities" (OuterVolumeSpecName: "utilities") pod "a462e608-8ce9-449b-897d-c5fd47649a86" (UID: "a462e608-8ce9-449b-897d-c5fd47649a86"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 05:58:56 crc kubenswrapper[4733]: I1206 05:58:56.296821 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a462e608-8ce9-449b-897d-c5fd47649a86-kube-api-access-fp4zx" (OuterVolumeSpecName: "kube-api-access-fp4zx") pod "a462e608-8ce9-449b-897d-c5fd47649a86" (UID: "a462e608-8ce9-449b-897d-c5fd47649a86"). InnerVolumeSpecName "kube-api-access-fp4zx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:58:56 crc kubenswrapper[4733]: I1206 05:58:56.334480 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a462e608-8ce9-449b-897d-c5fd47649a86-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a462e608-8ce9-449b-897d-c5fd47649a86" (UID: "a462e608-8ce9-449b-897d-c5fd47649a86"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 05:58:56 crc kubenswrapper[4733]: I1206 05:58:56.390458 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fp4zx\" (UniqueName: \"kubernetes.io/projected/a462e608-8ce9-449b-897d-c5fd47649a86-kube-api-access-fp4zx\") on node \"crc\" DevicePath \"\"" Dec 06 05:58:56 crc kubenswrapper[4733]: I1206 05:58:56.390490 4733 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a462e608-8ce9-449b-897d-c5fd47649a86-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 05:58:56 crc kubenswrapper[4733]: I1206 05:58:56.390499 4733 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a462e608-8ce9-449b-897d-c5fd47649a86-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 05:58:56 crc kubenswrapper[4733]: I1206 05:58:56.799546 4733 generic.go:334] "Generic (PLEG): container finished" podID="a462e608-8ce9-449b-897d-c5fd47649a86" containerID="8e0e2cc0da107b1933851219df4412bddc50b4e5e64e5f668512181c5b619a56" exitCode=0 Dec 06 05:58:56 crc kubenswrapper[4733]: I1206 05:58:56.799626 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6czlb" event={"ID":"a462e608-8ce9-449b-897d-c5fd47649a86","Type":"ContainerDied","Data":"8e0e2cc0da107b1933851219df4412bddc50b4e5e64e5f668512181c5b619a56"} Dec 06 05:58:56 crc kubenswrapper[4733]: I1206 05:58:56.799655 4733 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-6czlb" event={"ID":"a462e608-8ce9-449b-897d-c5fd47649a86","Type":"ContainerDied","Data":"3bd098c8d7b254c6091d19466b58e2089913be7eec19436b186b52421164c0b5"} Dec 06 05:58:56 crc kubenswrapper[4733]: I1206 05:58:56.799671 4733 scope.go:117] "RemoveContainer" containerID="8e0e2cc0da107b1933851219df4412bddc50b4e5e64e5f668512181c5b619a56" Dec 06 05:58:56 crc kubenswrapper[4733]: I1206 05:58:56.799792 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6czlb" Dec 06 05:58:56 crc kubenswrapper[4733]: I1206 05:58:56.809614 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a55915f4-28cf-4343-aefa-e6b145b3ccf1","Type":"ContainerStarted","Data":"40af1ca15bd852ff613d508590c69afaf4ac69ece81e838b5f2d4357a736d82b"} Dec 06 05:58:56 crc kubenswrapper[4733]: I1206 05:58:56.809641 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a55915f4-28cf-4343-aefa-e6b145b3ccf1","Type":"ContainerStarted","Data":"8aab98501a20c4588fa01f043db07791c9a83db902efde309848727ad05a91bb"} Dec 06 05:58:56 crc kubenswrapper[4733]: I1206 05:58:56.823060 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6czlb"] Dec 06 05:58:56 crc kubenswrapper[4733]: I1206 05:58:56.824962 4733 scope.go:117] "RemoveContainer" containerID="ead42e030a979177c423b11cd4cd50e23b704e284d7021309cb491f53d91de5f" Dec 06 05:58:56 crc kubenswrapper[4733]: I1206 05:58:56.831864 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6czlb"] Dec 06 05:58:56 crc kubenswrapper[4733]: I1206 05:58:56.841485 4733 scope.go:117] "RemoveContainer" containerID="02d92cf6735c21e661e4bbf7c1630e92cfc904cc2aec2af7531179a6a717d3c0" Dec 06 05:58:56 crc kubenswrapper[4733]: I1206 05:58:56.864537 4733 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=20.371120953 podStartE2EDuration="28.864517609s" podCreationTimestamp="2025-12-06 05:58:28 +0000 UTC" firstStartedPulling="2025-12-06 05:58:46.20704644 +0000 UTC m=+910.072257551" lastFinishedPulling="2025-12-06 05:58:54.700443086 +0000 UTC m=+918.565654207" observedRunningTime="2025-12-06 05:58:56.853894844 +0000 UTC m=+920.719105955" watchObservedRunningTime="2025-12-06 05:58:56.864517609 +0000 UTC m=+920.729728720" Dec 06 05:58:56 crc kubenswrapper[4733]: I1206 05:58:56.886486 4733 scope.go:117] "RemoveContainer" containerID="8e0e2cc0da107b1933851219df4412bddc50b4e5e64e5f668512181c5b619a56" Dec 06 05:58:56 crc kubenswrapper[4733]: E1206 05:58:56.886869 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e0e2cc0da107b1933851219df4412bddc50b4e5e64e5f668512181c5b619a56\": container with ID starting with 8e0e2cc0da107b1933851219df4412bddc50b4e5e64e5f668512181c5b619a56 not found: ID does not exist" containerID="8e0e2cc0da107b1933851219df4412bddc50b4e5e64e5f668512181c5b619a56" Dec 06 05:58:56 crc kubenswrapper[4733]: I1206 05:58:56.886898 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e0e2cc0da107b1933851219df4412bddc50b4e5e64e5f668512181c5b619a56"} err="failed to get container status \"8e0e2cc0da107b1933851219df4412bddc50b4e5e64e5f668512181c5b619a56\": rpc error: code = NotFound desc = could not find container \"8e0e2cc0da107b1933851219df4412bddc50b4e5e64e5f668512181c5b619a56\": container with ID starting with 8e0e2cc0da107b1933851219df4412bddc50b4e5e64e5f668512181c5b619a56 not found: ID does not exist" Dec 06 05:58:56 crc kubenswrapper[4733]: I1206 05:58:56.886917 4733 scope.go:117] "RemoveContainer" containerID="ead42e030a979177c423b11cd4cd50e23b704e284d7021309cb491f53d91de5f" Dec 06 05:58:56 crc kubenswrapper[4733]: 
E1206 05:58:56.887250 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ead42e030a979177c423b11cd4cd50e23b704e284d7021309cb491f53d91de5f\": container with ID starting with ead42e030a979177c423b11cd4cd50e23b704e284d7021309cb491f53d91de5f not found: ID does not exist" containerID="ead42e030a979177c423b11cd4cd50e23b704e284d7021309cb491f53d91de5f" Dec 06 05:58:56 crc kubenswrapper[4733]: I1206 05:58:56.887316 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ead42e030a979177c423b11cd4cd50e23b704e284d7021309cb491f53d91de5f"} err="failed to get container status \"ead42e030a979177c423b11cd4cd50e23b704e284d7021309cb491f53d91de5f\": rpc error: code = NotFound desc = could not find container \"ead42e030a979177c423b11cd4cd50e23b704e284d7021309cb491f53d91de5f\": container with ID starting with ead42e030a979177c423b11cd4cd50e23b704e284d7021309cb491f53d91de5f not found: ID does not exist" Dec 06 05:58:56 crc kubenswrapper[4733]: I1206 05:58:56.887342 4733 scope.go:117] "RemoveContainer" containerID="02d92cf6735c21e661e4bbf7c1630e92cfc904cc2aec2af7531179a6a717d3c0" Dec 06 05:58:56 crc kubenswrapper[4733]: E1206 05:58:56.887746 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02d92cf6735c21e661e4bbf7c1630e92cfc904cc2aec2af7531179a6a717d3c0\": container with ID starting with 02d92cf6735c21e661e4bbf7c1630e92cfc904cc2aec2af7531179a6a717d3c0 not found: ID does not exist" containerID="02d92cf6735c21e661e4bbf7c1630e92cfc904cc2aec2af7531179a6a717d3c0" Dec 06 05:58:56 crc kubenswrapper[4733]: I1206 05:58:56.887786 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02d92cf6735c21e661e4bbf7c1630e92cfc904cc2aec2af7531179a6a717d3c0"} err="failed to get container status \"02d92cf6735c21e661e4bbf7c1630e92cfc904cc2aec2af7531179a6a717d3c0\": 
rpc error: code = NotFound desc = could not find container \"02d92cf6735c21e661e4bbf7c1630e92cfc904cc2aec2af7531179a6a717d3c0\": container with ID starting with 02d92cf6735c21e661e4bbf7c1630e92cfc904cc2aec2af7531179a6a717d3c0 not found: ID does not exist" Dec 06 05:58:57 crc kubenswrapper[4733]: I1206 05:58:57.150259 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8c486c6ff-fq7t7"] Dec 06 05:58:57 crc kubenswrapper[4733]: E1206 05:58:57.150825 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a462e608-8ce9-449b-897d-c5fd47649a86" containerName="registry-server" Dec 06 05:58:57 crc kubenswrapper[4733]: I1206 05:58:57.150845 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="a462e608-8ce9-449b-897d-c5fd47649a86" containerName="registry-server" Dec 06 05:58:57 crc kubenswrapper[4733]: E1206 05:58:57.150860 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a462e608-8ce9-449b-897d-c5fd47649a86" containerName="extract-content" Dec 06 05:58:57 crc kubenswrapper[4733]: I1206 05:58:57.150867 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="a462e608-8ce9-449b-897d-c5fd47649a86" containerName="extract-content" Dec 06 05:58:57 crc kubenswrapper[4733]: E1206 05:58:57.150884 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a462e608-8ce9-449b-897d-c5fd47649a86" containerName="extract-utilities" Dec 06 05:58:57 crc kubenswrapper[4733]: I1206 05:58:57.150890 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="a462e608-8ce9-449b-897d-c5fd47649a86" containerName="extract-utilities" Dec 06 05:58:57 crc kubenswrapper[4733]: I1206 05:58:57.151041 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="a462e608-8ce9-449b-897d-c5fd47649a86" containerName="registry-server" Dec 06 05:58:57 crc kubenswrapper[4733]: I1206 05:58:57.151871 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8c486c6ff-fq7t7" Dec 06 05:58:57 crc kubenswrapper[4733]: I1206 05:58:57.155603 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Dec 06 05:58:57 crc kubenswrapper[4733]: I1206 05:58:57.233711 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8c486c6ff-fq7t7"] Dec 06 05:58:57 crc kubenswrapper[4733]: I1206 05:58:57.241365 4733 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-2ztw7" podUID="5589595d-741e-424a-955a-6fc8b83c18c1" containerName="ovn-controller" probeResult="failure" output=< Dec 06 05:58:57 crc kubenswrapper[4733]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 06 05:58:57 crc kubenswrapper[4733]: > Dec 06 05:58:57 crc kubenswrapper[4733]: I1206 05:58:57.277531 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-4wzzg" Dec 06 05:58:57 crc kubenswrapper[4733]: I1206 05:58:57.288324 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-4wzzg" Dec 06 05:58:57 crc kubenswrapper[4733]: I1206 05:58:57.308249 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2fff7d32-f796-45d1-a13b-1b2286e593c3-dns-swift-storage-0\") pod \"dnsmasq-dns-8c486c6ff-fq7t7\" (UID: \"2fff7d32-f796-45d1-a13b-1b2286e593c3\") " pod="openstack/dnsmasq-dns-8c486c6ff-fq7t7" Dec 06 05:58:57 crc kubenswrapper[4733]: I1206 05:58:57.308365 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fff7d32-f796-45d1-a13b-1b2286e593c3-config\") pod \"dnsmasq-dns-8c486c6ff-fq7t7\" (UID: \"2fff7d32-f796-45d1-a13b-1b2286e593c3\") " pod="openstack/dnsmasq-dns-8c486c6ff-fq7t7" 
Dec 06 05:58:57 crc kubenswrapper[4733]: I1206 05:58:57.308482 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2fff7d32-f796-45d1-a13b-1b2286e593c3-ovsdbserver-sb\") pod \"dnsmasq-dns-8c486c6ff-fq7t7\" (UID: \"2fff7d32-f796-45d1-a13b-1b2286e593c3\") " pod="openstack/dnsmasq-dns-8c486c6ff-fq7t7" Dec 06 05:58:57 crc kubenswrapper[4733]: I1206 05:58:57.308599 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqgd7\" (UniqueName: \"kubernetes.io/projected/2fff7d32-f796-45d1-a13b-1b2286e593c3-kube-api-access-wqgd7\") pod \"dnsmasq-dns-8c486c6ff-fq7t7\" (UID: \"2fff7d32-f796-45d1-a13b-1b2286e593c3\") " pod="openstack/dnsmasq-dns-8c486c6ff-fq7t7" Dec 06 05:58:57 crc kubenswrapper[4733]: I1206 05:58:57.308616 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2fff7d32-f796-45d1-a13b-1b2286e593c3-dns-svc\") pod \"dnsmasq-dns-8c486c6ff-fq7t7\" (UID: \"2fff7d32-f796-45d1-a13b-1b2286e593c3\") " pod="openstack/dnsmasq-dns-8c486c6ff-fq7t7" Dec 06 05:58:57 crc kubenswrapper[4733]: I1206 05:58:57.308652 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2fff7d32-f796-45d1-a13b-1b2286e593c3-ovsdbserver-nb\") pod \"dnsmasq-dns-8c486c6ff-fq7t7\" (UID: \"2fff7d32-f796-45d1-a13b-1b2286e593c3\") " pod="openstack/dnsmasq-dns-8c486c6ff-fq7t7" Dec 06 05:58:57 crc kubenswrapper[4733]: I1206 05:58:57.410131 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2fff7d32-f796-45d1-a13b-1b2286e593c3-ovsdbserver-sb\") pod \"dnsmasq-dns-8c486c6ff-fq7t7\" (UID: \"2fff7d32-f796-45d1-a13b-1b2286e593c3\") " 
pod="openstack/dnsmasq-dns-8c486c6ff-fq7t7" Dec 06 05:58:57 crc kubenswrapper[4733]: I1206 05:58:57.410236 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqgd7\" (UniqueName: \"kubernetes.io/projected/2fff7d32-f796-45d1-a13b-1b2286e593c3-kube-api-access-wqgd7\") pod \"dnsmasq-dns-8c486c6ff-fq7t7\" (UID: \"2fff7d32-f796-45d1-a13b-1b2286e593c3\") " pod="openstack/dnsmasq-dns-8c486c6ff-fq7t7" Dec 06 05:58:57 crc kubenswrapper[4733]: I1206 05:58:57.410256 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2fff7d32-f796-45d1-a13b-1b2286e593c3-dns-svc\") pod \"dnsmasq-dns-8c486c6ff-fq7t7\" (UID: \"2fff7d32-f796-45d1-a13b-1b2286e593c3\") " pod="openstack/dnsmasq-dns-8c486c6ff-fq7t7" Dec 06 05:58:57 crc kubenswrapper[4733]: I1206 05:58:57.410296 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2fff7d32-f796-45d1-a13b-1b2286e593c3-ovsdbserver-nb\") pod \"dnsmasq-dns-8c486c6ff-fq7t7\" (UID: \"2fff7d32-f796-45d1-a13b-1b2286e593c3\") " pod="openstack/dnsmasq-dns-8c486c6ff-fq7t7" Dec 06 05:58:57 crc kubenswrapper[4733]: I1206 05:58:57.410353 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2fff7d32-f796-45d1-a13b-1b2286e593c3-dns-swift-storage-0\") pod \"dnsmasq-dns-8c486c6ff-fq7t7\" (UID: \"2fff7d32-f796-45d1-a13b-1b2286e593c3\") " pod="openstack/dnsmasq-dns-8c486c6ff-fq7t7" Dec 06 05:58:57 crc kubenswrapper[4733]: I1206 05:58:57.410384 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fff7d32-f796-45d1-a13b-1b2286e593c3-config\") pod \"dnsmasq-dns-8c486c6ff-fq7t7\" (UID: \"2fff7d32-f796-45d1-a13b-1b2286e593c3\") " pod="openstack/dnsmasq-dns-8c486c6ff-fq7t7" Dec 06 05:58:57 crc 
kubenswrapper[4733]: I1206 05:58:57.411286 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fff7d32-f796-45d1-a13b-1b2286e593c3-config\") pod \"dnsmasq-dns-8c486c6ff-fq7t7\" (UID: \"2fff7d32-f796-45d1-a13b-1b2286e593c3\") " pod="openstack/dnsmasq-dns-8c486c6ff-fq7t7" Dec 06 05:58:57 crc kubenswrapper[4733]: I1206 05:58:57.411328 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2fff7d32-f796-45d1-a13b-1b2286e593c3-ovsdbserver-nb\") pod \"dnsmasq-dns-8c486c6ff-fq7t7\" (UID: \"2fff7d32-f796-45d1-a13b-1b2286e593c3\") " pod="openstack/dnsmasq-dns-8c486c6ff-fq7t7" Dec 06 05:58:57 crc kubenswrapper[4733]: I1206 05:58:57.411353 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2fff7d32-f796-45d1-a13b-1b2286e593c3-dns-swift-storage-0\") pod \"dnsmasq-dns-8c486c6ff-fq7t7\" (UID: \"2fff7d32-f796-45d1-a13b-1b2286e593c3\") " pod="openstack/dnsmasq-dns-8c486c6ff-fq7t7" Dec 06 05:58:57 crc kubenswrapper[4733]: I1206 05:58:57.411652 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2fff7d32-f796-45d1-a13b-1b2286e593c3-dns-svc\") pod \"dnsmasq-dns-8c486c6ff-fq7t7\" (UID: \"2fff7d32-f796-45d1-a13b-1b2286e593c3\") " pod="openstack/dnsmasq-dns-8c486c6ff-fq7t7" Dec 06 05:58:57 crc kubenswrapper[4733]: I1206 05:58:57.411760 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2fff7d32-f796-45d1-a13b-1b2286e593c3-ovsdbserver-sb\") pod \"dnsmasq-dns-8c486c6ff-fq7t7\" (UID: \"2fff7d32-f796-45d1-a13b-1b2286e593c3\") " pod="openstack/dnsmasq-dns-8c486c6ff-fq7t7" Dec 06 05:58:57 crc kubenswrapper[4733]: I1206 05:58:57.428959 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-wqgd7\" (UniqueName: \"kubernetes.io/projected/2fff7d32-f796-45d1-a13b-1b2286e593c3-kube-api-access-wqgd7\") pod \"dnsmasq-dns-8c486c6ff-fq7t7\" (UID: \"2fff7d32-f796-45d1-a13b-1b2286e593c3\") " pod="openstack/dnsmasq-dns-8c486c6ff-fq7t7" Dec 06 05:58:57 crc kubenswrapper[4733]: I1206 05:58:57.476368 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8c486c6ff-fq7t7" Dec 06 05:58:57 crc kubenswrapper[4733]: I1206 05:58:57.500503 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-2ztw7-config-f97vt"] Dec 06 05:58:57 crc kubenswrapper[4733]: I1206 05:58:57.501775 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-2ztw7-config-f97vt" Dec 06 05:58:57 crc kubenswrapper[4733]: I1206 05:58:57.507990 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 06 05:58:57 crc kubenswrapper[4733]: I1206 05:58:57.526436 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-2ztw7-config-f97vt"] Dec 06 05:58:57 crc kubenswrapper[4733]: I1206 05:58:57.624571 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b4f98249-4c60-4cce-bfbd-e517303feee5-additional-scripts\") pod \"ovn-controller-2ztw7-config-f97vt\" (UID: \"b4f98249-4c60-4cce-bfbd-e517303feee5\") " pod="openstack/ovn-controller-2ztw7-config-f97vt" Dec 06 05:58:57 crc kubenswrapper[4733]: I1206 05:58:57.624635 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2m24d\" (UniqueName: \"kubernetes.io/projected/b4f98249-4c60-4cce-bfbd-e517303feee5-kube-api-access-2m24d\") pod \"ovn-controller-2ztw7-config-f97vt\" (UID: \"b4f98249-4c60-4cce-bfbd-e517303feee5\") " pod="openstack/ovn-controller-2ztw7-config-f97vt" 
Dec 06 05:58:57 crc kubenswrapper[4733]: I1206 05:58:57.624746 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b4f98249-4c60-4cce-bfbd-e517303feee5-scripts\") pod \"ovn-controller-2ztw7-config-f97vt\" (UID: \"b4f98249-4c60-4cce-bfbd-e517303feee5\") " pod="openstack/ovn-controller-2ztw7-config-f97vt" Dec 06 05:58:57 crc kubenswrapper[4733]: I1206 05:58:57.624771 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b4f98249-4c60-4cce-bfbd-e517303feee5-var-log-ovn\") pod \"ovn-controller-2ztw7-config-f97vt\" (UID: \"b4f98249-4c60-4cce-bfbd-e517303feee5\") " pod="openstack/ovn-controller-2ztw7-config-f97vt" Dec 06 05:58:57 crc kubenswrapper[4733]: I1206 05:58:57.624801 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b4f98249-4c60-4cce-bfbd-e517303feee5-var-run-ovn\") pod \"ovn-controller-2ztw7-config-f97vt\" (UID: \"b4f98249-4c60-4cce-bfbd-e517303feee5\") " pod="openstack/ovn-controller-2ztw7-config-f97vt" Dec 06 05:58:57 crc kubenswrapper[4733]: I1206 05:58:57.624831 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b4f98249-4c60-4cce-bfbd-e517303feee5-var-run\") pod \"ovn-controller-2ztw7-config-f97vt\" (UID: \"b4f98249-4c60-4cce-bfbd-e517303feee5\") " pod="openstack/ovn-controller-2ztw7-config-f97vt" Dec 06 05:58:57 crc kubenswrapper[4733]: I1206 05:58:57.729641 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b4f98249-4c60-4cce-bfbd-e517303feee5-scripts\") pod \"ovn-controller-2ztw7-config-f97vt\" (UID: \"b4f98249-4c60-4cce-bfbd-e517303feee5\") " 
pod="openstack/ovn-controller-2ztw7-config-f97vt" Dec 06 05:58:57 crc kubenswrapper[4733]: I1206 05:58:57.729693 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b4f98249-4c60-4cce-bfbd-e517303feee5-var-log-ovn\") pod \"ovn-controller-2ztw7-config-f97vt\" (UID: \"b4f98249-4c60-4cce-bfbd-e517303feee5\") " pod="openstack/ovn-controller-2ztw7-config-f97vt" Dec 06 05:58:57 crc kubenswrapper[4733]: I1206 05:58:57.729729 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b4f98249-4c60-4cce-bfbd-e517303feee5-var-run-ovn\") pod \"ovn-controller-2ztw7-config-f97vt\" (UID: \"b4f98249-4c60-4cce-bfbd-e517303feee5\") " pod="openstack/ovn-controller-2ztw7-config-f97vt" Dec 06 05:58:57 crc kubenswrapper[4733]: I1206 05:58:57.729754 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b4f98249-4c60-4cce-bfbd-e517303feee5-var-run\") pod \"ovn-controller-2ztw7-config-f97vt\" (UID: \"b4f98249-4c60-4cce-bfbd-e517303feee5\") " pod="openstack/ovn-controller-2ztw7-config-f97vt" Dec 06 05:58:57 crc kubenswrapper[4733]: I1206 05:58:57.729802 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b4f98249-4c60-4cce-bfbd-e517303feee5-additional-scripts\") pod \"ovn-controller-2ztw7-config-f97vt\" (UID: \"b4f98249-4c60-4cce-bfbd-e517303feee5\") " pod="openstack/ovn-controller-2ztw7-config-f97vt" Dec 06 05:58:57 crc kubenswrapper[4733]: I1206 05:58:57.729831 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2m24d\" (UniqueName: \"kubernetes.io/projected/b4f98249-4c60-4cce-bfbd-e517303feee5-kube-api-access-2m24d\") pod \"ovn-controller-2ztw7-config-f97vt\" (UID: \"b4f98249-4c60-4cce-bfbd-e517303feee5\") " 
pod="openstack/ovn-controller-2ztw7-config-f97vt" Dec 06 05:58:57 crc kubenswrapper[4733]: I1206 05:58:57.730626 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b4f98249-4c60-4cce-bfbd-e517303feee5-var-run-ovn\") pod \"ovn-controller-2ztw7-config-f97vt\" (UID: \"b4f98249-4c60-4cce-bfbd-e517303feee5\") " pod="openstack/ovn-controller-2ztw7-config-f97vt" Dec 06 05:58:57 crc kubenswrapper[4733]: I1206 05:58:57.730650 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b4f98249-4c60-4cce-bfbd-e517303feee5-var-run\") pod \"ovn-controller-2ztw7-config-f97vt\" (UID: \"b4f98249-4c60-4cce-bfbd-e517303feee5\") " pod="openstack/ovn-controller-2ztw7-config-f97vt" Dec 06 05:58:57 crc kubenswrapper[4733]: I1206 05:58:57.730707 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b4f98249-4c60-4cce-bfbd-e517303feee5-var-log-ovn\") pod \"ovn-controller-2ztw7-config-f97vt\" (UID: \"b4f98249-4c60-4cce-bfbd-e517303feee5\") " pod="openstack/ovn-controller-2ztw7-config-f97vt" Dec 06 05:58:57 crc kubenswrapper[4733]: I1206 05:58:57.731258 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b4f98249-4c60-4cce-bfbd-e517303feee5-additional-scripts\") pod \"ovn-controller-2ztw7-config-f97vt\" (UID: \"b4f98249-4c60-4cce-bfbd-e517303feee5\") " pod="openstack/ovn-controller-2ztw7-config-f97vt" Dec 06 05:58:57 crc kubenswrapper[4733]: I1206 05:58:57.732430 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b4f98249-4c60-4cce-bfbd-e517303feee5-scripts\") pod \"ovn-controller-2ztw7-config-f97vt\" (UID: \"b4f98249-4c60-4cce-bfbd-e517303feee5\") " pod="openstack/ovn-controller-2ztw7-config-f97vt" Dec 06 05:58:57 crc 
kubenswrapper[4733]: I1206 05:58:57.747374 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2m24d\" (UniqueName: \"kubernetes.io/projected/b4f98249-4c60-4cce-bfbd-e517303feee5-kube-api-access-2m24d\") pod \"ovn-controller-2ztw7-config-f97vt\" (UID: \"b4f98249-4c60-4cce-bfbd-e517303feee5\") " pod="openstack/ovn-controller-2ztw7-config-f97vt" Dec 06 05:58:57 crc kubenswrapper[4733]: I1206 05:58:57.859152 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-2ztw7-config-f97vt" Dec 06 05:58:57 crc kubenswrapper[4733]: I1206 05:58:57.965962 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8c486c6ff-fq7t7"] Dec 06 05:58:58 crc kubenswrapper[4733]: W1206 05:58:58.016540 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fff7d32_f796_45d1_a13b_1b2286e593c3.slice/crio-a96735a2a975970b6e1a1c1e61e0305541aaa1419d360ba5031f88c5e2d16585 WatchSource:0}: Error finding container a96735a2a975970b6e1a1c1e61e0305541aaa1419d360ba5031f88c5e2d16585: Status 404 returned error can't find the container with id a96735a2a975970b6e1a1c1e61e0305541aaa1419d360ba5031f88c5e2d16585 Dec 06 05:58:58 crc kubenswrapper[4733]: I1206 05:58:58.281532 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-2ztw7-config-f97vt"] Dec 06 05:58:58 crc kubenswrapper[4733]: W1206 05:58:58.291469 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4f98249_4c60_4cce_bfbd_e517303feee5.slice/crio-21d28924dc8672f22b88bfbb2b25218ba850e51a7bdbe06b746b436b1519a00d WatchSource:0}: Error finding container 21d28924dc8672f22b88bfbb2b25218ba850e51a7bdbe06b746b436b1519a00d: Status 404 returned error can't find the container with id 21d28924dc8672f22b88bfbb2b25218ba850e51a7bdbe06b746b436b1519a00d Dec 06 
05:58:58 crc kubenswrapper[4733]: I1206 05:58:58.528150 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a462e608-8ce9-449b-897d-c5fd47649a86" path="/var/lib/kubelet/pods/a462e608-8ce9-449b-897d-c5fd47649a86/volumes" Dec 06 05:58:58 crc kubenswrapper[4733]: I1206 05:58:58.834022 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-2ztw7-config-f97vt" event={"ID":"b4f98249-4c60-4cce-bfbd-e517303feee5","Type":"ContainerStarted","Data":"302ef6fc16dcc0333d1e457e53a8362b71103d268db7749bef2db7459679525d"} Dec 06 05:58:58 crc kubenswrapper[4733]: I1206 05:58:58.834082 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-2ztw7-config-f97vt" event={"ID":"b4f98249-4c60-4cce-bfbd-e517303feee5","Type":"ContainerStarted","Data":"21d28924dc8672f22b88bfbb2b25218ba850e51a7bdbe06b746b436b1519a00d"} Dec 06 05:58:58 crc kubenswrapper[4733]: I1206 05:58:58.836985 4733 generic.go:334] "Generic (PLEG): container finished" podID="2fff7d32-f796-45d1-a13b-1b2286e593c3" containerID="272c4dd152d4b3a9076680418d460ea77947bd49bcd7bad05b5bbb147d575c09" exitCode=0 Dec 06 05:58:58 crc kubenswrapper[4733]: I1206 05:58:58.837038 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8c486c6ff-fq7t7" event={"ID":"2fff7d32-f796-45d1-a13b-1b2286e593c3","Type":"ContainerDied","Data":"272c4dd152d4b3a9076680418d460ea77947bd49bcd7bad05b5bbb147d575c09"} Dec 06 05:58:58 crc kubenswrapper[4733]: I1206 05:58:58.837064 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8c486c6ff-fq7t7" event={"ID":"2fff7d32-f796-45d1-a13b-1b2286e593c3","Type":"ContainerStarted","Data":"a96735a2a975970b6e1a1c1e61e0305541aaa1419d360ba5031f88c5e2d16585"} Dec 06 05:58:58 crc kubenswrapper[4733]: I1206 05:58:58.868610 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-2ztw7-config-f97vt" podStartSLOduration=1.8685910159999999 
podStartE2EDuration="1.868591016s" podCreationTimestamp="2025-12-06 05:58:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:58:58.858731186 +0000 UTC m=+922.723942297" watchObservedRunningTime="2025-12-06 05:58:58.868591016 +0000 UTC m=+922.733802127" Dec 06 05:58:59 crc kubenswrapper[4733]: I1206 05:58:59.854837 4733 generic.go:334] "Generic (PLEG): container finished" podID="b4f98249-4c60-4cce-bfbd-e517303feee5" containerID="302ef6fc16dcc0333d1e457e53a8362b71103d268db7749bef2db7459679525d" exitCode=0 Dec 06 05:58:59 crc kubenswrapper[4733]: I1206 05:58:59.855016 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-2ztw7-config-f97vt" event={"ID":"b4f98249-4c60-4cce-bfbd-e517303feee5","Type":"ContainerDied","Data":"302ef6fc16dcc0333d1e457e53a8362b71103d268db7749bef2db7459679525d"} Dec 06 05:58:59 crc kubenswrapper[4733]: I1206 05:58:59.860990 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8c486c6ff-fq7t7" event={"ID":"2fff7d32-f796-45d1-a13b-1b2286e593c3","Type":"ContainerStarted","Data":"324f7103a1749cb08562ac993230e58fa371d88936dff071694084ec26c4f1d2"} Dec 06 05:58:59 crc kubenswrapper[4733]: I1206 05:58:59.861183 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8c486c6ff-fq7t7" Dec 06 05:58:59 crc kubenswrapper[4733]: I1206 05:58:59.895649 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8c486c6ff-fq7t7" podStartSLOduration=2.895628537 podStartE2EDuration="2.895628537s" podCreationTimestamp="2025-12-06 05:58:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:58:59.890465157 +0000 UTC m=+923.755676268" watchObservedRunningTime="2025-12-06 05:58:59.895628537 +0000 UTC m=+923.760839648" Dec 06 
05:59:02 crc kubenswrapper[4733]: I1206 05:59:02.220826 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-2ztw7" Dec 06 05:59:03 crc kubenswrapper[4733]: I1206 05:59:03.366566 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 06 05:59:03 crc kubenswrapper[4733]: I1206 05:59:03.628837 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-7gpvs"] Dec 06 05:59:03 crc kubenswrapper[4733]: I1206 05:59:03.630022 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-7gpvs" Dec 06 05:59:03 crc kubenswrapper[4733]: I1206 05:59:03.633670 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-7gpvs"] Dec 06 05:59:03 crc kubenswrapper[4733]: I1206 05:59:03.717192 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-8hnm2"] Dec 06 05:59:03 crc kubenswrapper[4733]: I1206 05:59:03.718779 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-8hnm2" Dec 06 05:59:03 crc kubenswrapper[4733]: I1206 05:59:03.732030 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-8hnm2"] Dec 06 05:59:03 crc kubenswrapper[4733]: I1206 05:59:03.742222 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-73db-account-create-update-ft7zq"] Dec 06 05:59:03 crc kubenswrapper[4733]: I1206 05:59:03.744007 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-73db-account-create-update-ft7zq" Dec 06 05:59:03 crc kubenswrapper[4733]: I1206 05:59:03.745808 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 06 05:59:03 crc kubenswrapper[4733]: I1206 05:59:03.752842 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-73db-account-create-update-ft7zq"] Dec 06 05:59:03 crc kubenswrapper[4733]: I1206 05:59:03.769003 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbs6z\" (UniqueName: \"kubernetes.io/projected/1aaa5a5a-871e-4022-9fea-20d5541424bf-kube-api-access-bbs6z\") pod \"cinder-db-create-7gpvs\" (UID: \"1aaa5a5a-871e-4022-9fea-20d5541424bf\") " pod="openstack/cinder-db-create-7gpvs" Dec 06 05:59:03 crc kubenswrapper[4733]: I1206 05:59:03.769060 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1aaa5a5a-871e-4022-9fea-20d5541424bf-operator-scripts\") pod \"cinder-db-create-7gpvs\" (UID: \"1aaa5a5a-871e-4022-9fea-20d5541424bf\") " pod="openstack/cinder-db-create-7gpvs" Dec 06 05:59:03 crc kubenswrapper[4733]: I1206 05:59:03.826827 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-f762-account-create-update-phdmt"] Dec 06 05:59:03 crc kubenswrapper[4733]: I1206 05:59:03.832009 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-f762-account-create-update-phdmt" Dec 06 05:59:03 crc kubenswrapper[4733]: I1206 05:59:03.833806 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 06 05:59:03 crc kubenswrapper[4733]: I1206 05:59:03.838067 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-f762-account-create-update-phdmt"] Dec 06 05:59:03 crc kubenswrapper[4733]: I1206 05:59:03.870835 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c64b364-21c9-4fd9-a392-18b9ea6661fb-operator-scripts\") pod \"barbican-db-create-8hnm2\" (UID: \"0c64b364-21c9-4fd9-a392-18b9ea6661fb\") " pod="openstack/barbican-db-create-8hnm2" Dec 06 05:59:03 crc kubenswrapper[4733]: I1206 05:59:03.870960 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24cjf\" (UniqueName: \"kubernetes.io/projected/0c64b364-21c9-4fd9-a392-18b9ea6661fb-kube-api-access-24cjf\") pod \"barbican-db-create-8hnm2\" (UID: \"0c64b364-21c9-4fd9-a392-18b9ea6661fb\") " pod="openstack/barbican-db-create-8hnm2" Dec 06 05:59:03 crc kubenswrapper[4733]: I1206 05:59:03.871183 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbs6z\" (UniqueName: \"kubernetes.io/projected/1aaa5a5a-871e-4022-9fea-20d5541424bf-kube-api-access-bbs6z\") pod \"cinder-db-create-7gpvs\" (UID: \"1aaa5a5a-871e-4022-9fea-20d5541424bf\") " pod="openstack/cinder-db-create-7gpvs" Dec 06 05:59:03 crc kubenswrapper[4733]: I1206 05:59:03.871339 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thwr6\" (UniqueName: \"kubernetes.io/projected/dab6d968-0d66-4b87-898b-d17fb0741af2-kube-api-access-thwr6\") pod \"cinder-73db-account-create-update-ft7zq\" (UID: 
\"dab6d968-0d66-4b87-898b-d17fb0741af2\") " pod="openstack/cinder-73db-account-create-update-ft7zq" Dec 06 05:59:03 crc kubenswrapper[4733]: I1206 05:59:03.871490 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dab6d968-0d66-4b87-898b-d17fb0741af2-operator-scripts\") pod \"cinder-73db-account-create-update-ft7zq\" (UID: \"dab6d968-0d66-4b87-898b-d17fb0741af2\") " pod="openstack/cinder-73db-account-create-update-ft7zq" Dec 06 05:59:03 crc kubenswrapper[4733]: I1206 05:59:03.871614 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1aaa5a5a-871e-4022-9fea-20d5541424bf-operator-scripts\") pod \"cinder-db-create-7gpvs\" (UID: \"1aaa5a5a-871e-4022-9fea-20d5541424bf\") " pod="openstack/cinder-db-create-7gpvs" Dec 06 05:59:03 crc kubenswrapper[4733]: I1206 05:59:03.872397 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1aaa5a5a-871e-4022-9fea-20d5541424bf-operator-scripts\") pod \"cinder-db-create-7gpvs\" (UID: \"1aaa5a5a-871e-4022-9fea-20d5541424bf\") " pod="openstack/cinder-db-create-7gpvs" Dec 06 05:59:03 crc kubenswrapper[4733]: I1206 05:59:03.887600 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbs6z\" (UniqueName: \"kubernetes.io/projected/1aaa5a5a-871e-4022-9fea-20d5541424bf-kube-api-access-bbs6z\") pod \"cinder-db-create-7gpvs\" (UID: \"1aaa5a5a-871e-4022-9fea-20d5541424bf\") " pod="openstack/cinder-db-create-7gpvs" Dec 06 05:59:03 crc kubenswrapper[4733]: I1206 05:59:03.924416 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-z9w82"] Dec 06 05:59:03 crc kubenswrapper[4733]: I1206 05:59:03.925719 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-z9w82" Dec 06 05:59:03 crc kubenswrapper[4733]: I1206 05:59:03.933571 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-z9w82"] Dec 06 05:59:03 crc kubenswrapper[4733]: I1206 05:59:03.954123 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-7gpvs" Dec 06 05:59:03 crc kubenswrapper[4733]: I1206 05:59:03.967490 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 06 05:59:03 crc kubenswrapper[4733]: I1206 05:59:03.973704 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c64b364-21c9-4fd9-a392-18b9ea6661fb-operator-scripts\") pod \"barbican-db-create-8hnm2\" (UID: \"0c64b364-21c9-4fd9-a392-18b9ea6661fb\") " pod="openstack/barbican-db-create-8hnm2" Dec 06 05:59:03 crc kubenswrapper[4733]: I1206 05:59:03.973757 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31a0a2a8-5536-47b7-9f7c-2eef7e453214-operator-scripts\") pod \"barbican-f762-account-create-update-phdmt\" (UID: \"31a0a2a8-5536-47b7-9f7c-2eef7e453214\") " pod="openstack/barbican-f762-account-create-update-phdmt" Dec 06 05:59:03 crc kubenswrapper[4733]: I1206 05:59:03.973843 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6mfs\" (UniqueName: \"kubernetes.io/projected/31a0a2a8-5536-47b7-9f7c-2eef7e453214-kube-api-access-v6mfs\") pod \"barbican-f762-account-create-update-phdmt\" (UID: \"31a0a2a8-5536-47b7-9f7c-2eef7e453214\") " pod="openstack/barbican-f762-account-create-update-phdmt" Dec 06 05:59:03 crc kubenswrapper[4733]: I1206 05:59:03.973867 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-24cjf\" (UniqueName: \"kubernetes.io/projected/0c64b364-21c9-4fd9-a392-18b9ea6661fb-kube-api-access-24cjf\") pod \"barbican-db-create-8hnm2\" (UID: \"0c64b364-21c9-4fd9-a392-18b9ea6661fb\") " pod="openstack/barbican-db-create-8hnm2" Dec 06 05:59:03 crc kubenswrapper[4733]: I1206 05:59:03.974006 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thwr6\" (UniqueName: \"kubernetes.io/projected/dab6d968-0d66-4b87-898b-d17fb0741af2-kube-api-access-thwr6\") pod \"cinder-73db-account-create-update-ft7zq\" (UID: \"dab6d968-0d66-4b87-898b-d17fb0741af2\") " pod="openstack/cinder-73db-account-create-update-ft7zq" Dec 06 05:59:03 crc kubenswrapper[4733]: I1206 05:59:03.974035 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dab6d968-0d66-4b87-898b-d17fb0741af2-operator-scripts\") pod \"cinder-73db-account-create-update-ft7zq\" (UID: \"dab6d968-0d66-4b87-898b-d17fb0741af2\") " pod="openstack/cinder-73db-account-create-update-ft7zq" Dec 06 05:59:03 crc kubenswrapper[4733]: I1206 05:59:03.975091 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dab6d968-0d66-4b87-898b-d17fb0741af2-operator-scripts\") pod \"cinder-73db-account-create-update-ft7zq\" (UID: \"dab6d968-0d66-4b87-898b-d17fb0741af2\") " pod="openstack/cinder-73db-account-create-update-ft7zq" Dec 06 05:59:03 crc kubenswrapper[4733]: I1206 05:59:03.975540 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c64b364-21c9-4fd9-a392-18b9ea6661fb-operator-scripts\") pod \"barbican-db-create-8hnm2\" (UID: \"0c64b364-21c9-4fd9-a392-18b9ea6661fb\") " pod="openstack/barbican-db-create-8hnm2" Dec 06 05:59:03 crc kubenswrapper[4733]: I1206 05:59:03.986542 4733 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/keystone-db-sync-vrwfc"] Dec 06 05:59:03 crc kubenswrapper[4733]: I1206 05:59:03.987739 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-vrwfc" Dec 06 05:59:03 crc kubenswrapper[4733]: I1206 05:59:03.990091 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 06 05:59:03 crc kubenswrapper[4733]: I1206 05:59:03.990270 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-d5nb6" Dec 06 05:59:03 crc kubenswrapper[4733]: I1206 05:59:03.990439 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 06 05:59:03 crc kubenswrapper[4733]: I1206 05:59:03.990611 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 06 05:59:03 crc kubenswrapper[4733]: I1206 05:59:03.991139 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24cjf\" (UniqueName: \"kubernetes.io/projected/0c64b364-21c9-4fd9-a392-18b9ea6661fb-kube-api-access-24cjf\") pod \"barbican-db-create-8hnm2\" (UID: \"0c64b364-21c9-4fd9-a392-18b9ea6661fb\") " pod="openstack/barbican-db-create-8hnm2" Dec 06 05:59:03 crc kubenswrapper[4733]: I1206 05:59:03.998069 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thwr6\" (UniqueName: \"kubernetes.io/projected/dab6d968-0d66-4b87-898b-d17fb0741af2-kube-api-access-thwr6\") pod \"cinder-73db-account-create-update-ft7zq\" (UID: \"dab6d968-0d66-4b87-898b-d17fb0741af2\") " pod="openstack/cinder-73db-account-create-update-ft7zq" Dec 06 05:59:04 crc kubenswrapper[4733]: I1206 05:59:04.004725 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-vrwfc"] Dec 06 05:59:04 crc kubenswrapper[4733]: I1206 05:59:04.039529 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-8hnm2" Dec 06 05:59:04 crc kubenswrapper[4733]: I1206 05:59:04.047155 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-781a-account-create-update-9g84w"] Dec 06 05:59:04 crc kubenswrapper[4733]: I1206 05:59:04.048597 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-781a-account-create-update-9g84w" Dec 06 05:59:04 crc kubenswrapper[4733]: I1206 05:59:04.051621 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 06 05:59:04 crc kubenswrapper[4733]: I1206 05:59:04.059684 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-73db-account-create-update-ft7zq" Dec 06 05:59:04 crc kubenswrapper[4733]: I1206 05:59:04.060317 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-781a-account-create-update-9g84w"] Dec 06 05:59:04 crc kubenswrapper[4733]: I1206 05:59:04.075556 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6mfs\" (UniqueName: \"kubernetes.io/projected/31a0a2a8-5536-47b7-9f7c-2eef7e453214-kube-api-access-v6mfs\") pod \"barbican-f762-account-create-update-phdmt\" (UID: \"31a0a2a8-5536-47b7-9f7c-2eef7e453214\") " pod="openstack/barbican-f762-account-create-update-phdmt" Dec 06 05:59:04 crc kubenswrapper[4733]: I1206 05:59:04.075727 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8894ddef-6b07-452a-83fd-5d6383f280c2-operator-scripts\") pod \"neutron-db-create-z9w82\" (UID: \"8894ddef-6b07-452a-83fd-5d6383f280c2\") " pod="openstack/neutron-db-create-z9w82" Dec 06 05:59:04 crc kubenswrapper[4733]: I1206 05:59:04.076665 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/31a0a2a8-5536-47b7-9f7c-2eef7e453214-operator-scripts\") pod \"barbican-f762-account-create-update-phdmt\" (UID: \"31a0a2a8-5536-47b7-9f7c-2eef7e453214\") " pod="openstack/barbican-f762-account-create-update-phdmt" Dec 06 05:59:04 crc kubenswrapper[4733]: I1206 05:59:04.076732 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8v6x\" (UniqueName: \"kubernetes.io/projected/8894ddef-6b07-452a-83fd-5d6383f280c2-kube-api-access-w8v6x\") pod \"neutron-db-create-z9w82\" (UID: \"8894ddef-6b07-452a-83fd-5d6383f280c2\") " pod="openstack/neutron-db-create-z9w82" Dec 06 05:59:04 crc kubenswrapper[4733]: I1206 05:59:04.077734 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31a0a2a8-5536-47b7-9f7c-2eef7e453214-operator-scripts\") pod \"barbican-f762-account-create-update-phdmt\" (UID: \"31a0a2a8-5536-47b7-9f7c-2eef7e453214\") " pod="openstack/barbican-f762-account-create-update-phdmt" Dec 06 05:59:04 crc kubenswrapper[4733]: I1206 05:59:04.089260 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6mfs\" (UniqueName: \"kubernetes.io/projected/31a0a2a8-5536-47b7-9f7c-2eef7e453214-kube-api-access-v6mfs\") pod \"barbican-f762-account-create-update-phdmt\" (UID: \"31a0a2a8-5536-47b7-9f7c-2eef7e453214\") " pod="openstack/barbican-f762-account-create-update-phdmt" Dec 06 05:59:04 crc kubenswrapper[4733]: I1206 05:59:04.150103 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-f762-account-create-update-phdmt" Dec 06 05:59:04 crc kubenswrapper[4733]: I1206 05:59:04.178948 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d928009b-3db5-492c-8d4e-375ca54b6f8b-operator-scripts\") pod \"neutron-781a-account-create-update-9g84w\" (UID: \"d928009b-3db5-492c-8d4e-375ca54b6f8b\") " pod="openstack/neutron-781a-account-create-update-9g84w" Dec 06 05:59:04 crc kubenswrapper[4733]: I1206 05:59:04.179009 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1390ae4e-536b-4861-8591-fd8656976c12-combined-ca-bundle\") pod \"keystone-db-sync-vrwfc\" (UID: \"1390ae4e-536b-4861-8591-fd8656976c12\") " pod="openstack/keystone-db-sync-vrwfc" Dec 06 05:59:04 crc kubenswrapper[4733]: I1206 05:59:04.179133 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l586c\" (UniqueName: \"kubernetes.io/projected/1390ae4e-536b-4861-8591-fd8656976c12-kube-api-access-l586c\") pod \"keystone-db-sync-vrwfc\" (UID: \"1390ae4e-536b-4861-8591-fd8656976c12\") " pod="openstack/keystone-db-sync-vrwfc" Dec 06 05:59:04 crc kubenswrapper[4733]: I1206 05:59:04.179267 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1390ae4e-536b-4861-8591-fd8656976c12-config-data\") pod \"keystone-db-sync-vrwfc\" (UID: \"1390ae4e-536b-4861-8591-fd8656976c12\") " pod="openstack/keystone-db-sync-vrwfc" Dec 06 05:59:04 crc kubenswrapper[4733]: I1206 05:59:04.179323 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8v6x\" (UniqueName: \"kubernetes.io/projected/8894ddef-6b07-452a-83fd-5d6383f280c2-kube-api-access-w8v6x\") pod 
\"neutron-db-create-z9w82\" (UID: \"8894ddef-6b07-452a-83fd-5d6383f280c2\") " pod="openstack/neutron-db-create-z9w82" Dec 06 05:59:04 crc kubenswrapper[4733]: I1206 05:59:04.179423 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d2tq\" (UniqueName: \"kubernetes.io/projected/d928009b-3db5-492c-8d4e-375ca54b6f8b-kube-api-access-4d2tq\") pod \"neutron-781a-account-create-update-9g84w\" (UID: \"d928009b-3db5-492c-8d4e-375ca54b6f8b\") " pod="openstack/neutron-781a-account-create-update-9g84w" Dec 06 05:59:04 crc kubenswrapper[4733]: I1206 05:59:04.179553 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8894ddef-6b07-452a-83fd-5d6383f280c2-operator-scripts\") pod \"neutron-db-create-z9w82\" (UID: \"8894ddef-6b07-452a-83fd-5d6383f280c2\") " pod="openstack/neutron-db-create-z9w82" Dec 06 05:59:04 crc kubenswrapper[4733]: I1206 05:59:04.180299 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8894ddef-6b07-452a-83fd-5d6383f280c2-operator-scripts\") pod \"neutron-db-create-z9w82\" (UID: \"8894ddef-6b07-452a-83fd-5d6383f280c2\") " pod="openstack/neutron-db-create-z9w82" Dec 06 05:59:04 crc kubenswrapper[4733]: I1206 05:59:04.193961 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8v6x\" (UniqueName: \"kubernetes.io/projected/8894ddef-6b07-452a-83fd-5d6383f280c2-kube-api-access-w8v6x\") pod \"neutron-db-create-z9w82\" (UID: \"8894ddef-6b07-452a-83fd-5d6383f280c2\") " pod="openstack/neutron-db-create-z9w82" Dec 06 05:59:04 crc kubenswrapper[4733]: I1206 05:59:04.240988 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-z9w82" Dec 06 05:59:04 crc kubenswrapper[4733]: I1206 05:59:04.280748 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l586c\" (UniqueName: \"kubernetes.io/projected/1390ae4e-536b-4861-8591-fd8656976c12-kube-api-access-l586c\") pod \"keystone-db-sync-vrwfc\" (UID: \"1390ae4e-536b-4861-8591-fd8656976c12\") " pod="openstack/keystone-db-sync-vrwfc" Dec 06 05:59:04 crc kubenswrapper[4733]: I1206 05:59:04.280813 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1390ae4e-536b-4861-8591-fd8656976c12-config-data\") pod \"keystone-db-sync-vrwfc\" (UID: \"1390ae4e-536b-4861-8591-fd8656976c12\") " pod="openstack/keystone-db-sync-vrwfc" Dec 06 05:59:04 crc kubenswrapper[4733]: I1206 05:59:04.280868 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4d2tq\" (UniqueName: \"kubernetes.io/projected/d928009b-3db5-492c-8d4e-375ca54b6f8b-kube-api-access-4d2tq\") pod \"neutron-781a-account-create-update-9g84w\" (UID: \"d928009b-3db5-492c-8d4e-375ca54b6f8b\") " pod="openstack/neutron-781a-account-create-update-9g84w" Dec 06 05:59:04 crc kubenswrapper[4733]: I1206 05:59:04.281000 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d928009b-3db5-492c-8d4e-375ca54b6f8b-operator-scripts\") pod \"neutron-781a-account-create-update-9g84w\" (UID: \"d928009b-3db5-492c-8d4e-375ca54b6f8b\") " pod="openstack/neutron-781a-account-create-update-9g84w" Dec 06 05:59:04 crc kubenswrapper[4733]: I1206 05:59:04.281028 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1390ae4e-536b-4861-8591-fd8656976c12-combined-ca-bundle\") pod \"keystone-db-sync-vrwfc\" (UID: 
\"1390ae4e-536b-4861-8591-fd8656976c12\") " pod="openstack/keystone-db-sync-vrwfc" Dec 06 05:59:04 crc kubenswrapper[4733]: I1206 05:59:04.282245 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d928009b-3db5-492c-8d4e-375ca54b6f8b-operator-scripts\") pod \"neutron-781a-account-create-update-9g84w\" (UID: \"d928009b-3db5-492c-8d4e-375ca54b6f8b\") " pod="openstack/neutron-781a-account-create-update-9g84w" Dec 06 05:59:04 crc kubenswrapper[4733]: I1206 05:59:04.284157 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1390ae4e-536b-4861-8591-fd8656976c12-combined-ca-bundle\") pod \"keystone-db-sync-vrwfc\" (UID: \"1390ae4e-536b-4861-8591-fd8656976c12\") " pod="openstack/keystone-db-sync-vrwfc" Dec 06 05:59:04 crc kubenswrapper[4733]: I1206 05:59:04.285449 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1390ae4e-536b-4861-8591-fd8656976c12-config-data\") pod \"keystone-db-sync-vrwfc\" (UID: \"1390ae4e-536b-4861-8591-fd8656976c12\") " pod="openstack/keystone-db-sync-vrwfc" Dec 06 05:59:04 crc kubenswrapper[4733]: I1206 05:59:04.295909 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d2tq\" (UniqueName: \"kubernetes.io/projected/d928009b-3db5-492c-8d4e-375ca54b6f8b-kube-api-access-4d2tq\") pod \"neutron-781a-account-create-update-9g84w\" (UID: \"d928009b-3db5-492c-8d4e-375ca54b6f8b\") " pod="openstack/neutron-781a-account-create-update-9g84w" Dec 06 05:59:04 crc kubenswrapper[4733]: I1206 05:59:04.302659 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l586c\" (UniqueName: \"kubernetes.io/projected/1390ae4e-536b-4861-8591-fd8656976c12-kube-api-access-l586c\") pod \"keystone-db-sync-vrwfc\" (UID: \"1390ae4e-536b-4861-8591-fd8656976c12\") " 
pod="openstack/keystone-db-sync-vrwfc" Dec 06 05:59:04 crc kubenswrapper[4733]: I1206 05:59:04.362138 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-vrwfc" Dec 06 05:59:04 crc kubenswrapper[4733]: I1206 05:59:04.370099 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-781a-account-create-update-9g84w" Dec 06 05:59:04 crc kubenswrapper[4733]: I1206 05:59:04.872831 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-pp9vb"] Dec 06 05:59:04 crc kubenswrapper[4733]: I1206 05:59:04.874542 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pp9vb" Dec 06 05:59:04 crc kubenswrapper[4733]: I1206 05:59:04.882481 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pp9vb"] Dec 06 05:59:04 crc kubenswrapper[4733]: I1206 05:59:04.992253 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f01ee5d-d8b9-4464-9ec9-ba71b0080b97-catalog-content\") pod \"certified-operators-pp9vb\" (UID: \"8f01ee5d-d8b9-4464-9ec9-ba71b0080b97\") " pod="openshift-marketplace/certified-operators-pp9vb" Dec 06 05:59:04 crc kubenswrapper[4733]: I1206 05:59:04.992371 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f01ee5d-d8b9-4464-9ec9-ba71b0080b97-utilities\") pod \"certified-operators-pp9vb\" (UID: \"8f01ee5d-d8b9-4464-9ec9-ba71b0080b97\") " pod="openshift-marketplace/certified-operators-pp9vb" Dec 06 05:59:04 crc kubenswrapper[4733]: I1206 05:59:04.992429 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btxr4\" (UniqueName: 
\"kubernetes.io/projected/8f01ee5d-d8b9-4464-9ec9-ba71b0080b97-kube-api-access-btxr4\") pod \"certified-operators-pp9vb\" (UID: \"8f01ee5d-d8b9-4464-9ec9-ba71b0080b97\") " pod="openshift-marketplace/certified-operators-pp9vb" Dec 06 05:59:05 crc kubenswrapper[4733]: I1206 05:59:05.094861 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f01ee5d-d8b9-4464-9ec9-ba71b0080b97-utilities\") pod \"certified-operators-pp9vb\" (UID: \"8f01ee5d-d8b9-4464-9ec9-ba71b0080b97\") " pod="openshift-marketplace/certified-operators-pp9vb" Dec 06 05:59:05 crc kubenswrapper[4733]: I1206 05:59:05.094942 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btxr4\" (UniqueName: \"kubernetes.io/projected/8f01ee5d-d8b9-4464-9ec9-ba71b0080b97-kube-api-access-btxr4\") pod \"certified-operators-pp9vb\" (UID: \"8f01ee5d-d8b9-4464-9ec9-ba71b0080b97\") " pod="openshift-marketplace/certified-operators-pp9vb" Dec 06 05:59:05 crc kubenswrapper[4733]: I1206 05:59:05.095124 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f01ee5d-d8b9-4464-9ec9-ba71b0080b97-catalog-content\") pod \"certified-operators-pp9vb\" (UID: \"8f01ee5d-d8b9-4464-9ec9-ba71b0080b97\") " pod="openshift-marketplace/certified-operators-pp9vb" Dec 06 05:59:05 crc kubenswrapper[4733]: I1206 05:59:05.095337 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f01ee5d-d8b9-4464-9ec9-ba71b0080b97-utilities\") pod \"certified-operators-pp9vb\" (UID: \"8f01ee5d-d8b9-4464-9ec9-ba71b0080b97\") " pod="openshift-marketplace/certified-operators-pp9vb" Dec 06 05:59:05 crc kubenswrapper[4733]: I1206 05:59:05.095568 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/8f01ee5d-d8b9-4464-9ec9-ba71b0080b97-catalog-content\") pod \"certified-operators-pp9vb\" (UID: \"8f01ee5d-d8b9-4464-9ec9-ba71b0080b97\") " pod="openshift-marketplace/certified-operators-pp9vb" Dec 06 05:59:05 crc kubenswrapper[4733]: I1206 05:59:05.117497 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btxr4\" (UniqueName: \"kubernetes.io/projected/8f01ee5d-d8b9-4464-9ec9-ba71b0080b97-kube-api-access-btxr4\") pod \"certified-operators-pp9vb\" (UID: \"8f01ee5d-d8b9-4464-9ec9-ba71b0080b97\") " pod="openshift-marketplace/certified-operators-pp9vb" Dec 06 05:59:05 crc kubenswrapper[4733]: I1206 05:59:05.227296 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pp9vb" Dec 06 05:59:07 crc kubenswrapper[4733]: I1206 05:59:07.477492 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8c486c6ff-fq7t7" Dec 06 05:59:07 crc kubenswrapper[4733]: I1206 05:59:07.542692 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67fb8b8965-bmdkf"] Dec 06 05:59:07 crc kubenswrapper[4733]: I1206 05:59:07.542921 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-67fb8b8965-bmdkf" podUID="e99bfc04-55e0-4ff7-be22-22c6e9cc8100" containerName="dnsmasq-dns" containerID="cri-o://12c65d99bf2519aa1372aa4e393f577ad266a0424ace9155a70da7812c3860f5" gracePeriod=10 Dec 06 05:59:07 crc kubenswrapper[4733]: I1206 05:59:07.952257 4733 generic.go:334] "Generic (PLEG): container finished" podID="e99bfc04-55e0-4ff7-be22-22c6e9cc8100" containerID="12c65d99bf2519aa1372aa4e393f577ad266a0424ace9155a70da7812c3860f5" exitCode=0 Dec 06 05:59:07 crc kubenswrapper[4733]: I1206 05:59:07.952334 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67fb8b8965-bmdkf" 
event={"ID":"e99bfc04-55e0-4ff7-be22-22c6e9cc8100","Type":"ContainerDied","Data":"12c65d99bf2519aa1372aa4e393f577ad266a0424ace9155a70da7812c3860f5"} Dec 06 05:59:08 crc kubenswrapper[4733]: I1206 05:59:08.761024 4733 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-67fb8b8965-bmdkf" podUID="e99bfc04-55e0-4ff7-be22-22c6e9cc8100" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.107:5353: connect: connection refused" Dec 06 05:59:09 crc kubenswrapper[4733]: E1206 05:59:09.716398 4733 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos9/openstack-glance-api:2e38c527ddf6e767040136ecf014e7b9" Dec 06 05:59:09 crc kubenswrapper[4733]: E1206 05:59:09.716757 4733 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos9/openstack-glance-api:2e38c527ddf6e767040136ecf014e7b9" Dec 06 05:59:09 crc kubenswrapper[4733]: E1206 05:59:09.716912 4733 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.rdoproject.org/podified-master-centos9/openstack-glance-api:2e38c527ddf6e767040136ecf014e7b9,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7vt7t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-zz9tl_openstack(bc8e93b6-7230-41f1-98f5-18b252d0d724): ErrImagePull: rpc error: code = Canceled desc = 
copying config: context canceled" logger="UnhandledError" Dec 06 05:59:09 crc kubenswrapper[4733]: E1206 05:59:09.718593 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-zz9tl" podUID="bc8e93b6-7230-41f1-98f5-18b252d0d724" Dec 06 05:59:09 crc kubenswrapper[4733]: I1206 05:59:09.779971 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-2ztw7-config-f97vt" Dec 06 05:59:09 crc kubenswrapper[4733]: I1206 05:59:09.890258 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b4f98249-4c60-4cce-bfbd-e517303feee5-var-run-ovn\") pod \"b4f98249-4c60-4cce-bfbd-e517303feee5\" (UID: \"b4f98249-4c60-4cce-bfbd-e517303feee5\") " Dec 06 05:59:09 crc kubenswrapper[4733]: I1206 05:59:09.890633 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b4f98249-4c60-4cce-bfbd-e517303feee5-var-log-ovn\") pod \"b4f98249-4c60-4cce-bfbd-e517303feee5\" (UID: \"b4f98249-4c60-4cce-bfbd-e517303feee5\") " Dec 06 05:59:09 crc kubenswrapper[4733]: I1206 05:59:09.890696 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2m24d\" (UniqueName: \"kubernetes.io/projected/b4f98249-4c60-4cce-bfbd-e517303feee5-kube-api-access-2m24d\") pod \"b4f98249-4c60-4cce-bfbd-e517303feee5\" (UID: \"b4f98249-4c60-4cce-bfbd-e517303feee5\") " Dec 06 05:59:09 crc kubenswrapper[4733]: I1206 05:59:09.890737 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b4f98249-4c60-4cce-bfbd-e517303feee5-additional-scripts\") pod \"b4f98249-4c60-4cce-bfbd-e517303feee5\" (UID: 
\"b4f98249-4c60-4cce-bfbd-e517303feee5\") " Dec 06 05:59:09 crc kubenswrapper[4733]: I1206 05:59:09.890803 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b4f98249-4c60-4cce-bfbd-e517303feee5-var-run\") pod \"b4f98249-4c60-4cce-bfbd-e517303feee5\" (UID: \"b4f98249-4c60-4cce-bfbd-e517303feee5\") " Dec 06 05:59:09 crc kubenswrapper[4733]: I1206 05:59:09.890826 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b4f98249-4c60-4cce-bfbd-e517303feee5-scripts\") pod \"b4f98249-4c60-4cce-bfbd-e517303feee5\" (UID: \"b4f98249-4c60-4cce-bfbd-e517303feee5\") " Dec 06 05:59:09 crc kubenswrapper[4733]: I1206 05:59:09.892565 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4f98249-4c60-4cce-bfbd-e517303feee5-scripts" (OuterVolumeSpecName: "scripts") pod "b4f98249-4c60-4cce-bfbd-e517303feee5" (UID: "b4f98249-4c60-4cce-bfbd-e517303feee5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:59:09 crc kubenswrapper[4733]: I1206 05:59:09.892794 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b4f98249-4c60-4cce-bfbd-e517303feee5-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "b4f98249-4c60-4cce-bfbd-e517303feee5" (UID: "b4f98249-4c60-4cce-bfbd-e517303feee5"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 05:59:09 crc kubenswrapper[4733]: I1206 05:59:09.892874 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b4f98249-4c60-4cce-bfbd-e517303feee5-var-run" (OuterVolumeSpecName: "var-run") pod "b4f98249-4c60-4cce-bfbd-e517303feee5" (UID: "b4f98249-4c60-4cce-bfbd-e517303feee5"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 05:59:09 crc kubenswrapper[4733]: I1206 05:59:09.893252 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4f98249-4c60-4cce-bfbd-e517303feee5-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "b4f98249-4c60-4cce-bfbd-e517303feee5" (UID: "b4f98249-4c60-4cce-bfbd-e517303feee5"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:59:09 crc kubenswrapper[4733]: I1206 05:59:09.892843 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b4f98249-4c60-4cce-bfbd-e517303feee5-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "b4f98249-4c60-4cce-bfbd-e517303feee5" (UID: "b4f98249-4c60-4cce-bfbd-e517303feee5"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 05:59:09 crc kubenswrapper[4733]: I1206 05:59:09.899597 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4f98249-4c60-4cce-bfbd-e517303feee5-kube-api-access-2m24d" (OuterVolumeSpecName: "kube-api-access-2m24d") pod "b4f98249-4c60-4cce-bfbd-e517303feee5" (UID: "b4f98249-4c60-4cce-bfbd-e517303feee5"). InnerVolumeSpecName "kube-api-access-2m24d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:59:09 crc kubenswrapper[4733]: I1206 05:59:09.974346 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-2ztw7-config-f97vt" event={"ID":"b4f98249-4c60-4cce-bfbd-e517303feee5","Type":"ContainerDied","Data":"21d28924dc8672f22b88bfbb2b25218ba850e51a7bdbe06b746b436b1519a00d"} Dec 06 05:59:09 crc kubenswrapper[4733]: I1206 05:59:09.974395 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21d28924dc8672f22b88bfbb2b25218ba850e51a7bdbe06b746b436b1519a00d" Dec 06 05:59:09 crc kubenswrapper[4733]: I1206 05:59:09.974468 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-2ztw7-config-f97vt" Dec 06 05:59:09 crc kubenswrapper[4733]: I1206 05:59:09.976941 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67fb8b8965-bmdkf" Dec 06 05:59:09 crc kubenswrapper[4733]: I1206 05:59:09.984733 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67fb8b8965-bmdkf" event={"ID":"e99bfc04-55e0-4ff7-be22-22c6e9cc8100","Type":"ContainerDied","Data":"3f430cc3eabcb1377646beaf6293a33e0dc0d9f133d57b0c218dab7c60a65836"} Dec 06 05:59:09 crc kubenswrapper[4733]: I1206 05:59:09.984825 4733 scope.go:117] "RemoveContainer" containerID="12c65d99bf2519aa1372aa4e393f577ad266a0424ace9155a70da7812c3860f5" Dec 06 05:59:09 crc kubenswrapper[4733]: E1206 05:59:09.987766 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos9/openstack-glance-api:2e38c527ddf6e767040136ecf014e7b9\\\"\"" pod="openstack/glance-db-sync-zz9tl" podUID="bc8e93b6-7230-41f1-98f5-18b252d0d724" Dec 06 05:59:09 crc kubenswrapper[4733]: I1206 05:59:09.992626 4733 reconciler_common.go:293] "Volume detached for 
volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b4f98249-4c60-4cce-bfbd-e517303feee5-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:09 crc kubenswrapper[4733]: I1206 05:59:09.992657 4733 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b4f98249-4c60-4cce-bfbd-e517303feee5-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:09 crc kubenswrapper[4733]: I1206 05:59:09.992668 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2m24d\" (UniqueName: \"kubernetes.io/projected/b4f98249-4c60-4cce-bfbd-e517303feee5-kube-api-access-2m24d\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:09 crc kubenswrapper[4733]: I1206 05:59:09.992678 4733 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b4f98249-4c60-4cce-bfbd-e517303feee5-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:09 crc kubenswrapper[4733]: I1206 05:59:09.992700 4733 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b4f98249-4c60-4cce-bfbd-e517303feee5-var-run\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:09 crc kubenswrapper[4733]: I1206 05:59:09.992710 4733 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b4f98249-4c60-4cce-bfbd-e517303feee5-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:10 crc kubenswrapper[4733]: I1206 05:59:10.024596 4733 scope.go:117] "RemoveContainer" containerID="8430a895840b043724f2787a4d77bd43cddfeb6bdecc52d1e0116db65ec0b949" Dec 06 05:59:10 crc kubenswrapper[4733]: I1206 05:59:10.094005 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7xsv\" (UniqueName: \"kubernetes.io/projected/e99bfc04-55e0-4ff7-be22-22c6e9cc8100-kube-api-access-b7xsv\") pod \"e99bfc04-55e0-4ff7-be22-22c6e9cc8100\" (UID: 
\"e99bfc04-55e0-4ff7-be22-22c6e9cc8100\") " Dec 06 05:59:10 crc kubenswrapper[4733]: I1206 05:59:10.094378 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e99bfc04-55e0-4ff7-be22-22c6e9cc8100-ovsdbserver-nb\") pod \"e99bfc04-55e0-4ff7-be22-22c6e9cc8100\" (UID: \"e99bfc04-55e0-4ff7-be22-22c6e9cc8100\") " Dec 06 05:59:10 crc kubenswrapper[4733]: I1206 05:59:10.094411 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e99bfc04-55e0-4ff7-be22-22c6e9cc8100-ovsdbserver-sb\") pod \"e99bfc04-55e0-4ff7-be22-22c6e9cc8100\" (UID: \"e99bfc04-55e0-4ff7-be22-22c6e9cc8100\") " Dec 06 05:59:10 crc kubenswrapper[4733]: I1206 05:59:10.094629 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e99bfc04-55e0-4ff7-be22-22c6e9cc8100-config\") pod \"e99bfc04-55e0-4ff7-be22-22c6e9cc8100\" (UID: \"e99bfc04-55e0-4ff7-be22-22c6e9cc8100\") " Dec 06 05:59:10 crc kubenswrapper[4733]: I1206 05:59:10.094664 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e99bfc04-55e0-4ff7-be22-22c6e9cc8100-dns-svc\") pod \"e99bfc04-55e0-4ff7-be22-22c6e9cc8100\" (UID: \"e99bfc04-55e0-4ff7-be22-22c6e9cc8100\") " Dec 06 05:59:10 crc kubenswrapper[4733]: I1206 05:59:10.097969 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e99bfc04-55e0-4ff7-be22-22c6e9cc8100-kube-api-access-b7xsv" (OuterVolumeSpecName: "kube-api-access-b7xsv") pod "e99bfc04-55e0-4ff7-be22-22c6e9cc8100" (UID: "e99bfc04-55e0-4ff7-be22-22c6e9cc8100"). InnerVolumeSpecName "kube-api-access-b7xsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:59:10 crc kubenswrapper[4733]: I1206 05:59:10.135567 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e99bfc04-55e0-4ff7-be22-22c6e9cc8100-config" (OuterVolumeSpecName: "config") pod "e99bfc04-55e0-4ff7-be22-22c6e9cc8100" (UID: "e99bfc04-55e0-4ff7-be22-22c6e9cc8100"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:59:10 crc kubenswrapper[4733]: I1206 05:59:10.140446 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e99bfc04-55e0-4ff7-be22-22c6e9cc8100-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e99bfc04-55e0-4ff7-be22-22c6e9cc8100" (UID: "e99bfc04-55e0-4ff7-be22-22c6e9cc8100"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:59:10 crc kubenswrapper[4733]: I1206 05:59:10.149276 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e99bfc04-55e0-4ff7-be22-22c6e9cc8100-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e99bfc04-55e0-4ff7-be22-22c6e9cc8100" (UID: "e99bfc04-55e0-4ff7-be22-22c6e9cc8100"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:59:10 crc kubenswrapper[4733]: I1206 05:59:10.151419 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e99bfc04-55e0-4ff7-be22-22c6e9cc8100-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e99bfc04-55e0-4ff7-be22-22c6e9cc8100" (UID: "e99bfc04-55e0-4ff7-be22-22c6e9cc8100"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:59:10 crc kubenswrapper[4733]: I1206 05:59:10.198382 4733 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e99bfc04-55e0-4ff7-be22-22c6e9cc8100-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:10 crc kubenswrapper[4733]: I1206 05:59:10.198430 4733 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e99bfc04-55e0-4ff7-be22-22c6e9cc8100-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:10 crc kubenswrapper[4733]: I1206 05:59:10.198442 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e99bfc04-55e0-4ff7-be22-22c6e9cc8100-config\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:10 crc kubenswrapper[4733]: I1206 05:59:10.198456 4733 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e99bfc04-55e0-4ff7-be22-22c6e9cc8100-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:10 crc kubenswrapper[4733]: I1206 05:59:10.198468 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7xsv\" (UniqueName: \"kubernetes.io/projected/e99bfc04-55e0-4ff7-be22-22c6e9cc8100-kube-api-access-b7xsv\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:10 crc kubenswrapper[4733]: W1206 05:59:10.334597 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f01ee5d_d8b9_4464_9ec9_ba71b0080b97.slice/crio-c6646f2d31c535229b214f68244561a2173880fa854f54ee8be544bac0b1b5a1 WatchSource:0}: Error finding container c6646f2d31c535229b214f68244561a2173880fa854f54ee8be544bac0b1b5a1: Status 404 returned error can't find the container with id c6646f2d31c535229b214f68244561a2173880fa854f54ee8be544bac0b1b5a1 Dec 06 05:59:10 crc kubenswrapper[4733]: I1206 05:59:10.337120 4733 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pp9vb"] Dec 06 05:59:10 crc kubenswrapper[4733]: I1206 05:59:10.407253 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-f762-account-create-update-phdmt"] Dec 06 05:59:10 crc kubenswrapper[4733]: I1206 05:59:10.420960 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-z9w82"] Dec 06 05:59:10 crc kubenswrapper[4733]: W1206 05:59:10.422517 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8894ddef_6b07_452a_83fd_5d6383f280c2.slice/crio-5caad073160ab0ff4ff9a59eccc19f77d0711a36cd745afacf2aa6cd95fb1cd5 WatchSource:0}: Error finding container 5caad073160ab0ff4ff9a59eccc19f77d0711a36cd745afacf2aa6cd95fb1cd5: Status 404 returned error can't find the container with id 5caad073160ab0ff4ff9a59eccc19f77d0711a36cd745afacf2aa6cd95fb1cd5 Dec 06 05:59:10 crc kubenswrapper[4733]: I1206 05:59:10.427875 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-8hnm2"] Dec 06 05:59:10 crc kubenswrapper[4733]: I1206 05:59:10.441376 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-7gpvs"] Dec 06 05:59:10 crc kubenswrapper[4733]: W1206 05:59:10.447165 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddab6d968_0d66_4b87_898b_d17fb0741af2.slice/crio-13b50774f655b0ee8ab177c50e6d0fc69358a05b615852b1802e9254be416e17 WatchSource:0}: Error finding container 13b50774f655b0ee8ab177c50e6d0fc69358a05b615852b1802e9254be416e17: Status 404 returned error can't find the container with id 13b50774f655b0ee8ab177c50e6d0fc69358a05b615852b1802e9254be416e17 Dec 06 05:59:10 crc kubenswrapper[4733]: W1206 05:59:10.447974 4733 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1390ae4e_536b_4861_8591_fd8656976c12.slice/crio-ca4a3f7a36b5663c6e5abe8b128e2b47f11f9335f18e7330e88a760c55af9a98 WatchSource:0}: Error finding container ca4a3f7a36b5663c6e5abe8b128e2b47f11f9335f18e7330e88a760c55af9a98: Status 404 returned error can't find the container with id ca4a3f7a36b5663c6e5abe8b128e2b47f11f9335f18e7330e88a760c55af9a98 Dec 06 05:59:10 crc kubenswrapper[4733]: I1206 05:59:10.450234 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-781a-account-create-update-9g84w"] Dec 06 05:59:10 crc kubenswrapper[4733]: I1206 05:59:10.456182 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-73db-account-create-update-ft7zq"] Dec 06 05:59:10 crc kubenswrapper[4733]: I1206 05:59:10.460218 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-vrwfc"] Dec 06 05:59:10 crc kubenswrapper[4733]: I1206 05:59:10.861813 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-2ztw7-config-f97vt"] Dec 06 05:59:10 crc kubenswrapper[4733]: I1206 05:59:10.869569 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-2ztw7-config-f97vt"] Dec 06 05:59:10 crc kubenswrapper[4733]: I1206 05:59:10.993376 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67fb8b8965-bmdkf" Dec 06 05:59:10 crc kubenswrapper[4733]: I1206 05:59:10.995427 4733 generic.go:334] "Generic (PLEG): container finished" podID="0c64b364-21c9-4fd9-a392-18b9ea6661fb" containerID="fd8358502a2ecea7390a3030de99a87d178869de8721351327526783c42c28ad" exitCode=0 Dec 06 05:59:10 crc kubenswrapper[4733]: I1206 05:59:10.995584 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-8hnm2" event={"ID":"0c64b364-21c9-4fd9-a392-18b9ea6661fb","Type":"ContainerDied","Data":"fd8358502a2ecea7390a3030de99a87d178869de8721351327526783c42c28ad"} Dec 06 05:59:10 crc kubenswrapper[4733]: I1206 05:59:10.995624 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-8hnm2" event={"ID":"0c64b364-21c9-4fd9-a392-18b9ea6661fb","Type":"ContainerStarted","Data":"37896f8b739685257693d18e671eef57f6b8e581f6853a57e33608d8ebee06ab"} Dec 06 05:59:11 crc kubenswrapper[4733]: I1206 05:59:11.000522 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-2ztw7-config-7mnlw"] Dec 06 05:59:11 crc kubenswrapper[4733]: E1206 05:59:11.001553 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e99bfc04-55e0-4ff7-be22-22c6e9cc8100" containerName="init" Dec 06 05:59:11 crc kubenswrapper[4733]: I1206 05:59:11.001579 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="e99bfc04-55e0-4ff7-be22-22c6e9cc8100" containerName="init" Dec 06 05:59:11 crc kubenswrapper[4733]: E1206 05:59:11.001626 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e99bfc04-55e0-4ff7-be22-22c6e9cc8100" containerName="dnsmasq-dns" Dec 06 05:59:11 crc kubenswrapper[4733]: I1206 05:59:11.001635 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="e99bfc04-55e0-4ff7-be22-22c6e9cc8100" containerName="dnsmasq-dns" Dec 06 05:59:11 crc kubenswrapper[4733]: E1206 05:59:11.001660 4733 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="b4f98249-4c60-4cce-bfbd-e517303feee5" containerName="ovn-config" Dec 06 05:59:11 crc kubenswrapper[4733]: I1206 05:59:11.001666 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4f98249-4c60-4cce-bfbd-e517303feee5" containerName="ovn-config" Dec 06 05:59:11 crc kubenswrapper[4733]: I1206 05:59:11.002338 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="e99bfc04-55e0-4ff7-be22-22c6e9cc8100" containerName="dnsmasq-dns" Dec 06 05:59:11 crc kubenswrapper[4733]: I1206 05:59:11.002383 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4f98249-4c60-4cce-bfbd-e517303feee5" containerName="ovn-config" Dec 06 05:59:11 crc kubenswrapper[4733]: I1206 05:59:11.001809 4733 generic.go:334] "Generic (PLEG): container finished" podID="31a0a2a8-5536-47b7-9f7c-2eef7e453214" containerID="1892bac900c092ddf75c756cd581f5c719f1c7dfff71b720507688405fdd4c4d" exitCode=0 Dec 06 05:59:11 crc kubenswrapper[4733]: I1206 05:59:11.003331 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-f762-account-create-update-phdmt" event={"ID":"31a0a2a8-5536-47b7-9f7c-2eef7e453214","Type":"ContainerDied","Data":"1892bac900c092ddf75c756cd581f5c719f1c7dfff71b720507688405fdd4c4d"} Dec 06 05:59:11 crc kubenswrapper[4733]: I1206 05:59:11.003372 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-f762-account-create-update-phdmt" event={"ID":"31a0a2a8-5536-47b7-9f7c-2eef7e453214","Type":"ContainerStarted","Data":"e80d211fa7dbb32ec0717a3571700e479b18f6a842594f3f8e93c046de335fc4"} Dec 06 05:59:11 crc kubenswrapper[4733]: I1206 05:59:11.003567 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-2ztw7-config-7mnlw" Dec 06 05:59:11 crc kubenswrapper[4733]: I1206 05:59:11.005646 4733 generic.go:334] "Generic (PLEG): container finished" podID="8f01ee5d-d8b9-4464-9ec9-ba71b0080b97" containerID="af7ac0c227129e38a0e24fe43b27dfe99d1fd504450d31b46a60e371651c6c95" exitCode=0 Dec 06 05:59:11 crc kubenswrapper[4733]: I1206 05:59:11.005687 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 06 05:59:11 crc kubenswrapper[4733]: I1206 05:59:11.005733 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pp9vb" event={"ID":"8f01ee5d-d8b9-4464-9ec9-ba71b0080b97","Type":"ContainerDied","Data":"af7ac0c227129e38a0e24fe43b27dfe99d1fd504450d31b46a60e371651c6c95"} Dec 06 05:59:11 crc kubenswrapper[4733]: I1206 05:59:11.005763 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pp9vb" event={"ID":"8f01ee5d-d8b9-4464-9ec9-ba71b0080b97","Type":"ContainerStarted","Data":"c6646f2d31c535229b214f68244561a2173880fa854f54ee8be544bac0b1b5a1"} Dec 06 05:59:11 crc kubenswrapper[4733]: I1206 05:59:11.010451 4733 generic.go:334] "Generic (PLEG): container finished" podID="1aaa5a5a-871e-4022-9fea-20d5541424bf" containerID="19206309fec04412ba761c6c207a159b6dc569bf853150b4cec2ce4759bb33da" exitCode=0 Dec 06 05:59:11 crc kubenswrapper[4733]: I1206 05:59:11.010582 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-7gpvs" event={"ID":"1aaa5a5a-871e-4022-9fea-20d5541424bf","Type":"ContainerDied","Data":"19206309fec04412ba761c6c207a159b6dc569bf853150b4cec2ce4759bb33da"} Dec 06 05:59:11 crc kubenswrapper[4733]: I1206 05:59:11.010616 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-7gpvs" 
event={"ID":"1aaa5a5a-871e-4022-9fea-20d5541424bf","Type":"ContainerStarted","Data":"c87285fc1c347f1ef7bae26eb761d12001131cb1bc4725b1c3a326c7a9806d7a"} Dec 06 05:59:11 crc kubenswrapper[4733]: I1206 05:59:11.011989 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-73db-account-create-update-ft7zq" event={"ID":"dab6d968-0d66-4b87-898b-d17fb0741af2","Type":"ContainerDied","Data":"f5bf65b4769e24ee50da36f59cce71691a86fb0235331637b7918c385a4b3d03"} Dec 06 05:59:11 crc kubenswrapper[4733]: I1206 05:59:11.012098 4733 generic.go:334] "Generic (PLEG): container finished" podID="dab6d968-0d66-4b87-898b-d17fb0741af2" containerID="f5bf65b4769e24ee50da36f59cce71691a86fb0235331637b7918c385a4b3d03" exitCode=0 Dec 06 05:59:11 crc kubenswrapper[4733]: I1206 05:59:11.012172 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-73db-account-create-update-ft7zq" event={"ID":"dab6d968-0d66-4b87-898b-d17fb0741af2","Type":"ContainerStarted","Data":"13b50774f655b0ee8ab177c50e6d0fc69358a05b615852b1802e9254be416e17"} Dec 06 05:59:11 crc kubenswrapper[4733]: I1206 05:59:11.013127 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-vrwfc" event={"ID":"1390ae4e-536b-4861-8591-fd8656976c12","Type":"ContainerStarted","Data":"ca4a3f7a36b5663c6e5abe8b128e2b47f11f9335f18e7330e88a760c55af9a98"} Dec 06 05:59:11 crc kubenswrapper[4733]: I1206 05:59:11.014521 4733 generic.go:334] "Generic (PLEG): container finished" podID="8894ddef-6b07-452a-83fd-5d6383f280c2" containerID="2ce800bc38e2b250f951b941ae6274be8ddb46bea923a9310929d86a39df6ed7" exitCode=0 Dec 06 05:59:11 crc kubenswrapper[4733]: I1206 05:59:11.014583 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-z9w82" event={"ID":"8894ddef-6b07-452a-83fd-5d6383f280c2","Type":"ContainerDied","Data":"2ce800bc38e2b250f951b941ae6274be8ddb46bea923a9310929d86a39df6ed7"} Dec 06 05:59:11 crc kubenswrapper[4733]: I1206 05:59:11.014601 4733 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-z9w82" event={"ID":"8894ddef-6b07-452a-83fd-5d6383f280c2","Type":"ContainerStarted","Data":"5caad073160ab0ff4ff9a59eccc19f77d0711a36cd745afacf2aa6cd95fb1cd5"} Dec 06 05:59:11 crc kubenswrapper[4733]: I1206 05:59:11.015651 4733 generic.go:334] "Generic (PLEG): container finished" podID="d928009b-3db5-492c-8d4e-375ca54b6f8b" containerID="4afeeabadadf2184b6fa6d39c3633532a3b3159105f408e610e36b8c04a6e386" exitCode=0 Dec 06 05:59:11 crc kubenswrapper[4733]: I1206 05:59:11.015680 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-781a-account-create-update-9g84w" event={"ID":"d928009b-3db5-492c-8d4e-375ca54b6f8b","Type":"ContainerDied","Data":"4afeeabadadf2184b6fa6d39c3633532a3b3159105f408e610e36b8c04a6e386"} Dec 06 05:59:11 crc kubenswrapper[4733]: I1206 05:59:11.015695 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-781a-account-create-update-9g84w" event={"ID":"d928009b-3db5-492c-8d4e-375ca54b6f8b","Type":"ContainerStarted","Data":"b1c7189e82858d0049239adddfe0eafefdea6de268737f17e474e5467c882399"} Dec 06 05:59:11 crc kubenswrapper[4733]: I1206 05:59:11.029345 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-2ztw7-config-7mnlw"] Dec 06 05:59:11 crc kubenswrapper[4733]: I1206 05:59:11.124589 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/15fc6df2-423e-4ee7-a781-498c4d4cba96-var-log-ovn\") pod \"ovn-controller-2ztw7-config-7mnlw\" (UID: \"15fc6df2-423e-4ee7-a781-498c4d4cba96\") " pod="openstack/ovn-controller-2ztw7-config-7mnlw" Dec 06 05:59:11 crc kubenswrapper[4733]: I1206 05:59:11.124657 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sq6js\" (UniqueName: 
\"kubernetes.io/projected/15fc6df2-423e-4ee7-a781-498c4d4cba96-kube-api-access-sq6js\") pod \"ovn-controller-2ztw7-config-7mnlw\" (UID: \"15fc6df2-423e-4ee7-a781-498c4d4cba96\") " pod="openstack/ovn-controller-2ztw7-config-7mnlw" Dec 06 05:59:11 crc kubenswrapper[4733]: I1206 05:59:11.124686 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/15fc6df2-423e-4ee7-a781-498c4d4cba96-var-run-ovn\") pod \"ovn-controller-2ztw7-config-7mnlw\" (UID: \"15fc6df2-423e-4ee7-a781-498c4d4cba96\") " pod="openstack/ovn-controller-2ztw7-config-7mnlw" Dec 06 05:59:11 crc kubenswrapper[4733]: I1206 05:59:11.124724 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/15fc6df2-423e-4ee7-a781-498c4d4cba96-scripts\") pod \"ovn-controller-2ztw7-config-7mnlw\" (UID: \"15fc6df2-423e-4ee7-a781-498c4d4cba96\") " pod="openstack/ovn-controller-2ztw7-config-7mnlw" Dec 06 05:59:11 crc kubenswrapper[4733]: I1206 05:59:11.124784 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/15fc6df2-423e-4ee7-a781-498c4d4cba96-var-run\") pod \"ovn-controller-2ztw7-config-7mnlw\" (UID: \"15fc6df2-423e-4ee7-a781-498c4d4cba96\") " pod="openstack/ovn-controller-2ztw7-config-7mnlw" Dec 06 05:59:11 crc kubenswrapper[4733]: I1206 05:59:11.124818 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/15fc6df2-423e-4ee7-a781-498c4d4cba96-additional-scripts\") pod \"ovn-controller-2ztw7-config-7mnlw\" (UID: \"15fc6df2-423e-4ee7-a781-498c4d4cba96\") " pod="openstack/ovn-controller-2ztw7-config-7mnlw" Dec 06 05:59:11 crc kubenswrapper[4733]: I1206 05:59:11.227148 4733 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/15fc6df2-423e-4ee7-a781-498c4d4cba96-var-log-ovn\") pod \"ovn-controller-2ztw7-config-7mnlw\" (UID: \"15fc6df2-423e-4ee7-a781-498c4d4cba96\") " pod="openstack/ovn-controller-2ztw7-config-7mnlw" Dec 06 05:59:11 crc kubenswrapper[4733]: I1206 05:59:11.227208 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sq6js\" (UniqueName: \"kubernetes.io/projected/15fc6df2-423e-4ee7-a781-498c4d4cba96-kube-api-access-sq6js\") pod \"ovn-controller-2ztw7-config-7mnlw\" (UID: \"15fc6df2-423e-4ee7-a781-498c4d4cba96\") " pod="openstack/ovn-controller-2ztw7-config-7mnlw" Dec 06 05:59:11 crc kubenswrapper[4733]: I1206 05:59:11.227249 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/15fc6df2-423e-4ee7-a781-498c4d4cba96-var-run-ovn\") pod \"ovn-controller-2ztw7-config-7mnlw\" (UID: \"15fc6df2-423e-4ee7-a781-498c4d4cba96\") " pod="openstack/ovn-controller-2ztw7-config-7mnlw" Dec 06 05:59:11 crc kubenswrapper[4733]: I1206 05:59:11.227282 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/15fc6df2-423e-4ee7-a781-498c4d4cba96-scripts\") pod \"ovn-controller-2ztw7-config-7mnlw\" (UID: \"15fc6df2-423e-4ee7-a781-498c4d4cba96\") " pod="openstack/ovn-controller-2ztw7-config-7mnlw" Dec 06 05:59:11 crc kubenswrapper[4733]: I1206 05:59:11.227425 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/15fc6df2-423e-4ee7-a781-498c4d4cba96-var-run\") pod \"ovn-controller-2ztw7-config-7mnlw\" (UID: \"15fc6df2-423e-4ee7-a781-498c4d4cba96\") " pod="openstack/ovn-controller-2ztw7-config-7mnlw" Dec 06 05:59:11 crc kubenswrapper[4733]: I1206 05:59:11.227456 4733 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/15fc6df2-423e-4ee7-a781-498c4d4cba96-additional-scripts\") pod \"ovn-controller-2ztw7-config-7mnlw\" (UID: \"15fc6df2-423e-4ee7-a781-498c4d4cba96\") " pod="openstack/ovn-controller-2ztw7-config-7mnlw" Dec 06 05:59:11 crc kubenswrapper[4733]: I1206 05:59:11.228136 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/15fc6df2-423e-4ee7-a781-498c4d4cba96-additional-scripts\") pod \"ovn-controller-2ztw7-config-7mnlw\" (UID: \"15fc6df2-423e-4ee7-a781-498c4d4cba96\") " pod="openstack/ovn-controller-2ztw7-config-7mnlw" Dec 06 05:59:11 crc kubenswrapper[4733]: I1206 05:59:11.228346 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/15fc6df2-423e-4ee7-a781-498c4d4cba96-var-log-ovn\") pod \"ovn-controller-2ztw7-config-7mnlw\" (UID: \"15fc6df2-423e-4ee7-a781-498c4d4cba96\") " pod="openstack/ovn-controller-2ztw7-config-7mnlw" Dec 06 05:59:11 crc kubenswrapper[4733]: I1206 05:59:11.232458 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/15fc6df2-423e-4ee7-a781-498c4d4cba96-var-run\") pod \"ovn-controller-2ztw7-config-7mnlw\" (UID: \"15fc6df2-423e-4ee7-a781-498c4d4cba96\") " pod="openstack/ovn-controller-2ztw7-config-7mnlw" Dec 06 05:59:11 crc kubenswrapper[4733]: I1206 05:59:11.232636 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/15fc6df2-423e-4ee7-a781-498c4d4cba96-var-run-ovn\") pod \"ovn-controller-2ztw7-config-7mnlw\" (UID: \"15fc6df2-423e-4ee7-a781-498c4d4cba96\") " pod="openstack/ovn-controller-2ztw7-config-7mnlw" Dec 06 05:59:11 crc kubenswrapper[4733]: I1206 05:59:11.234287 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/15fc6df2-423e-4ee7-a781-498c4d4cba96-scripts\") pod \"ovn-controller-2ztw7-config-7mnlw\" (UID: \"15fc6df2-423e-4ee7-a781-498c4d4cba96\") " pod="openstack/ovn-controller-2ztw7-config-7mnlw" Dec 06 05:59:11 crc kubenswrapper[4733]: I1206 05:59:11.263534 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67fb8b8965-bmdkf"] Dec 06 05:59:11 crc kubenswrapper[4733]: I1206 05:59:11.264617 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sq6js\" (UniqueName: \"kubernetes.io/projected/15fc6df2-423e-4ee7-a781-498c4d4cba96-kube-api-access-sq6js\") pod \"ovn-controller-2ztw7-config-7mnlw\" (UID: \"15fc6df2-423e-4ee7-a781-498c4d4cba96\") " pod="openstack/ovn-controller-2ztw7-config-7mnlw" Dec 06 05:59:11 crc kubenswrapper[4733]: I1206 05:59:11.274023 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67fb8b8965-bmdkf"] Dec 06 05:59:11 crc kubenswrapper[4733]: I1206 05:59:11.448921 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-2ztw7-config-7mnlw" Dec 06 05:59:11 crc kubenswrapper[4733]: I1206 05:59:11.616952 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nqsf6"] Dec 06 05:59:11 crc kubenswrapper[4733]: I1206 05:59:11.622036 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nqsf6"] Dec 06 05:59:11 crc kubenswrapper[4733]: I1206 05:59:11.622192 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nqsf6" Dec 06 05:59:11 crc kubenswrapper[4733]: I1206 05:59:11.737657 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8de7ee8b-f6b5-4014-a78f-bad2e92ddd2d-catalog-content\") pod \"redhat-operators-nqsf6\" (UID: \"8de7ee8b-f6b5-4014-a78f-bad2e92ddd2d\") " pod="openshift-marketplace/redhat-operators-nqsf6" Dec 06 05:59:11 crc kubenswrapper[4733]: I1206 05:59:11.737994 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8de7ee8b-f6b5-4014-a78f-bad2e92ddd2d-utilities\") pod \"redhat-operators-nqsf6\" (UID: \"8de7ee8b-f6b5-4014-a78f-bad2e92ddd2d\") " pod="openshift-marketplace/redhat-operators-nqsf6" Dec 06 05:59:11 crc kubenswrapper[4733]: I1206 05:59:11.738119 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cv766\" (UniqueName: \"kubernetes.io/projected/8de7ee8b-f6b5-4014-a78f-bad2e92ddd2d-kube-api-access-cv766\") pod \"redhat-operators-nqsf6\" (UID: \"8de7ee8b-f6b5-4014-a78f-bad2e92ddd2d\") " pod="openshift-marketplace/redhat-operators-nqsf6" Dec 06 05:59:11 crc kubenswrapper[4733]: I1206 05:59:11.840649 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8de7ee8b-f6b5-4014-a78f-bad2e92ddd2d-catalog-content\") pod \"redhat-operators-nqsf6\" (UID: \"8de7ee8b-f6b5-4014-a78f-bad2e92ddd2d\") " pod="openshift-marketplace/redhat-operators-nqsf6" Dec 06 05:59:11 crc kubenswrapper[4733]: I1206 05:59:11.840744 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8de7ee8b-f6b5-4014-a78f-bad2e92ddd2d-utilities\") pod \"redhat-operators-nqsf6\" (UID: 
\"8de7ee8b-f6b5-4014-a78f-bad2e92ddd2d\") " pod="openshift-marketplace/redhat-operators-nqsf6" Dec 06 05:59:11 crc kubenswrapper[4733]: I1206 05:59:11.840900 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cv766\" (UniqueName: \"kubernetes.io/projected/8de7ee8b-f6b5-4014-a78f-bad2e92ddd2d-kube-api-access-cv766\") pod \"redhat-operators-nqsf6\" (UID: \"8de7ee8b-f6b5-4014-a78f-bad2e92ddd2d\") " pod="openshift-marketplace/redhat-operators-nqsf6" Dec 06 05:59:11 crc kubenswrapper[4733]: I1206 05:59:11.841100 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8de7ee8b-f6b5-4014-a78f-bad2e92ddd2d-catalog-content\") pod \"redhat-operators-nqsf6\" (UID: \"8de7ee8b-f6b5-4014-a78f-bad2e92ddd2d\") " pod="openshift-marketplace/redhat-operators-nqsf6" Dec 06 05:59:11 crc kubenswrapper[4733]: I1206 05:59:11.841297 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8de7ee8b-f6b5-4014-a78f-bad2e92ddd2d-utilities\") pod \"redhat-operators-nqsf6\" (UID: \"8de7ee8b-f6b5-4014-a78f-bad2e92ddd2d\") " pod="openshift-marketplace/redhat-operators-nqsf6" Dec 06 05:59:11 crc kubenswrapper[4733]: I1206 05:59:11.857576 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cv766\" (UniqueName: \"kubernetes.io/projected/8de7ee8b-f6b5-4014-a78f-bad2e92ddd2d-kube-api-access-cv766\") pod \"redhat-operators-nqsf6\" (UID: \"8de7ee8b-f6b5-4014-a78f-bad2e92ddd2d\") " pod="openshift-marketplace/redhat-operators-nqsf6" Dec 06 05:59:11 crc kubenswrapper[4733]: I1206 05:59:11.888611 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-2ztw7-config-7mnlw"] Dec 06 05:59:11 crc kubenswrapper[4733]: I1206 05:59:11.943684 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nqsf6" Dec 06 05:59:12 crc kubenswrapper[4733]: I1206 05:59:12.029635 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pp9vb" event={"ID":"8f01ee5d-d8b9-4464-9ec9-ba71b0080b97","Type":"ContainerStarted","Data":"0d8efbd5743d628845032956c952c612f031e503cfc61df4d4374802c983529d"} Dec 06 05:59:12 crc kubenswrapper[4733]: I1206 05:59:12.034483 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-2ztw7-config-7mnlw" event={"ID":"15fc6df2-423e-4ee7-a781-498c4d4cba96","Type":"ContainerStarted","Data":"98ae8e715db5315e7e43038ab6a98487e62216d197286edeb8ed4893ec67da82"} Dec 06 05:59:12 crc kubenswrapper[4733]: I1206 05:59:12.500828 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4f98249-4c60-4cce-bfbd-e517303feee5" path="/var/lib/kubelet/pods/b4f98249-4c60-4cce-bfbd-e517303feee5/volumes" Dec 06 05:59:12 crc kubenswrapper[4733]: I1206 05:59:12.502189 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e99bfc04-55e0-4ff7-be22-22c6e9cc8100" path="/var/lib/kubelet/pods/e99bfc04-55e0-4ff7-be22-22c6e9cc8100/volumes" Dec 06 05:59:12 crc kubenswrapper[4733]: I1206 05:59:12.704494 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nqsf6"] Dec 06 05:59:12 crc kubenswrapper[4733]: W1206 05:59:12.719541 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8de7ee8b_f6b5_4014_a78f_bad2e92ddd2d.slice/crio-042a2410bc91b79069c6295a93f2441cd2a7521ae881ac085b12fa139d691a2c WatchSource:0}: Error finding container 042a2410bc91b79069c6295a93f2441cd2a7521ae881ac085b12fa139d691a2c: Status 404 returned error can't find the container with id 042a2410bc91b79069c6295a93f2441cd2a7521ae881ac085b12fa139d691a2c Dec 06 05:59:12 crc kubenswrapper[4733]: I1206 05:59:12.996871 4733 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-f762-account-create-update-phdmt" Dec 06 05:59:13 crc kubenswrapper[4733]: I1206 05:59:13.005293 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-8hnm2" Dec 06 05:59:13 crc kubenswrapper[4733]: I1206 05:59:13.018390 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-73db-account-create-update-ft7zq" Dec 06 05:59:13 crc kubenswrapper[4733]: I1206 05:59:13.031684 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-z9w82" Dec 06 05:59:13 crc kubenswrapper[4733]: I1206 05:59:13.047165 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-7gpvs" Dec 06 05:59:13 crc kubenswrapper[4733]: I1206 05:59:13.049112 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-781a-account-create-update-9g84w" Dec 06 05:59:13 crc kubenswrapper[4733]: I1206 05:59:13.072003 4733 generic.go:334] "Generic (PLEG): container finished" podID="8f01ee5d-d8b9-4464-9ec9-ba71b0080b97" containerID="0d8efbd5743d628845032956c952c612f031e503cfc61df4d4374802c983529d" exitCode=0 Dec 06 05:59:13 crc kubenswrapper[4733]: I1206 05:59:13.072151 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pp9vb" event={"ID":"8f01ee5d-d8b9-4464-9ec9-ba71b0080b97","Type":"ContainerDied","Data":"0d8efbd5743d628845032956c952c612f031e503cfc61df4d4374802c983529d"} Dec 06 05:59:13 crc kubenswrapper[4733]: I1206 05:59:13.080323 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-73db-account-create-update-ft7zq" event={"ID":"dab6d968-0d66-4b87-898b-d17fb0741af2","Type":"ContainerDied","Data":"13b50774f655b0ee8ab177c50e6d0fc69358a05b615852b1802e9254be416e17"} Dec 06 05:59:13 crc kubenswrapper[4733]: I1206 05:59:13.080356 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="13b50774f655b0ee8ab177c50e6d0fc69358a05b615852b1802e9254be416e17" Dec 06 05:59:13 crc kubenswrapper[4733]: I1206 05:59:13.080428 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-73db-account-create-update-ft7zq" Dec 06 05:59:13 crc kubenswrapper[4733]: I1206 05:59:13.089003 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-781a-account-create-update-9g84w" event={"ID":"d928009b-3db5-492c-8d4e-375ca54b6f8b","Type":"ContainerDied","Data":"b1c7189e82858d0049239adddfe0eafefdea6de268737f17e474e5467c882399"} Dec 06 05:59:13 crc kubenswrapper[4733]: I1206 05:59:13.089043 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1c7189e82858d0049239adddfe0eafefdea6de268737f17e474e5467c882399" Dec 06 05:59:13 crc kubenswrapper[4733]: I1206 05:59:13.089127 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-781a-account-create-update-9g84w" Dec 06 05:59:13 crc kubenswrapper[4733]: I1206 05:59:13.123173 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-8hnm2" event={"ID":"0c64b364-21c9-4fd9-a392-18b9ea6661fb","Type":"ContainerDied","Data":"37896f8b739685257693d18e671eef57f6b8e581f6853a57e33608d8ebee06ab"} Dec 06 05:59:13 crc kubenswrapper[4733]: I1206 05:59:13.123226 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37896f8b739685257693d18e671eef57f6b8e581f6853a57e33608d8ebee06ab" Dec 06 05:59:13 crc kubenswrapper[4733]: I1206 05:59:13.124180 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-8hnm2" Dec 06 05:59:13 crc kubenswrapper[4733]: I1206 05:59:13.135319 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nqsf6" event={"ID":"8de7ee8b-f6b5-4014-a78f-bad2e92ddd2d","Type":"ContainerStarted","Data":"042a2410bc91b79069c6295a93f2441cd2a7521ae881ac085b12fa139d691a2c"} Dec 06 05:59:13 crc kubenswrapper[4733]: I1206 05:59:13.140585 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-z9w82" event={"ID":"8894ddef-6b07-452a-83fd-5d6383f280c2","Type":"ContainerDied","Data":"5caad073160ab0ff4ff9a59eccc19f77d0711a36cd745afacf2aa6cd95fb1cd5"} Dec 06 05:59:13 crc kubenswrapper[4733]: I1206 05:59:13.140643 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5caad073160ab0ff4ff9a59eccc19f77d0711a36cd745afacf2aa6cd95fb1cd5" Dec 06 05:59:13 crc kubenswrapper[4733]: I1206 05:59:13.140756 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-z9w82" Dec 06 05:59:13 crc kubenswrapper[4733]: I1206 05:59:13.143262 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-7gpvs"
Dec 06 05:59:13 crc kubenswrapper[4733]: I1206 05:59:13.143271 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-7gpvs" event={"ID":"1aaa5a5a-871e-4022-9fea-20d5541424bf","Type":"ContainerDied","Data":"c87285fc1c347f1ef7bae26eb761d12001131cb1bc4725b1c3a326c7a9806d7a"}
Dec 06 05:59:13 crc kubenswrapper[4733]: I1206 05:59:13.143481 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c87285fc1c347f1ef7bae26eb761d12001131cb1bc4725b1c3a326c7a9806d7a"
Dec 06 05:59:13 crc kubenswrapper[4733]: I1206 05:59:13.144787 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-2ztw7-config-7mnlw" event={"ID":"15fc6df2-423e-4ee7-a781-498c4d4cba96","Type":"ContainerStarted","Data":"9fa56c693dc430d0748d3ef5f09bf1591d19ecf2d60884842978547bd6d8fe4d"}
Dec 06 05:59:13 crc kubenswrapper[4733]: I1206 05:59:13.148177 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-f762-account-create-update-phdmt" event={"ID":"31a0a2a8-5536-47b7-9f7c-2eef7e453214","Type":"ContainerDied","Data":"e80d211fa7dbb32ec0717a3571700e479b18f6a842594f3f8e93c046de335fc4"}
Dec 06 05:59:13 crc kubenswrapper[4733]: I1206 05:59:13.148203 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e80d211fa7dbb32ec0717a3571700e479b18f6a842594f3f8e93c046de335fc4"
Dec 06 05:59:13 crc kubenswrapper[4733]: I1206 05:59:13.148354 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-f762-account-create-update-phdmt"
Dec 06 05:59:13 crc kubenswrapper[4733]: I1206 05:59:13.168395 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8894ddef-6b07-452a-83fd-5d6383f280c2-operator-scripts\") pod \"8894ddef-6b07-452a-83fd-5d6383f280c2\" (UID: \"8894ddef-6b07-452a-83fd-5d6383f280c2\") "
Dec 06 05:59:13 crc kubenswrapper[4733]: I1206 05:59:13.169201 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dab6d968-0d66-4b87-898b-d17fb0741af2-operator-scripts\") pod \"dab6d968-0d66-4b87-898b-d17fb0741af2\" (UID: \"dab6d968-0d66-4b87-898b-d17fb0741af2\") "
Dec 06 05:59:13 crc kubenswrapper[4733]: I1206 05:59:13.176153 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c64b364-21c9-4fd9-a392-18b9ea6661fb-operator-scripts\") pod \"0c64b364-21c9-4fd9-a392-18b9ea6661fb\" (UID: \"0c64b364-21c9-4fd9-a392-18b9ea6661fb\") "
Dec 06 05:59:13 crc kubenswrapper[4733]: I1206 05:59:13.177751 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d2tq\" (UniqueName: \"kubernetes.io/projected/d928009b-3db5-492c-8d4e-375ca54b6f8b-kube-api-access-4d2tq\") pod \"d928009b-3db5-492c-8d4e-375ca54b6f8b\" (UID: \"d928009b-3db5-492c-8d4e-375ca54b6f8b\") "
Dec 06 05:59:13 crc kubenswrapper[4733]: I1206 05:59:13.178001 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1aaa5a5a-871e-4022-9fea-20d5541424bf-operator-scripts\") pod \"1aaa5a5a-871e-4022-9fea-20d5541424bf\" (UID: \"1aaa5a5a-871e-4022-9fea-20d5541424bf\") "
Dec 06 05:59:13 crc kubenswrapper[4733]: I1206 05:59:13.169368 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8894ddef-6b07-452a-83fd-5d6383f280c2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8894ddef-6b07-452a-83fd-5d6383f280c2" (UID: "8894ddef-6b07-452a-83fd-5d6383f280c2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 05:59:13 crc kubenswrapper[4733]: I1206 05:59:13.176076 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dab6d968-0d66-4b87-898b-d17fb0741af2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dab6d968-0d66-4b87-898b-d17fb0741af2" (UID: "dab6d968-0d66-4b87-898b-d17fb0741af2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 05:59:13 crc kubenswrapper[4733]: I1206 05:59:13.177063 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c64b364-21c9-4fd9-a392-18b9ea6661fb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0c64b364-21c9-4fd9-a392-18b9ea6661fb" (UID: "0c64b364-21c9-4fd9-a392-18b9ea6661fb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 05:59:13 crc kubenswrapper[4733]: I1206 05:59:13.179035 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1aaa5a5a-871e-4022-9fea-20d5541424bf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1aaa5a5a-871e-4022-9fea-20d5541424bf" (UID: "1aaa5a5a-871e-4022-9fea-20d5541424bf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 05:59:13 crc kubenswrapper[4733]: I1206 05:59:13.179697 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbs6z\" (UniqueName: \"kubernetes.io/projected/1aaa5a5a-871e-4022-9fea-20d5541424bf-kube-api-access-bbs6z\") pod \"1aaa5a5a-871e-4022-9fea-20d5541424bf\" (UID: \"1aaa5a5a-871e-4022-9fea-20d5541424bf\") "
Dec 06 05:59:13 crc kubenswrapper[4733]: I1206 05:59:13.179861 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8v6x\" (UniqueName: \"kubernetes.io/projected/8894ddef-6b07-452a-83fd-5d6383f280c2-kube-api-access-w8v6x\") pod \"8894ddef-6b07-452a-83fd-5d6383f280c2\" (UID: \"8894ddef-6b07-452a-83fd-5d6383f280c2\") "
Dec 06 05:59:13 crc kubenswrapper[4733]: I1206 05:59:13.179892 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31a0a2a8-5536-47b7-9f7c-2eef7e453214-operator-scripts\") pod \"31a0a2a8-5536-47b7-9f7c-2eef7e453214\" (UID: \"31a0a2a8-5536-47b7-9f7c-2eef7e453214\") "
Dec 06 05:59:13 crc kubenswrapper[4733]: I1206 05:59:13.179998 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d928009b-3db5-492c-8d4e-375ca54b6f8b-operator-scripts\") pod \"d928009b-3db5-492c-8d4e-375ca54b6f8b\" (UID: \"d928009b-3db5-492c-8d4e-375ca54b6f8b\") "
Dec 06 05:59:13 crc kubenswrapper[4733]: I1206 05:59:13.180038 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24cjf\" (UniqueName: \"kubernetes.io/projected/0c64b364-21c9-4fd9-a392-18b9ea6661fb-kube-api-access-24cjf\") pod \"0c64b364-21c9-4fd9-a392-18b9ea6661fb\" (UID: \"0c64b364-21c9-4fd9-a392-18b9ea6661fb\") "
Dec 06 05:59:13 crc kubenswrapper[4733]: I1206 05:59:13.180100 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thwr6\" (UniqueName: \"kubernetes.io/projected/dab6d968-0d66-4b87-898b-d17fb0741af2-kube-api-access-thwr6\") pod \"dab6d968-0d66-4b87-898b-d17fb0741af2\" (UID: \"dab6d968-0d66-4b87-898b-d17fb0741af2\") "
Dec 06 05:59:13 crc kubenswrapper[4733]: I1206 05:59:13.180137 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6mfs\" (UniqueName: \"kubernetes.io/projected/31a0a2a8-5536-47b7-9f7c-2eef7e453214-kube-api-access-v6mfs\") pod \"31a0a2a8-5536-47b7-9f7c-2eef7e453214\" (UID: \"31a0a2a8-5536-47b7-9f7c-2eef7e453214\") "
Dec 06 05:59:13 crc kubenswrapper[4733]: I1206 05:59:13.181081 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d928009b-3db5-492c-8d4e-375ca54b6f8b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d928009b-3db5-492c-8d4e-375ca54b6f8b" (UID: "d928009b-3db5-492c-8d4e-375ca54b6f8b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 05:59:13 crc kubenswrapper[4733]: I1206 05:59:13.181526 4733 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1aaa5a5a-871e-4022-9fea-20d5541424bf-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 06 05:59:13 crc kubenswrapper[4733]: I1206 05:59:13.181547 4733 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d928009b-3db5-492c-8d4e-375ca54b6f8b-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 06 05:59:13 crc kubenswrapper[4733]: I1206 05:59:13.181559 4733 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8894ddef-6b07-452a-83fd-5d6383f280c2-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 06 05:59:13 crc kubenswrapper[4733]: I1206 05:59:13.181568 4733 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dab6d968-0d66-4b87-898b-d17fb0741af2-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 06 05:59:13 crc kubenswrapper[4733]: I1206 05:59:13.181577 4733 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c64b364-21c9-4fd9-a392-18b9ea6661fb-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 06 05:59:13 crc kubenswrapper[4733]: I1206 05:59:13.184606 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31a0a2a8-5536-47b7-9f7c-2eef7e453214-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "31a0a2a8-5536-47b7-9f7c-2eef7e453214" (UID: "31a0a2a8-5536-47b7-9f7c-2eef7e453214"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 05:59:13 crc kubenswrapper[4733]: I1206 05:59:13.185176 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c64b364-21c9-4fd9-a392-18b9ea6661fb-kube-api-access-24cjf" (OuterVolumeSpecName: "kube-api-access-24cjf") pod "0c64b364-21c9-4fd9-a392-18b9ea6661fb" (UID: "0c64b364-21c9-4fd9-a392-18b9ea6661fb"). InnerVolumeSpecName "kube-api-access-24cjf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 05:59:13 crc kubenswrapper[4733]: I1206 05:59:13.187380 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31a0a2a8-5536-47b7-9f7c-2eef7e453214-kube-api-access-v6mfs" (OuterVolumeSpecName: "kube-api-access-v6mfs") pod "31a0a2a8-5536-47b7-9f7c-2eef7e453214" (UID: "31a0a2a8-5536-47b7-9f7c-2eef7e453214"). InnerVolumeSpecName "kube-api-access-v6mfs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 05:59:13 crc kubenswrapper[4733]: I1206 05:59:13.187750 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dab6d968-0d66-4b87-898b-d17fb0741af2-kube-api-access-thwr6" (OuterVolumeSpecName: "kube-api-access-thwr6") pod "dab6d968-0d66-4b87-898b-d17fb0741af2" (UID: "dab6d968-0d66-4b87-898b-d17fb0741af2"). InnerVolumeSpecName "kube-api-access-thwr6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 05:59:13 crc kubenswrapper[4733]: I1206 05:59:13.190509 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8894ddef-6b07-452a-83fd-5d6383f280c2-kube-api-access-w8v6x" (OuterVolumeSpecName: "kube-api-access-w8v6x") pod "8894ddef-6b07-452a-83fd-5d6383f280c2" (UID: "8894ddef-6b07-452a-83fd-5d6383f280c2"). InnerVolumeSpecName "kube-api-access-w8v6x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 05:59:13 crc kubenswrapper[4733]: I1206 05:59:13.190654 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1aaa5a5a-871e-4022-9fea-20d5541424bf-kube-api-access-bbs6z" (OuterVolumeSpecName: "kube-api-access-bbs6z") pod "1aaa5a5a-871e-4022-9fea-20d5541424bf" (UID: "1aaa5a5a-871e-4022-9fea-20d5541424bf"). InnerVolumeSpecName "kube-api-access-bbs6z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 05:59:13 crc kubenswrapper[4733]: I1206 05:59:13.202426 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d928009b-3db5-492c-8d4e-375ca54b6f8b-kube-api-access-4d2tq" (OuterVolumeSpecName: "kube-api-access-4d2tq") pod "d928009b-3db5-492c-8d4e-375ca54b6f8b" (UID: "d928009b-3db5-492c-8d4e-375ca54b6f8b"). InnerVolumeSpecName "kube-api-access-4d2tq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 05:59:13 crc kubenswrapper[4733]: I1206 05:59:13.284375 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d2tq\" (UniqueName: \"kubernetes.io/projected/d928009b-3db5-492c-8d4e-375ca54b6f8b-kube-api-access-4d2tq\") on node \"crc\" DevicePath \"\""
Dec 06 05:59:13 crc kubenswrapper[4733]: I1206 05:59:13.284431 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbs6z\" (UniqueName: \"kubernetes.io/projected/1aaa5a5a-871e-4022-9fea-20d5541424bf-kube-api-access-bbs6z\") on node \"crc\" DevicePath \"\""
Dec 06 05:59:13 crc kubenswrapper[4733]: I1206 05:59:13.284443 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8v6x\" (UniqueName: \"kubernetes.io/projected/8894ddef-6b07-452a-83fd-5d6383f280c2-kube-api-access-w8v6x\") on node \"crc\" DevicePath \"\""
Dec 06 05:59:13 crc kubenswrapper[4733]: I1206 05:59:13.284456 4733 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31a0a2a8-5536-47b7-9f7c-2eef7e453214-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 06 05:59:13 crc kubenswrapper[4733]: I1206 05:59:13.284484 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24cjf\" (UniqueName: \"kubernetes.io/projected/0c64b364-21c9-4fd9-a392-18b9ea6661fb-kube-api-access-24cjf\") on node \"crc\" DevicePath \"\""
Dec 06 05:59:13 crc kubenswrapper[4733]: I1206 05:59:13.284521 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thwr6\" (UniqueName: \"kubernetes.io/projected/dab6d968-0d66-4b87-898b-d17fb0741af2-kube-api-access-thwr6\") on node \"crc\" DevicePath \"\""
Dec 06 05:59:13 crc kubenswrapper[4733]: I1206 05:59:13.284545 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6mfs\" (UniqueName: \"kubernetes.io/projected/31a0a2a8-5536-47b7-9f7c-2eef7e453214-kube-api-access-v6mfs\") on node \"crc\" DevicePath \"\""
Dec 06 05:59:14 crc kubenswrapper[4733]: I1206 05:59:14.156970 4733 generic.go:334] "Generic (PLEG): container finished" podID="15fc6df2-423e-4ee7-a781-498c4d4cba96" containerID="9fa56c693dc430d0748d3ef5f09bf1591d19ecf2d60884842978547bd6d8fe4d" exitCode=0
Dec 06 05:59:14 crc kubenswrapper[4733]: I1206 05:59:14.157028 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-2ztw7-config-7mnlw" event={"ID":"15fc6df2-423e-4ee7-a781-498c4d4cba96","Type":"ContainerDied","Data":"9fa56c693dc430d0748d3ef5f09bf1591d19ecf2d60884842978547bd6d8fe4d"}
Dec 06 05:59:14 crc kubenswrapper[4733]: I1206 05:59:14.159196 4733 generic.go:334] "Generic (PLEG): container finished" podID="8de7ee8b-f6b5-4014-a78f-bad2e92ddd2d" containerID="2db9451a4cc2af862f2b764579a655f227056d2413cb46a0f18424193874dcad" exitCode=0
Dec 06 05:59:14 crc kubenswrapper[4733]: I1206 05:59:14.159345 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nqsf6" event={"ID":"8de7ee8b-f6b5-4014-a78f-bad2e92ddd2d","Type":"ContainerDied","Data":"2db9451a4cc2af862f2b764579a655f227056d2413cb46a0f18424193874dcad"}
Dec 06 05:59:14 crc kubenswrapper[4733]: I1206 05:59:14.165762 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pp9vb" event={"ID":"8f01ee5d-d8b9-4464-9ec9-ba71b0080b97","Type":"ContainerStarted","Data":"aeaa6f6d07f8d0c1e39936caeaa2f95c41bb76e4817843142e0d8f3adcc8b830"}
Dec 06 05:59:14 crc kubenswrapper[4733]: I1206 05:59:14.192147 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-pp9vb" podStartSLOduration=7.660192412 podStartE2EDuration="10.192128457s" podCreationTimestamp="2025-12-06 05:59:04 +0000 UTC" firstStartedPulling="2025-12-06 05:59:11.011157652 +0000 UTC m=+934.876368763" lastFinishedPulling="2025-12-06 05:59:13.543093697 +0000 UTC m=+937.408304808" observedRunningTime="2025-12-06 05:59:14.189986951 +0000 UTC m=+938.055198061" watchObservedRunningTime="2025-12-06 05:59:14.192128457 +0000 UTC m=+938.057339568"
Dec 06 05:59:15 crc kubenswrapper[4733]: I1206 05:59:15.228256 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-pp9vb"
Dec 06 05:59:15 crc kubenswrapper[4733]: I1206 05:59:15.228340 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-pp9vb"
Dec 06 05:59:15 crc kubenswrapper[4733]: I1206 05:59:15.274630 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-pp9vb"
Dec 06 05:59:17 crc kubenswrapper[4733]: I1206 05:59:17.717451 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-2ztw7-config-7mnlw"
Dec 06 05:59:17 crc kubenswrapper[4733]: I1206 05:59:17.880722 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/15fc6df2-423e-4ee7-a781-498c4d4cba96-additional-scripts\") pod \"15fc6df2-423e-4ee7-a781-498c4d4cba96\" (UID: \"15fc6df2-423e-4ee7-a781-498c4d4cba96\") "
Dec 06 05:59:17 crc kubenswrapper[4733]: I1206 05:59:17.881432 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/15fc6df2-423e-4ee7-a781-498c4d4cba96-var-run\") pod \"15fc6df2-423e-4ee7-a781-498c4d4cba96\" (UID: \"15fc6df2-423e-4ee7-a781-498c4d4cba96\") "
Dec 06 05:59:17 crc kubenswrapper[4733]: I1206 05:59:17.881566 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/15fc6df2-423e-4ee7-a781-498c4d4cba96-var-log-ovn\") pod \"15fc6df2-423e-4ee7-a781-498c4d4cba96\" (UID: \"15fc6df2-423e-4ee7-a781-498c4d4cba96\") "
Dec 06 05:59:17 crc kubenswrapper[4733]: I1206 05:59:17.881631 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/15fc6df2-423e-4ee7-a781-498c4d4cba96-scripts\") pod \"15fc6df2-423e-4ee7-a781-498c4d4cba96\" (UID: \"15fc6df2-423e-4ee7-a781-498c4d4cba96\") "
Dec 06 05:59:17 crc kubenswrapper[4733]: I1206 05:59:17.881658 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/15fc6df2-423e-4ee7-a781-498c4d4cba96-var-run-ovn\") pod \"15fc6df2-423e-4ee7-a781-498c4d4cba96\" (UID: \"15fc6df2-423e-4ee7-a781-498c4d4cba96\") "
Dec 06 05:59:17 crc kubenswrapper[4733]: I1206 05:59:17.881705 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sq6js\" (UniqueName: \"kubernetes.io/projected/15fc6df2-423e-4ee7-a781-498c4d4cba96-kube-api-access-sq6js\") pod \"15fc6df2-423e-4ee7-a781-498c4d4cba96\" (UID: \"15fc6df2-423e-4ee7-a781-498c4d4cba96\") "
Dec 06 05:59:17 crc kubenswrapper[4733]: I1206 05:59:17.882262 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/15fc6df2-423e-4ee7-a781-498c4d4cba96-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "15fc6df2-423e-4ee7-a781-498c4d4cba96" (UID: "15fc6df2-423e-4ee7-a781-498c4d4cba96"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 06 05:59:17 crc kubenswrapper[4733]: I1206 05:59:17.883100 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15fc6df2-423e-4ee7-a781-498c4d4cba96-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "15fc6df2-423e-4ee7-a781-498c4d4cba96" (UID: "15fc6df2-423e-4ee7-a781-498c4d4cba96"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 05:59:17 crc kubenswrapper[4733]: I1206 05:59:17.883203 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/15fc6df2-423e-4ee7-a781-498c4d4cba96-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "15fc6df2-423e-4ee7-a781-498c4d4cba96" (UID: "15fc6df2-423e-4ee7-a781-498c4d4cba96"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 06 05:59:17 crc kubenswrapper[4733]: I1206 05:59:17.889376 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15fc6df2-423e-4ee7-a781-498c4d4cba96-scripts" (OuterVolumeSpecName: "scripts") pod "15fc6df2-423e-4ee7-a781-498c4d4cba96" (UID: "15fc6df2-423e-4ee7-a781-498c4d4cba96"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 05:59:17 crc kubenswrapper[4733]: I1206 05:59:17.882732 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/15fc6df2-423e-4ee7-a781-498c4d4cba96-var-run" (OuterVolumeSpecName: "var-run") pod "15fc6df2-423e-4ee7-a781-498c4d4cba96" (UID: "15fc6df2-423e-4ee7-a781-498c4d4cba96"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 06 05:59:17 crc kubenswrapper[4733]: I1206 05:59:17.889843 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15fc6df2-423e-4ee7-a781-498c4d4cba96-kube-api-access-sq6js" (OuterVolumeSpecName: "kube-api-access-sq6js") pod "15fc6df2-423e-4ee7-a781-498c4d4cba96" (UID: "15fc6df2-423e-4ee7-a781-498c4d4cba96"). InnerVolumeSpecName "kube-api-access-sq6js". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 05:59:17 crc kubenswrapper[4733]: I1206 05:59:17.983294 4733 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/15fc6df2-423e-4ee7-a781-498c4d4cba96-var-run\") on node \"crc\" DevicePath \"\""
Dec 06 05:59:17 crc kubenswrapper[4733]: I1206 05:59:17.983348 4733 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/15fc6df2-423e-4ee7-a781-498c4d4cba96-var-log-ovn\") on node \"crc\" DevicePath \"\""
Dec 06 05:59:17 crc kubenswrapper[4733]: I1206 05:59:17.983357 4733 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/15fc6df2-423e-4ee7-a781-498c4d4cba96-scripts\") on node \"crc\" DevicePath \"\""
Dec 06 05:59:17 crc kubenswrapper[4733]: I1206 05:59:17.983365 4733 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/15fc6df2-423e-4ee7-a781-498c4d4cba96-var-run-ovn\") on node \"crc\" DevicePath \"\""
Dec 06 05:59:17 crc kubenswrapper[4733]: I1206 05:59:17.983378 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sq6js\" (UniqueName: \"kubernetes.io/projected/15fc6df2-423e-4ee7-a781-498c4d4cba96-kube-api-access-sq6js\") on node \"crc\" DevicePath \"\""
Dec 06 05:59:17 crc kubenswrapper[4733]: I1206 05:59:17.983389 4733 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/15fc6df2-423e-4ee7-a781-498c4d4cba96-additional-scripts\") on node \"crc\" DevicePath \"\""
Dec 06 05:59:18 crc kubenswrapper[4733]: I1206 05:59:18.206055 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-2ztw7-config-7mnlw" event={"ID":"15fc6df2-423e-4ee7-a781-498c4d4cba96","Type":"ContainerDied","Data":"98ae8e715db5315e7e43038ab6a98487e62216d197286edeb8ed4893ec67da82"}
Dec 06 05:59:18 crc kubenswrapper[4733]: I1206 05:59:18.206093 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-2ztw7-config-7mnlw"
Dec 06 05:59:18 crc kubenswrapper[4733]: I1206 05:59:18.206099 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98ae8e715db5315e7e43038ab6a98487e62216d197286edeb8ed4893ec67da82"
Dec 06 05:59:18 crc kubenswrapper[4733]: I1206 05:59:18.208701 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nqsf6" event={"ID":"8de7ee8b-f6b5-4014-a78f-bad2e92ddd2d","Type":"ContainerStarted","Data":"9b88afa520e6a18af4521bad6826cba138fef0975cd25ea70754dbc3b42dbf7a"}
Dec 06 05:59:18 crc kubenswrapper[4733]: I1206 05:59:18.210706 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-vrwfc" event={"ID":"1390ae4e-536b-4861-8591-fd8656976c12","Type":"ContainerStarted","Data":"8054c6b414f7564652019519e29f98b783b0de4560cf4c7e536de2c7c65e204c"}
Dec 06 05:59:18 crc kubenswrapper[4733]: I1206 05:59:18.271531 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-vrwfc" podStartSLOduration=8.019609486 podStartE2EDuration="15.271516223s" podCreationTimestamp="2025-12-06 05:59:03 +0000 UTC" firstStartedPulling="2025-12-06 05:59:10.45725018 +0000 UTC m=+934.322461291" lastFinishedPulling="2025-12-06 05:59:17.709156917 +0000 UTC m=+941.574368028" observedRunningTime="2025-12-06 05:59:18.237496792 +0000 UTC m=+942.102707903" watchObservedRunningTime="2025-12-06 05:59:18.271516223 +0000 UTC m=+942.136727334"
Dec 06 05:59:18 crc kubenswrapper[4733]: I1206 05:59:18.784783 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-2ztw7-config-7mnlw"]
Dec 06 05:59:18 crc kubenswrapper[4733]: I1206 05:59:18.790294 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-2ztw7-config-7mnlw"]
Dec 06 05:59:20 crc kubenswrapper[4733]: I1206 05:59:20.227571 4733 generic.go:334] "Generic (PLEG): container finished" podID="8de7ee8b-f6b5-4014-a78f-bad2e92ddd2d" containerID="9b88afa520e6a18af4521bad6826cba138fef0975cd25ea70754dbc3b42dbf7a" exitCode=0
Dec 06 05:59:20 crc kubenswrapper[4733]: I1206 05:59:20.227916 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nqsf6" event={"ID":"8de7ee8b-f6b5-4014-a78f-bad2e92ddd2d","Type":"ContainerDied","Data":"9b88afa520e6a18af4521bad6826cba138fef0975cd25ea70754dbc3b42dbf7a"}
Dec 06 05:59:20 crc kubenswrapper[4733]: I1206 05:59:20.494531 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15fc6df2-423e-4ee7-a781-498c4d4cba96" path="/var/lib/kubelet/pods/15fc6df2-423e-4ee7-a781-498c4d4cba96/volumes"
Dec 06 05:59:21 crc kubenswrapper[4733]: I1206 05:59:21.239850 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nqsf6" event={"ID":"8de7ee8b-f6b5-4014-a78f-bad2e92ddd2d","Type":"ContainerStarted","Data":"5dedf9f117dc352810d3da1ae4ee162c7e6dd9af333ee62950750576d5ef7f8c"}
Dec 06 05:59:21 crc kubenswrapper[4733]: I1206 05:59:21.244618 4733 generic.go:334] "Generic (PLEG): container finished" podID="1390ae4e-536b-4861-8591-fd8656976c12" containerID="8054c6b414f7564652019519e29f98b783b0de4560cf4c7e536de2c7c65e204c" exitCode=0
Dec 06 05:59:21 crc kubenswrapper[4733]: I1206 05:59:21.244659 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-vrwfc" event={"ID":"1390ae4e-536b-4861-8591-fd8656976c12","Type":"ContainerDied","Data":"8054c6b414f7564652019519e29f98b783b0de4560cf4c7e536de2c7c65e204c"}
Dec 06 05:59:21 crc kubenswrapper[4733]: I1206 05:59:21.255994 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nqsf6" podStartSLOduration=3.725750264 podStartE2EDuration="10.255976446s" podCreationTimestamp="2025-12-06 05:59:11 +0000 UTC" firstStartedPulling="2025-12-06 05:59:14.161104119 +0000 UTC m=+938.026315220" lastFinishedPulling="2025-12-06 05:59:20.691330291 +0000 UTC m=+944.556541402" observedRunningTime="2025-12-06 05:59:21.253583177 +0000 UTC m=+945.118794288" watchObservedRunningTime="2025-12-06 05:59:21.255976446 +0000 UTC m=+945.121187557"
Dec 06 05:59:21 crc kubenswrapper[4733]: I1206 05:59:21.944850 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nqsf6"
Dec 06 05:59:21 crc kubenswrapper[4733]: I1206 05:59:21.944914 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nqsf6"
Dec 06 05:59:22 crc kubenswrapper[4733]: I1206 05:59:22.517561 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-vrwfc"
Dec 06 05:59:22 crc kubenswrapper[4733]: I1206 05:59:22.578567 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1390ae4e-536b-4861-8591-fd8656976c12-config-data\") pod \"1390ae4e-536b-4861-8591-fd8656976c12\" (UID: \"1390ae4e-536b-4861-8591-fd8656976c12\") "
Dec 06 05:59:22 crc kubenswrapper[4733]: I1206 05:59:22.578706 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1390ae4e-536b-4861-8591-fd8656976c12-combined-ca-bundle\") pod \"1390ae4e-536b-4861-8591-fd8656976c12\" (UID: \"1390ae4e-536b-4861-8591-fd8656976c12\") "
Dec 06 05:59:22 crc kubenswrapper[4733]: I1206 05:59:22.578842 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l586c\" (UniqueName: \"kubernetes.io/projected/1390ae4e-536b-4861-8591-fd8656976c12-kube-api-access-l586c\") pod \"1390ae4e-536b-4861-8591-fd8656976c12\" (UID: \"1390ae4e-536b-4861-8591-fd8656976c12\") "
Dec 06 05:59:22 crc kubenswrapper[4733]: I1206 05:59:22.584665 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1390ae4e-536b-4861-8591-fd8656976c12-kube-api-access-l586c" (OuterVolumeSpecName: "kube-api-access-l586c") pod "1390ae4e-536b-4861-8591-fd8656976c12" (UID: "1390ae4e-536b-4861-8591-fd8656976c12"). InnerVolumeSpecName "kube-api-access-l586c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 05:59:22 crc kubenswrapper[4733]: I1206 05:59:22.599061 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1390ae4e-536b-4861-8591-fd8656976c12-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1390ae4e-536b-4861-8591-fd8656976c12" (UID: "1390ae4e-536b-4861-8591-fd8656976c12"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 05:59:22 crc kubenswrapper[4733]: I1206 05:59:22.612469 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1390ae4e-536b-4861-8591-fd8656976c12-config-data" (OuterVolumeSpecName: "config-data") pod "1390ae4e-536b-4861-8591-fd8656976c12" (UID: "1390ae4e-536b-4861-8591-fd8656976c12"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 05:59:22 crc kubenswrapper[4733]: I1206 05:59:22.681118 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l586c\" (UniqueName: \"kubernetes.io/projected/1390ae4e-536b-4861-8591-fd8656976c12-kube-api-access-l586c\") on node \"crc\" DevicePath \"\""
Dec 06 05:59:22 crc kubenswrapper[4733]: I1206 05:59:22.681156 4733 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1390ae4e-536b-4861-8591-fd8656976c12-config-data\") on node \"crc\" DevicePath \"\""
Dec 06 05:59:22 crc kubenswrapper[4733]: I1206 05:59:22.681168 4733 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1390ae4e-536b-4861-8591-fd8656976c12-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 06 05:59:22 crc kubenswrapper[4733]: I1206 05:59:22.980422 4733 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nqsf6" podUID="8de7ee8b-f6b5-4014-a78f-bad2e92ddd2d" containerName="registry-server" probeResult="failure" output=<
Dec 06 05:59:22 crc kubenswrapper[4733]: timeout: failed to connect service ":50051" within 1s
Dec 06 05:59:22 crc kubenswrapper[4733]: >
Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.262200 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-vrwfc" event={"ID":"1390ae4e-536b-4861-8591-fd8656976c12","Type":"ContainerDied","Data":"ca4a3f7a36b5663c6e5abe8b128e2b47f11f9335f18e7330e88a760c55af9a98"}
Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.262256 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca4a3f7a36b5663c6e5abe8b128e2b47f11f9335f18e7330e88a760c55af9a98"
Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.262352 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-vrwfc"
Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.522389 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-766ccc677c-vwwzt"]
Dec 06 05:59:23 crc kubenswrapper[4733]: E1206 05:59:23.522789 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15fc6df2-423e-4ee7-a781-498c4d4cba96" containerName="ovn-config"
Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.522804 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="15fc6df2-423e-4ee7-a781-498c4d4cba96" containerName="ovn-config"
Dec 06 05:59:23 crc kubenswrapper[4733]: E1206 05:59:23.522831 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1aaa5a5a-871e-4022-9fea-20d5541424bf" containerName="mariadb-database-create"
Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.522838 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="1aaa5a5a-871e-4022-9fea-20d5541424bf" containerName="mariadb-database-create"
Dec 06 05:59:23 crc kubenswrapper[4733]: E1206 05:59:23.522851 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dab6d968-0d66-4b87-898b-d17fb0741af2" containerName="mariadb-account-create-update"
Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.522857 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="dab6d968-0d66-4b87-898b-d17fb0741af2" containerName="mariadb-account-create-update"
Dec 06 05:59:23 crc kubenswrapper[4733]: E1206 05:59:23.522865 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c64b364-21c9-4fd9-a392-18b9ea6661fb" containerName="mariadb-database-create"
Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.522871 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c64b364-21c9-4fd9-a392-18b9ea6661fb" containerName="mariadb-database-create"
Dec 06 05:59:23 crc kubenswrapper[4733]: E1206 05:59:23.522886 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31a0a2a8-5536-47b7-9f7c-2eef7e453214" containerName="mariadb-account-create-update"
Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.522893 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="31a0a2a8-5536-47b7-9f7c-2eef7e453214" containerName="mariadb-account-create-update"
Dec 06 05:59:23 crc kubenswrapper[4733]: E1206 05:59:23.522903 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8894ddef-6b07-452a-83fd-5d6383f280c2" containerName="mariadb-database-create"
Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.522909 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="8894ddef-6b07-452a-83fd-5d6383f280c2" containerName="mariadb-database-create"
Dec 06 05:59:23 crc kubenswrapper[4733]: E1206 05:59:23.522923 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1390ae4e-536b-4861-8591-fd8656976c12" containerName="keystone-db-sync"
Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.522929 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="1390ae4e-536b-4861-8591-fd8656976c12" containerName="keystone-db-sync"
Dec 06 05:59:23 crc kubenswrapper[4733]: E1206 05:59:23.522939 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d928009b-3db5-492c-8d4e-375ca54b6f8b" containerName="mariadb-account-create-update"
Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.522944 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="d928009b-3db5-492c-8d4e-375ca54b6f8b" containerName="mariadb-account-create-update"
Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.523136 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="1aaa5a5a-871e-4022-9fea-20d5541424bf" containerName="mariadb-database-create"
Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.523147 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="8894ddef-6b07-452a-83fd-5d6383f280c2" containerName="mariadb-database-create"
Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.523159 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="dab6d968-0d66-4b87-898b-d17fb0741af2" containerName="mariadb-account-create-update"
Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.523170 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="1390ae4e-536b-4861-8591-fd8656976c12" containerName="keystone-db-sync"
Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.523184 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="31a0a2a8-5536-47b7-9f7c-2eef7e453214" containerName="mariadb-account-create-update"
Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.523194 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="d928009b-3db5-492c-8d4e-375ca54b6f8b" containerName="mariadb-account-create-update"
Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.523207 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c64b364-21c9-4fd9-a392-18b9ea6661fb" containerName="mariadb-database-create"
Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.523220 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="15fc6df2-423e-4ee7-a781-498c4d4cba96" containerName="ovn-config"
Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.524155 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-766ccc677c-vwwzt"
Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.530784 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-766ccc677c-vwwzt"]
Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.548383 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-ck6x6"]
Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.549436 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ck6x6"
Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.551831 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.552107 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.552368 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.552604 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.553182 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-d5nb6"
Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.592848 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-ck6x6"]
Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.597952 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/706be213-5f03-414a-bdeb-98af90de90f4-credential-keys\") pod \"keystone-bootstrap-ck6x6\" (UID: \"706be213-5f03-414a-bdeb-98af90de90f4\") " pod="openstack/keystone-bootstrap-ck6x6"
Dec 06 05:59:23 crc kubenswrapper[4733]: I1206
05:59:23.598023 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkkmn\" (UniqueName: \"kubernetes.io/projected/1054e20c-7d23-4252-8162-6d088f90a2bb-kube-api-access-jkkmn\") pod \"dnsmasq-dns-766ccc677c-vwwzt\" (UID: \"1054e20c-7d23-4252-8162-6d088f90a2bb\") " pod="openstack/dnsmasq-dns-766ccc677c-vwwzt" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.598047 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/706be213-5f03-414a-bdeb-98af90de90f4-scripts\") pod \"keystone-bootstrap-ck6x6\" (UID: \"706be213-5f03-414a-bdeb-98af90de90f4\") " pod="openstack/keystone-bootstrap-ck6x6" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.598075 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/706be213-5f03-414a-bdeb-98af90de90f4-combined-ca-bundle\") pod \"keystone-bootstrap-ck6x6\" (UID: \"706be213-5f03-414a-bdeb-98af90de90f4\") " pod="openstack/keystone-bootstrap-ck6x6" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.598118 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1054e20c-7d23-4252-8162-6d088f90a2bb-dns-svc\") pod \"dnsmasq-dns-766ccc677c-vwwzt\" (UID: \"1054e20c-7d23-4252-8162-6d088f90a2bb\") " pod="openstack/dnsmasq-dns-766ccc677c-vwwzt" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.598145 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9sj8\" (UniqueName: \"kubernetes.io/projected/706be213-5f03-414a-bdeb-98af90de90f4-kube-api-access-z9sj8\") pod \"keystone-bootstrap-ck6x6\" (UID: \"706be213-5f03-414a-bdeb-98af90de90f4\") " pod="openstack/keystone-bootstrap-ck6x6" Dec 06 05:59:23 crc 
kubenswrapper[4733]: I1206 05:59:23.598175 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1054e20c-7d23-4252-8162-6d088f90a2bb-ovsdbserver-nb\") pod \"dnsmasq-dns-766ccc677c-vwwzt\" (UID: \"1054e20c-7d23-4252-8162-6d088f90a2bb\") " pod="openstack/dnsmasq-dns-766ccc677c-vwwzt" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.598226 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1054e20c-7d23-4252-8162-6d088f90a2bb-ovsdbserver-sb\") pod \"dnsmasq-dns-766ccc677c-vwwzt\" (UID: \"1054e20c-7d23-4252-8162-6d088f90a2bb\") " pod="openstack/dnsmasq-dns-766ccc677c-vwwzt" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.598242 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/706be213-5f03-414a-bdeb-98af90de90f4-config-data\") pod \"keystone-bootstrap-ck6x6\" (UID: \"706be213-5f03-414a-bdeb-98af90de90f4\") " pod="openstack/keystone-bootstrap-ck6x6" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.598256 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/706be213-5f03-414a-bdeb-98af90de90f4-fernet-keys\") pod \"keystone-bootstrap-ck6x6\" (UID: \"706be213-5f03-414a-bdeb-98af90de90f4\") " pod="openstack/keystone-bootstrap-ck6x6" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.598296 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1054e20c-7d23-4252-8162-6d088f90a2bb-dns-swift-storage-0\") pod \"dnsmasq-dns-766ccc677c-vwwzt\" (UID: \"1054e20c-7d23-4252-8162-6d088f90a2bb\") " pod="openstack/dnsmasq-dns-766ccc677c-vwwzt" Dec 
06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.598340 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1054e20c-7d23-4252-8162-6d088f90a2bb-config\") pod \"dnsmasq-dns-766ccc677c-vwwzt\" (UID: \"1054e20c-7d23-4252-8162-6d088f90a2bb\") " pod="openstack/dnsmasq-dns-766ccc677c-vwwzt" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.696592 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.699471 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.700688 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/706be213-5f03-414a-bdeb-98af90de90f4-credential-keys\") pod \"keystone-bootstrap-ck6x6\" (UID: \"706be213-5f03-414a-bdeb-98af90de90f4\") " pod="openstack/keystone-bootstrap-ck6x6" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.700771 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkkmn\" (UniqueName: \"kubernetes.io/projected/1054e20c-7d23-4252-8162-6d088f90a2bb-kube-api-access-jkkmn\") pod \"dnsmasq-dns-766ccc677c-vwwzt\" (UID: \"1054e20c-7d23-4252-8162-6d088f90a2bb\") " pod="openstack/dnsmasq-dns-766ccc677c-vwwzt" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.700795 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/706be213-5f03-414a-bdeb-98af90de90f4-scripts\") pod \"keystone-bootstrap-ck6x6\" (UID: \"706be213-5f03-414a-bdeb-98af90de90f4\") " pod="openstack/keystone-bootstrap-ck6x6" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.700824 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/706be213-5f03-414a-bdeb-98af90de90f4-combined-ca-bundle\") pod \"keystone-bootstrap-ck6x6\" (UID: \"706be213-5f03-414a-bdeb-98af90de90f4\") " pod="openstack/keystone-bootstrap-ck6x6" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.700869 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1054e20c-7d23-4252-8162-6d088f90a2bb-dns-svc\") pod \"dnsmasq-dns-766ccc677c-vwwzt\" (UID: \"1054e20c-7d23-4252-8162-6d088f90a2bb\") " pod="openstack/dnsmasq-dns-766ccc677c-vwwzt" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.700888 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9sj8\" (UniqueName: \"kubernetes.io/projected/706be213-5f03-414a-bdeb-98af90de90f4-kube-api-access-z9sj8\") pod \"keystone-bootstrap-ck6x6\" (UID: \"706be213-5f03-414a-bdeb-98af90de90f4\") " pod="openstack/keystone-bootstrap-ck6x6" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.700932 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1054e20c-7d23-4252-8162-6d088f90a2bb-ovsdbserver-nb\") pod \"dnsmasq-dns-766ccc677c-vwwzt\" (UID: \"1054e20c-7d23-4252-8162-6d088f90a2bb\") " pod="openstack/dnsmasq-dns-766ccc677c-vwwzt" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.700966 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1054e20c-7d23-4252-8162-6d088f90a2bb-ovsdbserver-sb\") pod \"dnsmasq-dns-766ccc677c-vwwzt\" (UID: \"1054e20c-7d23-4252-8162-6d088f90a2bb\") " pod="openstack/dnsmasq-dns-766ccc677c-vwwzt" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.700984 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/706be213-5f03-414a-bdeb-98af90de90f4-config-data\") pod \"keystone-bootstrap-ck6x6\" (UID: \"706be213-5f03-414a-bdeb-98af90de90f4\") " pod="openstack/keystone-bootstrap-ck6x6" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.701003 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/706be213-5f03-414a-bdeb-98af90de90f4-fernet-keys\") pod \"keystone-bootstrap-ck6x6\" (UID: \"706be213-5f03-414a-bdeb-98af90de90f4\") " pod="openstack/keystone-bootstrap-ck6x6" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.701051 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1054e20c-7d23-4252-8162-6d088f90a2bb-dns-swift-storage-0\") pod \"dnsmasq-dns-766ccc677c-vwwzt\" (UID: \"1054e20c-7d23-4252-8162-6d088f90a2bb\") " pod="openstack/dnsmasq-dns-766ccc677c-vwwzt" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.701075 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1054e20c-7d23-4252-8162-6d088f90a2bb-config\") pod \"dnsmasq-dns-766ccc677c-vwwzt\" (UID: \"1054e20c-7d23-4252-8162-6d088f90a2bb\") " pod="openstack/dnsmasq-dns-766ccc677c-vwwzt" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.701841 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1054e20c-7d23-4252-8162-6d088f90a2bb-dns-svc\") pod \"dnsmasq-dns-766ccc677c-vwwzt\" (UID: \"1054e20c-7d23-4252-8162-6d088f90a2bb\") " pod="openstack/dnsmasq-dns-766ccc677c-vwwzt" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.704881 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1054e20c-7d23-4252-8162-6d088f90a2bb-ovsdbserver-nb\") pod \"dnsmasq-dns-766ccc677c-vwwzt\" 
(UID: \"1054e20c-7d23-4252-8162-6d088f90a2bb\") " pod="openstack/dnsmasq-dns-766ccc677c-vwwzt" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.705044 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1054e20c-7d23-4252-8162-6d088f90a2bb-config\") pod \"dnsmasq-dns-766ccc677c-vwwzt\" (UID: \"1054e20c-7d23-4252-8162-6d088f90a2bb\") " pod="openstack/dnsmasq-dns-766ccc677c-vwwzt" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.705628 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1054e20c-7d23-4252-8162-6d088f90a2bb-dns-swift-storage-0\") pod \"dnsmasq-dns-766ccc677c-vwwzt\" (UID: \"1054e20c-7d23-4252-8162-6d088f90a2bb\") " pod="openstack/dnsmasq-dns-766ccc677c-vwwzt" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.708828 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1054e20c-7d23-4252-8162-6d088f90a2bb-ovsdbserver-sb\") pod \"dnsmasq-dns-766ccc677c-vwwzt\" (UID: \"1054e20c-7d23-4252-8162-6d088f90a2bb\") " pod="openstack/dnsmasq-dns-766ccc677c-vwwzt" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.711625 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.712856 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.717811 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/706be213-5f03-414a-bdeb-98af90de90f4-combined-ca-bundle\") pod \"keystone-bootstrap-ck6x6\" (UID: \"706be213-5f03-414a-bdeb-98af90de90f4\") " pod="openstack/keystone-bootstrap-ck6x6" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 
05:59:23.718482 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/706be213-5f03-414a-bdeb-98af90de90f4-fernet-keys\") pod \"keystone-bootstrap-ck6x6\" (UID: \"706be213-5f03-414a-bdeb-98af90de90f4\") " pod="openstack/keystone-bootstrap-ck6x6" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.719635 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/706be213-5f03-414a-bdeb-98af90de90f4-scripts\") pod \"keystone-bootstrap-ck6x6\" (UID: \"706be213-5f03-414a-bdeb-98af90de90f4\") " pod="openstack/keystone-bootstrap-ck6x6" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.721484 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9sj8\" (UniqueName: \"kubernetes.io/projected/706be213-5f03-414a-bdeb-98af90de90f4-kube-api-access-z9sj8\") pod \"keystone-bootstrap-ck6x6\" (UID: \"706be213-5f03-414a-bdeb-98af90de90f4\") " pod="openstack/keystone-bootstrap-ck6x6" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.723644 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/706be213-5f03-414a-bdeb-98af90de90f4-credential-keys\") pod \"keystone-bootstrap-ck6x6\" (UID: \"706be213-5f03-414a-bdeb-98af90de90f4\") " pod="openstack/keystone-bootstrap-ck6x6" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.735932 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/706be213-5f03-414a-bdeb-98af90de90f4-config-data\") pod \"keystone-bootstrap-ck6x6\" (UID: \"706be213-5f03-414a-bdeb-98af90de90f4\") " pod="openstack/keystone-bootstrap-ck6x6" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.758200 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkkmn\" (UniqueName: 
\"kubernetes.io/projected/1054e20c-7d23-4252-8162-6d088f90a2bb-kube-api-access-jkkmn\") pod \"dnsmasq-dns-766ccc677c-vwwzt\" (UID: \"1054e20c-7d23-4252-8162-6d088f90a2bb\") " pod="openstack/dnsmasq-dns-766ccc677c-vwwzt" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.762815 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.793938 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-vfkdm"] Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.795231 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-vfkdm" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.797236 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-d8mpb" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.797524 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.797815 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.804172 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c-etc-machine-id\") pod \"cinder-db-sync-vfkdm\" (UID: \"3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c\") " pod="openstack/cinder-db-sync-vfkdm" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.804214 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/361a8a1d-f083-427f-a625-eca6a714b768-run-httpd\") pod \"ceilometer-0\" (UID: \"361a8a1d-f083-427f-a625-eca6a714b768\") " pod="openstack/ceilometer-0" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 
05:59:23.804268 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c-scripts\") pod \"cinder-db-sync-vfkdm\" (UID: \"3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c\") " pod="openstack/cinder-db-sync-vfkdm" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.804299 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/361a8a1d-f083-427f-a625-eca6a714b768-scripts\") pod \"ceilometer-0\" (UID: \"361a8a1d-f083-427f-a625-eca6a714b768\") " pod="openstack/ceilometer-0" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.804326 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/361a8a1d-f083-427f-a625-eca6a714b768-config-data\") pod \"ceilometer-0\" (UID: \"361a8a1d-f083-427f-a625-eca6a714b768\") " pod="openstack/ceilometer-0" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.804350 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/361a8a1d-f083-427f-a625-eca6a714b768-log-httpd\") pod \"ceilometer-0\" (UID: \"361a8a1d-f083-427f-a625-eca6a714b768\") " pod="openstack/ceilometer-0" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.804366 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9fs4\" (UniqueName: \"kubernetes.io/projected/3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c-kube-api-access-c9fs4\") pod \"cinder-db-sync-vfkdm\" (UID: \"3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c\") " pod="openstack/cinder-db-sync-vfkdm" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.804393 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c-config-data\") pod \"cinder-db-sync-vfkdm\" (UID: \"3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c\") " pod="openstack/cinder-db-sync-vfkdm" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.804429 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/361a8a1d-f083-427f-a625-eca6a714b768-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"361a8a1d-f083-427f-a625-eca6a714b768\") " pod="openstack/ceilometer-0" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.804480 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/361a8a1d-f083-427f-a625-eca6a714b768-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"361a8a1d-f083-427f-a625-eca6a714b768\") " pod="openstack/ceilometer-0" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.804502 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c-combined-ca-bundle\") pod \"cinder-db-sync-vfkdm\" (UID: \"3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c\") " pod="openstack/cinder-db-sync-vfkdm" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.804520 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkpsd\" (UniqueName: \"kubernetes.io/projected/361a8a1d-f083-427f-a625-eca6a714b768-kube-api-access-lkpsd\") pod \"ceilometer-0\" (UID: \"361a8a1d-f083-427f-a625-eca6a714b768\") " pod="openstack/ceilometer-0" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.804535 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c-db-sync-config-data\") pod \"cinder-db-sync-vfkdm\" (UID: \"3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c\") " pod="openstack/cinder-db-sync-vfkdm" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.823587 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-vfkdm"] Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.848075 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-766ccc677c-vwwzt" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.868836 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ck6x6" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.896529 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-cqh6m"] Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.898632 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-cqh6m" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.906946 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c-config-data\") pod \"cinder-db-sync-vfkdm\" (UID: \"3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c\") " pod="openstack/cinder-db-sync-vfkdm" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.907024 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/361a8a1d-f083-427f-a625-eca6a714b768-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"361a8a1d-f083-427f-a625-eca6a714b768\") " pod="openstack/ceilometer-0" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.907090 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/94621b3f-341c-4c06-8530-91d11c1ad8dc-combined-ca-bundle\") pod \"neutron-db-sync-cqh6m\" (UID: \"94621b3f-341c-4c06-8530-91d11c1ad8dc\") " pod="openstack/neutron-db-sync-cqh6m" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.907136 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/361a8a1d-f083-427f-a625-eca6a714b768-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"361a8a1d-f083-427f-a625-eca6a714b768\") " pod="openstack/ceilometer-0" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.907159 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c-combined-ca-bundle\") pod \"cinder-db-sync-vfkdm\" (UID: \"3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c\") " pod="openstack/cinder-db-sync-vfkdm" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.907179 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkpsd\" (UniqueName: \"kubernetes.io/projected/361a8a1d-f083-427f-a625-eca6a714b768-kube-api-access-lkpsd\") pod \"ceilometer-0\" (UID: \"361a8a1d-f083-427f-a625-eca6a714b768\") " pod="openstack/ceilometer-0" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.907202 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c-db-sync-config-data\") pod \"cinder-db-sync-vfkdm\" (UID: \"3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c\") " pod="openstack/cinder-db-sync-vfkdm" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.907238 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c-etc-machine-id\") pod \"cinder-db-sync-vfkdm\" (UID: 
\"3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c\") " pod="openstack/cinder-db-sync-vfkdm" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.907259 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/361a8a1d-f083-427f-a625-eca6a714b768-run-httpd\") pod \"ceilometer-0\" (UID: \"361a8a1d-f083-427f-a625-eca6a714b768\") " pod="openstack/ceilometer-0" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.907314 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7v6h\" (UniqueName: \"kubernetes.io/projected/94621b3f-341c-4c06-8530-91d11c1ad8dc-kube-api-access-m7v6h\") pod \"neutron-db-sync-cqh6m\" (UID: \"94621b3f-341c-4c06-8530-91d11c1ad8dc\") " pod="openstack/neutron-db-sync-cqh6m" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.907363 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c-scripts\") pod \"cinder-db-sync-vfkdm\" (UID: \"3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c\") " pod="openstack/cinder-db-sync-vfkdm" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.907403 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/361a8a1d-f083-427f-a625-eca6a714b768-scripts\") pod \"ceilometer-0\" (UID: \"361a8a1d-f083-427f-a625-eca6a714b768\") " pod="openstack/ceilometer-0" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.907437 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/361a8a1d-f083-427f-a625-eca6a714b768-config-data\") pod \"ceilometer-0\" (UID: \"361a8a1d-f083-427f-a625-eca6a714b768\") " pod="openstack/ceilometer-0" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.907464 4733 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/361a8a1d-f083-427f-a625-eca6a714b768-log-httpd\") pod \"ceilometer-0\" (UID: \"361a8a1d-f083-427f-a625-eca6a714b768\") " pod="openstack/ceilometer-0" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.907482 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9fs4\" (UniqueName: \"kubernetes.io/projected/3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c-kube-api-access-c9fs4\") pod \"cinder-db-sync-vfkdm\" (UID: \"3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c\") " pod="openstack/cinder-db-sync-vfkdm" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.907513 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/94621b3f-341c-4c06-8530-91d11c1ad8dc-config\") pod \"neutron-db-sync-cqh6m\" (UID: \"94621b3f-341c-4c06-8530-91d11c1ad8dc\") " pod="openstack/neutron-db-sync-cqh6m" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.910834 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c-etc-machine-id\") pod \"cinder-db-sync-vfkdm\" (UID: \"3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c\") " pod="openstack/cinder-db-sync-vfkdm" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.911753 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-hhb4n" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.911770 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.911890 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.915616 4733 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/361a8a1d-f083-427f-a625-eca6a714b768-run-httpd\") pod \"ceilometer-0\" (UID: \"361a8a1d-f083-427f-a625-eca6a714b768\") " pod="openstack/ceilometer-0" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.918578 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/361a8a1d-f083-427f-a625-eca6a714b768-scripts\") pod \"ceilometer-0\" (UID: \"361a8a1d-f083-427f-a625-eca6a714b768\") " pod="openstack/ceilometer-0" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.918818 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/361a8a1d-f083-427f-a625-eca6a714b768-log-httpd\") pod \"ceilometer-0\" (UID: \"361a8a1d-f083-427f-a625-eca6a714b768\") " pod="openstack/ceilometer-0" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.922468 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/361a8a1d-f083-427f-a625-eca6a714b768-config-data\") pod \"ceilometer-0\" (UID: \"361a8a1d-f083-427f-a625-eca6a714b768\") " pod="openstack/ceilometer-0" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.934586 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/361a8a1d-f083-427f-a625-eca6a714b768-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"361a8a1d-f083-427f-a625-eca6a714b768\") " pod="openstack/ceilometer-0" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.937865 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkpsd\" (UniqueName: \"kubernetes.io/projected/361a8a1d-f083-427f-a625-eca6a714b768-kube-api-access-lkpsd\") pod \"ceilometer-0\" (UID: \"361a8a1d-f083-427f-a625-eca6a714b768\") " pod="openstack/ceilometer-0" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 
05:59:23.940608 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9fs4\" (UniqueName: \"kubernetes.io/projected/3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c-kube-api-access-c9fs4\") pod \"cinder-db-sync-vfkdm\" (UID: \"3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c\") " pod="openstack/cinder-db-sync-vfkdm" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.948702 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c-db-sync-config-data\") pod \"cinder-db-sync-vfkdm\" (UID: \"3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c\") " pod="openstack/cinder-db-sync-vfkdm" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.951615 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c-scripts\") pod \"cinder-db-sync-vfkdm\" (UID: \"3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c\") " pod="openstack/cinder-db-sync-vfkdm" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.952056 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c-combined-ca-bundle\") pod \"cinder-db-sync-vfkdm\" (UID: \"3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c\") " pod="openstack/cinder-db-sync-vfkdm" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.952892 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c-config-data\") pod \"cinder-db-sync-vfkdm\" (UID: \"3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c\") " pod="openstack/cinder-db-sync-vfkdm" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.976460 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/361a8a1d-f083-427f-a625-eca6a714b768-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"361a8a1d-f083-427f-a625-eca6a714b768\") " pod="openstack/ceilometer-0" Dec 06 05:59:23 crc kubenswrapper[4733]: I1206 05:59:23.981194 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-cqh6m"] Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.005315 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-8ffc7"] Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.007278 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-8ffc7" Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.010448 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7v6h\" (UniqueName: \"kubernetes.io/projected/94621b3f-341c-4c06-8530-91d11c1ad8dc-kube-api-access-m7v6h\") pod \"neutron-db-sync-cqh6m\" (UID: \"94621b3f-341c-4c06-8530-91d11c1ad8dc\") " pod="openstack/neutron-db-sync-cqh6m" Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.010601 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/94621b3f-341c-4c06-8530-91d11c1ad8dc-config\") pod \"neutron-db-sync-cqh6m\" (UID: \"94621b3f-341c-4c06-8530-91d11c1ad8dc\") " pod="openstack/neutron-db-sync-cqh6m" Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.010715 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94621b3f-341c-4c06-8530-91d11c1ad8dc-combined-ca-bundle\") pod \"neutron-db-sync-cqh6m\" (UID: \"94621b3f-341c-4c06-8530-91d11c1ad8dc\") " pod="openstack/neutron-db-sync-cqh6m" Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.013393 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-n8dgk"] Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 
05:59:24.014804 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-n8dgk" Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.017256 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.018107 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/94621b3f-341c-4c06-8530-91d11c1ad8dc-config\") pod \"neutron-db-sync-cqh6m\" (UID: \"94621b3f-341c-4c06-8530-91d11c1ad8dc\") " pod="openstack/neutron-db-sync-cqh6m" Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.018234 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-l4sr9" Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.019516 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-hbvtr" Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.019691 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.019817 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.024445 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94621b3f-341c-4c06-8530-91d11c1ad8dc-combined-ca-bundle\") pod \"neutron-db-sync-cqh6m\" (UID: \"94621b3f-341c-4c06-8530-91d11c1ad8dc\") " pod="openstack/neutron-db-sync-cqh6m" Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.027607 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-8ffc7"] Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.041556 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-766ccc677c-vwwzt"] Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.048569 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-n8dgk"] Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.055358 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-779f467bc5-7chvb"] Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.055948 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7v6h\" (UniqueName: \"kubernetes.io/projected/94621b3f-341c-4c06-8530-91d11c1ad8dc-kube-api-access-m7v6h\") pod \"neutron-db-sync-cqh6m\" (UID: \"94621b3f-341c-4c06-8530-91d11c1ad8dc\") " pod="openstack/neutron-db-sync-cqh6m" Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.056919 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-779f467bc5-7chvb" Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.072858 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-779f467bc5-7chvb"] Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.115102 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.115770 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/363efbb6-18f2-440b-bffd-f64dee6a3af7-combined-ca-bundle\") pod \"placement-db-sync-n8dgk\" (UID: \"363efbb6-18f2-440b-bffd-f64dee6a3af7\") " pod="openstack/placement-db-sync-n8dgk" Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.115851 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/707d8771-4a40-42e7-b4bd-c2a9090126f0-db-sync-config-data\") pod \"barbican-db-sync-8ffc7\" (UID: \"707d8771-4a40-42e7-b4bd-c2a9090126f0\") " pod="openstack/barbican-db-sync-8ffc7" Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.115893 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/363efbb6-18f2-440b-bffd-f64dee6a3af7-config-data\") pod \"placement-db-sync-n8dgk\" (UID: \"363efbb6-18f2-440b-bffd-f64dee6a3af7\") " pod="openstack/placement-db-sync-n8dgk" Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.115920 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6xzl\" (UniqueName: \"kubernetes.io/projected/363efbb6-18f2-440b-bffd-f64dee6a3af7-kube-api-access-l6xzl\") pod \"placement-db-sync-n8dgk\" (UID: \"363efbb6-18f2-440b-bffd-f64dee6a3af7\") " pod="openstack/placement-db-sync-n8dgk" Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.115948 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/363efbb6-18f2-440b-bffd-f64dee6a3af7-logs\") pod \"placement-db-sync-n8dgk\" (UID: \"363efbb6-18f2-440b-bffd-f64dee6a3af7\") " 
pod="openstack/placement-db-sync-n8dgk" Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.115984 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5j44l\" (UniqueName: \"kubernetes.io/projected/707d8771-4a40-42e7-b4bd-c2a9090126f0-kube-api-access-5j44l\") pod \"barbican-db-sync-8ffc7\" (UID: \"707d8771-4a40-42e7-b4bd-c2a9090126f0\") " pod="openstack/barbican-db-sync-8ffc7" Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.116027 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/707d8771-4a40-42e7-b4bd-c2a9090126f0-combined-ca-bundle\") pod \"barbican-db-sync-8ffc7\" (UID: \"707d8771-4a40-42e7-b4bd-c2a9090126f0\") " pod="openstack/barbican-db-sync-8ffc7" Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.116089 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/363efbb6-18f2-440b-bffd-f64dee6a3af7-scripts\") pod \"placement-db-sync-n8dgk\" (UID: \"363efbb6-18f2-440b-bffd-f64dee6a3af7\") " pod="openstack/placement-db-sync-n8dgk" Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.130018 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-vfkdm" Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.230129 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/707d8771-4a40-42e7-b4bd-c2a9090126f0-db-sync-config-data\") pod \"barbican-db-sync-8ffc7\" (UID: \"707d8771-4a40-42e7-b4bd-c2a9090126f0\") " pod="openstack/barbican-db-sync-8ffc7" Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.230179 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/363efbb6-18f2-440b-bffd-f64dee6a3af7-config-data\") pod \"placement-db-sync-n8dgk\" (UID: \"363efbb6-18f2-440b-bffd-f64dee6a3af7\") " pod="openstack/placement-db-sync-n8dgk" Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.230207 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6xzl\" (UniqueName: \"kubernetes.io/projected/363efbb6-18f2-440b-bffd-f64dee6a3af7-kube-api-access-l6xzl\") pod \"placement-db-sync-n8dgk\" (UID: \"363efbb6-18f2-440b-bffd-f64dee6a3af7\") " pod="openstack/placement-db-sync-n8dgk" Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.230229 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/363efbb6-18f2-440b-bffd-f64dee6a3af7-logs\") pod \"placement-db-sync-n8dgk\" (UID: \"363efbb6-18f2-440b-bffd-f64dee6a3af7\") " pod="openstack/placement-db-sync-n8dgk" Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.230251 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87d7b344-b58b-459a-89b3-7f09319c5a73-dns-svc\") pod \"dnsmasq-dns-779f467bc5-7chvb\" (UID: \"87d7b344-b58b-459a-89b3-7f09319c5a73\") " pod="openstack/dnsmasq-dns-779f467bc5-7chvb" Dec 06 05:59:24 crc 
kubenswrapper[4733]: I1206 05:59:24.230278 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/87d7b344-b58b-459a-89b3-7f09319c5a73-dns-swift-storage-0\") pod \"dnsmasq-dns-779f467bc5-7chvb\" (UID: \"87d7b344-b58b-459a-89b3-7f09319c5a73\") " pod="openstack/dnsmasq-dns-779f467bc5-7chvb" Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.230296 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5j44l\" (UniqueName: \"kubernetes.io/projected/707d8771-4a40-42e7-b4bd-c2a9090126f0-kube-api-access-5j44l\") pod \"barbican-db-sync-8ffc7\" (UID: \"707d8771-4a40-42e7-b4bd-c2a9090126f0\") " pod="openstack/barbican-db-sync-8ffc7" Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.230335 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87d7b344-b58b-459a-89b3-7f09319c5a73-ovsdbserver-nb\") pod \"dnsmasq-dns-779f467bc5-7chvb\" (UID: \"87d7b344-b58b-459a-89b3-7f09319c5a73\") " pod="openstack/dnsmasq-dns-779f467bc5-7chvb" Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.230354 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/707d8771-4a40-42e7-b4bd-c2a9090126f0-combined-ca-bundle\") pod \"barbican-db-sync-8ffc7\" (UID: \"707d8771-4a40-42e7-b4bd-c2a9090126f0\") " pod="openstack/barbican-db-sync-8ffc7" Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.230388 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/363efbb6-18f2-440b-bffd-f64dee6a3af7-scripts\") pod \"placement-db-sync-n8dgk\" (UID: \"363efbb6-18f2-440b-bffd-f64dee6a3af7\") " pod="openstack/placement-db-sync-n8dgk" Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 
05:59:24.230415 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8g2x\" (UniqueName: \"kubernetes.io/projected/87d7b344-b58b-459a-89b3-7f09319c5a73-kube-api-access-r8g2x\") pod \"dnsmasq-dns-779f467bc5-7chvb\" (UID: \"87d7b344-b58b-459a-89b3-7f09319c5a73\") " pod="openstack/dnsmasq-dns-779f467bc5-7chvb" Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.230436 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87d7b344-b58b-459a-89b3-7f09319c5a73-config\") pod \"dnsmasq-dns-779f467bc5-7chvb\" (UID: \"87d7b344-b58b-459a-89b3-7f09319c5a73\") " pod="openstack/dnsmasq-dns-779f467bc5-7chvb" Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.230468 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87d7b344-b58b-459a-89b3-7f09319c5a73-ovsdbserver-sb\") pod \"dnsmasq-dns-779f467bc5-7chvb\" (UID: \"87d7b344-b58b-459a-89b3-7f09319c5a73\") " pod="openstack/dnsmasq-dns-779f467bc5-7chvb" Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.230484 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/363efbb6-18f2-440b-bffd-f64dee6a3af7-combined-ca-bundle\") pod \"placement-db-sync-n8dgk\" (UID: \"363efbb6-18f2-440b-bffd-f64dee6a3af7\") " pod="openstack/placement-db-sync-n8dgk" Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.238746 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/363efbb6-18f2-440b-bffd-f64dee6a3af7-combined-ca-bundle\") pod \"placement-db-sync-n8dgk\" (UID: \"363efbb6-18f2-440b-bffd-f64dee6a3af7\") " pod="openstack/placement-db-sync-n8dgk" Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.238972 
4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/363efbb6-18f2-440b-bffd-f64dee6a3af7-logs\") pod \"placement-db-sync-n8dgk\" (UID: \"363efbb6-18f2-440b-bffd-f64dee6a3af7\") " pod="openstack/placement-db-sync-n8dgk" Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.245966 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/707d8771-4a40-42e7-b4bd-c2a9090126f0-db-sync-config-data\") pod \"barbican-db-sync-8ffc7\" (UID: \"707d8771-4a40-42e7-b4bd-c2a9090126f0\") " pod="openstack/barbican-db-sync-8ffc7" Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.246861 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/707d8771-4a40-42e7-b4bd-c2a9090126f0-combined-ca-bundle\") pod \"barbican-db-sync-8ffc7\" (UID: \"707d8771-4a40-42e7-b4bd-c2a9090126f0\") " pod="openstack/barbican-db-sync-8ffc7" Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.252102 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/363efbb6-18f2-440b-bffd-f64dee6a3af7-scripts\") pod \"placement-db-sync-n8dgk\" (UID: \"363efbb6-18f2-440b-bffd-f64dee6a3af7\") " pod="openstack/placement-db-sync-n8dgk" Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.259922 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/363efbb6-18f2-440b-bffd-f64dee6a3af7-config-data\") pod \"placement-db-sync-n8dgk\" (UID: \"363efbb6-18f2-440b-bffd-f64dee6a3af7\") " pod="openstack/placement-db-sync-n8dgk" Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.260885 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6xzl\" (UniqueName: 
\"kubernetes.io/projected/363efbb6-18f2-440b-bffd-f64dee6a3af7-kube-api-access-l6xzl\") pod \"placement-db-sync-n8dgk\" (UID: \"363efbb6-18f2-440b-bffd-f64dee6a3af7\") " pod="openstack/placement-db-sync-n8dgk" Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.275402 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5j44l\" (UniqueName: \"kubernetes.io/projected/707d8771-4a40-42e7-b4bd-c2a9090126f0-kube-api-access-5j44l\") pod \"barbican-db-sync-8ffc7\" (UID: \"707d8771-4a40-42e7-b4bd-c2a9090126f0\") " pod="openstack/barbican-db-sync-8ffc7" Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.309225 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-cqh6m" Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.336152 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8g2x\" (UniqueName: \"kubernetes.io/projected/87d7b344-b58b-459a-89b3-7f09319c5a73-kube-api-access-r8g2x\") pod \"dnsmasq-dns-779f467bc5-7chvb\" (UID: \"87d7b344-b58b-459a-89b3-7f09319c5a73\") " pod="openstack/dnsmasq-dns-779f467bc5-7chvb" Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.336200 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87d7b344-b58b-459a-89b3-7f09319c5a73-config\") pod \"dnsmasq-dns-779f467bc5-7chvb\" (UID: \"87d7b344-b58b-459a-89b3-7f09319c5a73\") " pod="openstack/dnsmasq-dns-779f467bc5-7chvb" Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.336233 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87d7b344-b58b-459a-89b3-7f09319c5a73-ovsdbserver-sb\") pod \"dnsmasq-dns-779f467bc5-7chvb\" (UID: \"87d7b344-b58b-459a-89b3-7f09319c5a73\") " pod="openstack/dnsmasq-dns-779f467bc5-7chvb" Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.336286 
4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87d7b344-b58b-459a-89b3-7f09319c5a73-dns-svc\") pod \"dnsmasq-dns-779f467bc5-7chvb\" (UID: \"87d7b344-b58b-459a-89b3-7f09319c5a73\") " pod="openstack/dnsmasq-dns-779f467bc5-7chvb" Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.336324 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/87d7b344-b58b-459a-89b3-7f09319c5a73-dns-swift-storage-0\") pod \"dnsmasq-dns-779f467bc5-7chvb\" (UID: \"87d7b344-b58b-459a-89b3-7f09319c5a73\") " pod="openstack/dnsmasq-dns-779f467bc5-7chvb" Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.336352 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87d7b344-b58b-459a-89b3-7f09319c5a73-ovsdbserver-nb\") pod \"dnsmasq-dns-779f467bc5-7chvb\" (UID: \"87d7b344-b58b-459a-89b3-7f09319c5a73\") " pod="openstack/dnsmasq-dns-779f467bc5-7chvb" Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.337404 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87d7b344-b58b-459a-89b3-7f09319c5a73-ovsdbserver-nb\") pod \"dnsmasq-dns-779f467bc5-7chvb\" (UID: \"87d7b344-b58b-459a-89b3-7f09319c5a73\") " pod="openstack/dnsmasq-dns-779f467bc5-7chvb" Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.338853 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87d7b344-b58b-459a-89b3-7f09319c5a73-ovsdbserver-sb\") pod \"dnsmasq-dns-779f467bc5-7chvb\" (UID: \"87d7b344-b58b-459a-89b3-7f09319c5a73\") " pod="openstack/dnsmasq-dns-779f467bc5-7chvb" Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.339728 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87d7b344-b58b-459a-89b3-7f09319c5a73-dns-svc\") pod \"dnsmasq-dns-779f467bc5-7chvb\" (UID: \"87d7b344-b58b-459a-89b3-7f09319c5a73\") " pod="openstack/dnsmasq-dns-779f467bc5-7chvb" Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.342406 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87d7b344-b58b-459a-89b3-7f09319c5a73-config\") pod \"dnsmasq-dns-779f467bc5-7chvb\" (UID: \"87d7b344-b58b-459a-89b3-7f09319c5a73\") " pod="openstack/dnsmasq-dns-779f467bc5-7chvb" Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.342501 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/87d7b344-b58b-459a-89b3-7f09319c5a73-dns-swift-storage-0\") pod \"dnsmasq-dns-779f467bc5-7chvb\" (UID: \"87d7b344-b58b-459a-89b3-7f09319c5a73\") " pod="openstack/dnsmasq-dns-779f467bc5-7chvb" Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.355495 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8g2x\" (UniqueName: \"kubernetes.io/projected/87d7b344-b58b-459a-89b3-7f09319c5a73-kube-api-access-r8g2x\") pod \"dnsmasq-dns-779f467bc5-7chvb\" (UID: \"87d7b344-b58b-459a-89b3-7f09319c5a73\") " pod="openstack/dnsmasq-dns-779f467bc5-7chvb" Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.410863 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-8ffc7" Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.456644 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-n8dgk" Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.494287 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-779f467bc5-7chvb" Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.629528 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-ck6x6"] Dec 06 05:59:24 crc kubenswrapper[4733]: W1206 05:59:24.644382 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod706be213_5f03_414a_bdeb_98af90de90f4.slice/crio-423a81d606ec6e867b74e7b758975d267201cff97ab41331b44d8e712972e2c0 WatchSource:0}: Error finding container 423a81d606ec6e867b74e7b758975d267201cff97ab41331b44d8e712972e2c0: Status 404 returned error can't find the container with id 423a81d606ec6e867b74e7b758975d267201cff97ab41331b44d8e712972e2c0 Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.705805 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-766ccc677c-vwwzt"] Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.858979 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.871929 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-vfkdm"] Dec 06 05:59:24 crc kubenswrapper[4733]: I1206 05:59:24.989893 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-cqh6m"] Dec 06 05:59:25 crc kubenswrapper[4733]: I1206 05:59:25.077040 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-779f467bc5-7chvb"] Dec 06 05:59:25 crc kubenswrapper[4733]: I1206 05:59:25.085256 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-8ffc7"] Dec 06 05:59:25 crc kubenswrapper[4733]: I1206 05:59:25.105147 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-n8dgk"] Dec 06 05:59:25 crc kubenswrapper[4733]: W1206 05:59:25.143983 4733 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87d7b344_b58b_459a_89b3_7f09319c5a73.slice/crio-a549474bacfbcd0abcfdef93e22a1b04ec51d95f421db8add5ebf223e1c6de44 WatchSource:0}: Error finding container a549474bacfbcd0abcfdef93e22a1b04ec51d95f421db8add5ebf223e1c6de44: Status 404 returned error can't find the container with id a549474bacfbcd0abcfdef93e22a1b04ec51d95f421db8add5ebf223e1c6de44 Dec 06 05:59:25 crc kubenswrapper[4733]: W1206 05:59:25.145241 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod707d8771_4a40_42e7_b4bd_c2a9090126f0.slice/crio-85f1174ecebd0c56c24697b3227d6fe1f19f8d7be36c2bfff2ec63d85bb77da7 WatchSource:0}: Error finding container 85f1174ecebd0c56c24697b3227d6fe1f19f8d7be36c2bfff2ec63d85bb77da7: Status 404 returned error can't find the container with id 85f1174ecebd0c56c24697b3227d6fe1f19f8d7be36c2bfff2ec63d85bb77da7 Dec 06 05:59:25 crc kubenswrapper[4733]: W1206 05:59:25.146902 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod363efbb6_18f2_440b_bffd_f64dee6a3af7.slice/crio-f4ae4ee516bd9415e499cb46b8de6d5fcc536a21baf3a3cafb412e4a3c595ebb WatchSource:0}: Error finding container f4ae4ee516bd9415e499cb46b8de6d5fcc536a21baf3a3cafb412e4a3c595ebb: Status 404 returned error can't find the container with id f4ae4ee516bd9415e499cb46b8de6d5fcc536a21baf3a3cafb412e4a3c595ebb Dec 06 05:59:25 crc kubenswrapper[4733]: I1206 05:59:25.281273 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-pp9vb" Dec 06 05:59:25 crc kubenswrapper[4733]: I1206 05:59:25.288712 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-n8dgk" 
event={"ID":"363efbb6-18f2-440b-bffd-f64dee6a3af7","Type":"ContainerStarted","Data":"f4ae4ee516bd9415e499cb46b8de6d5fcc536a21baf3a3cafb412e4a3c595ebb"} Dec 06 05:59:25 crc kubenswrapper[4733]: I1206 05:59:25.290935 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-8ffc7" event={"ID":"707d8771-4a40-42e7-b4bd-c2a9090126f0","Type":"ContainerStarted","Data":"85f1174ecebd0c56c24697b3227d6fe1f19f8d7be36c2bfff2ec63d85bb77da7"} Dec 06 05:59:25 crc kubenswrapper[4733]: I1206 05:59:25.291958 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"361a8a1d-f083-427f-a625-eca6a714b768","Type":"ContainerStarted","Data":"81918e52ed8a85c2b58bb0ee03c9fc00767dc78c2bc96b2f6d03a1cd0af60545"} Dec 06 05:59:25 crc kubenswrapper[4733]: I1206 05:59:25.293166 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ck6x6" event={"ID":"706be213-5f03-414a-bdeb-98af90de90f4","Type":"ContainerStarted","Data":"1d2f948c405e31df4b5efe4581472878fcb2ebb353a87bde24b07372abe5927e"} Dec 06 05:59:25 crc kubenswrapper[4733]: I1206 05:59:25.293294 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ck6x6" event={"ID":"706be213-5f03-414a-bdeb-98af90de90f4","Type":"ContainerStarted","Data":"423a81d606ec6e867b74e7b758975d267201cff97ab41331b44d8e712972e2c0"} Dec 06 05:59:25 crc kubenswrapper[4733]: I1206 05:59:25.294095 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-vfkdm" event={"ID":"3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c","Type":"ContainerStarted","Data":"06b4a2a7e0aefdf89b0f4e967d569de47fe05d467a94c0f5e5a6dac209a6ee13"} Dec 06 05:59:25 crc kubenswrapper[4733]: I1206 05:59:25.295417 4733 generic.go:334] "Generic (PLEG): container finished" podID="1054e20c-7d23-4252-8162-6d088f90a2bb" containerID="8969b9e0f66d4108303365968c42e95e45417c94607a7105f9ddb3b751be5a52" exitCode=0 Dec 06 05:59:25 crc kubenswrapper[4733]: I1206 
05:59:25.295487 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-766ccc677c-vwwzt" event={"ID":"1054e20c-7d23-4252-8162-6d088f90a2bb","Type":"ContainerDied","Data":"8969b9e0f66d4108303365968c42e95e45417c94607a7105f9ddb3b751be5a52"} Dec 06 05:59:25 crc kubenswrapper[4733]: I1206 05:59:25.295506 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-766ccc677c-vwwzt" event={"ID":"1054e20c-7d23-4252-8162-6d088f90a2bb","Type":"ContainerStarted","Data":"0fabe73638982a8d832064ba4ea237cf9502843637e12cd699de44806d8ed9ad"} Dec 06 05:59:25 crc kubenswrapper[4733]: I1206 05:59:25.300515 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-cqh6m" event={"ID":"94621b3f-341c-4c06-8530-91d11c1ad8dc","Type":"ContainerStarted","Data":"a1e14f2c214fdf14361a7ec6ee8f1a2fa6578de72e8cb11775b194710d8b0ca5"} Dec 06 05:59:25 crc kubenswrapper[4733]: I1206 05:59:25.300587 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-cqh6m" event={"ID":"94621b3f-341c-4c06-8530-91d11c1ad8dc","Type":"ContainerStarted","Data":"e525a336c0b0e4af843c396f1522cca3cd6de913875de4767e64a1e48be03737"} Dec 06 05:59:25 crc kubenswrapper[4733]: I1206 05:59:25.302200 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-zz9tl" event={"ID":"bc8e93b6-7230-41f1-98f5-18b252d0d724","Type":"ContainerStarted","Data":"5750aba6e8db5431cdd23b1d4e950f8f205015ea4f37fbeff6cd344a29a4bae1"} Dec 06 05:59:25 crc kubenswrapper[4733]: I1206 05:59:25.304235 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-779f467bc5-7chvb" event={"ID":"87d7b344-b58b-459a-89b3-7f09319c5a73","Type":"ContainerStarted","Data":"a549474bacfbcd0abcfdef93e22a1b04ec51d95f421db8add5ebf223e1c6de44"} Dec 06 05:59:25 crc kubenswrapper[4733]: I1206 05:59:25.383453 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-ck6x6" 
podStartSLOduration=2.383435075 podStartE2EDuration="2.383435075s" podCreationTimestamp="2025-12-06 05:59:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:59:25.318344549 +0000 UTC m=+949.183555650" watchObservedRunningTime="2025-12-06 05:59:25.383435075 +0000 UTC m=+949.248646186" Dec 06 05:59:25 crc kubenswrapper[4733]: I1206 05:59:25.405407 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pp9vb"] Dec 06 05:59:25 crc kubenswrapper[4733]: I1206 05:59:25.409885 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-zz9tl" podStartSLOduration=2.2800581380000002 podStartE2EDuration="38.409870565s" podCreationTimestamp="2025-12-06 05:58:47 +0000 UTC" firstStartedPulling="2025-12-06 05:58:48.178950675 +0000 UTC m=+912.044161777" lastFinishedPulling="2025-12-06 05:59:24.308763092 +0000 UTC m=+948.173974204" observedRunningTime="2025-12-06 05:59:25.358273542 +0000 UTC m=+949.223484653" watchObservedRunningTime="2025-12-06 05:59:25.409870565 +0000 UTC m=+949.275081676" Dec 06 05:59:25 crc kubenswrapper[4733]: I1206 05:59:25.411082 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-cqh6m" podStartSLOduration=2.411078215 podStartE2EDuration="2.411078215s" podCreationTimestamp="2025-12-06 05:59:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:59:25.371728212 +0000 UTC m=+949.236939323" watchObservedRunningTime="2025-12-06 05:59:25.411078215 +0000 UTC m=+949.276289327" Dec 06 05:59:25 crc kubenswrapper[4733]: I1206 05:59:25.624597 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-766ccc677c-vwwzt" Dec 06 05:59:25 crc kubenswrapper[4733]: I1206 05:59:25.779028 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1054e20c-7d23-4252-8162-6d088f90a2bb-config\") pod \"1054e20c-7d23-4252-8162-6d088f90a2bb\" (UID: \"1054e20c-7d23-4252-8162-6d088f90a2bb\") " Dec 06 05:59:25 crc kubenswrapper[4733]: I1206 05:59:25.779192 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1054e20c-7d23-4252-8162-6d088f90a2bb-ovsdbserver-sb\") pod \"1054e20c-7d23-4252-8162-6d088f90a2bb\" (UID: \"1054e20c-7d23-4252-8162-6d088f90a2bb\") " Dec 06 05:59:25 crc kubenswrapper[4733]: I1206 05:59:25.779233 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1054e20c-7d23-4252-8162-6d088f90a2bb-ovsdbserver-nb\") pod \"1054e20c-7d23-4252-8162-6d088f90a2bb\" (UID: \"1054e20c-7d23-4252-8162-6d088f90a2bb\") " Dec 06 05:59:25 crc kubenswrapper[4733]: I1206 05:59:25.779272 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkkmn\" (UniqueName: \"kubernetes.io/projected/1054e20c-7d23-4252-8162-6d088f90a2bb-kube-api-access-jkkmn\") pod \"1054e20c-7d23-4252-8162-6d088f90a2bb\" (UID: \"1054e20c-7d23-4252-8162-6d088f90a2bb\") " Dec 06 05:59:25 crc kubenswrapper[4733]: I1206 05:59:25.779336 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1054e20c-7d23-4252-8162-6d088f90a2bb-dns-swift-storage-0\") pod \"1054e20c-7d23-4252-8162-6d088f90a2bb\" (UID: \"1054e20c-7d23-4252-8162-6d088f90a2bb\") " Dec 06 05:59:25 crc kubenswrapper[4733]: I1206 05:59:25.779492 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1054e20c-7d23-4252-8162-6d088f90a2bb-dns-svc\") pod \"1054e20c-7d23-4252-8162-6d088f90a2bb\" (UID: \"1054e20c-7d23-4252-8162-6d088f90a2bb\") " Dec 06 05:59:25 crc kubenswrapper[4733]: I1206 05:59:25.801530 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1054e20c-7d23-4252-8162-6d088f90a2bb-kube-api-access-jkkmn" (OuterVolumeSpecName: "kube-api-access-jkkmn") pod "1054e20c-7d23-4252-8162-6d088f90a2bb" (UID: "1054e20c-7d23-4252-8162-6d088f90a2bb"). InnerVolumeSpecName "kube-api-access-jkkmn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:59:25 crc kubenswrapper[4733]: I1206 05:59:25.801964 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1054e20c-7d23-4252-8162-6d088f90a2bb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1054e20c-7d23-4252-8162-6d088f90a2bb" (UID: "1054e20c-7d23-4252-8162-6d088f90a2bb"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:59:25 crc kubenswrapper[4733]: I1206 05:59:25.815004 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1054e20c-7d23-4252-8162-6d088f90a2bb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1054e20c-7d23-4252-8162-6d088f90a2bb" (UID: "1054e20c-7d23-4252-8162-6d088f90a2bb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:59:25 crc kubenswrapper[4733]: I1206 05:59:25.827916 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1054e20c-7d23-4252-8162-6d088f90a2bb-config" (OuterVolumeSpecName: "config") pod "1054e20c-7d23-4252-8162-6d088f90a2bb" (UID: "1054e20c-7d23-4252-8162-6d088f90a2bb"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:59:25 crc kubenswrapper[4733]: I1206 05:59:25.828063 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1054e20c-7d23-4252-8162-6d088f90a2bb-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1054e20c-7d23-4252-8162-6d088f90a2bb" (UID: "1054e20c-7d23-4252-8162-6d088f90a2bb"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:59:25 crc kubenswrapper[4733]: I1206 05:59:25.839601 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1054e20c-7d23-4252-8162-6d088f90a2bb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1054e20c-7d23-4252-8162-6d088f90a2bb" (UID: "1054e20c-7d23-4252-8162-6d088f90a2bb"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:59:25 crc kubenswrapper[4733]: I1206 05:59:25.884850 4733 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1054e20c-7d23-4252-8162-6d088f90a2bb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:25 crc kubenswrapper[4733]: I1206 05:59:25.884878 4733 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1054e20c-7d23-4252-8162-6d088f90a2bb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:25 crc kubenswrapper[4733]: I1206 05:59:25.884888 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkkmn\" (UniqueName: \"kubernetes.io/projected/1054e20c-7d23-4252-8162-6d088f90a2bb-kube-api-access-jkkmn\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:25 crc kubenswrapper[4733]: I1206 05:59:25.884899 4733 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/1054e20c-7d23-4252-8162-6d088f90a2bb-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:25 crc kubenswrapper[4733]: I1206 05:59:25.884908 4733 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1054e20c-7d23-4252-8162-6d088f90a2bb-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:25 crc kubenswrapper[4733]: I1206 05:59:25.884916 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1054e20c-7d23-4252-8162-6d088f90a2bb-config\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:26 crc kubenswrapper[4733]: I1206 05:59:26.066486 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 05:59:26 crc kubenswrapper[4733]: I1206 05:59:26.334213 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-766ccc677c-vwwzt" Dec 06 05:59:26 crc kubenswrapper[4733]: I1206 05:59:26.334209 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-766ccc677c-vwwzt" event={"ID":"1054e20c-7d23-4252-8162-6d088f90a2bb","Type":"ContainerDied","Data":"0fabe73638982a8d832064ba4ea237cf9502843637e12cd699de44806d8ed9ad"} Dec 06 05:59:26 crc kubenswrapper[4733]: I1206 05:59:26.334389 4733 scope.go:117] "RemoveContainer" containerID="8969b9e0f66d4108303365968c42e95e45417c94607a7105f9ddb3b751be5a52" Dec 06 05:59:26 crc kubenswrapper[4733]: I1206 05:59:26.341148 4733 generic.go:334] "Generic (PLEG): container finished" podID="87d7b344-b58b-459a-89b3-7f09319c5a73" containerID="e87b0ac08db170bd3e2b5c1656644187d0c0de7faab6f4116961babd726a799f" exitCode=0 Dec 06 05:59:26 crc kubenswrapper[4733]: I1206 05:59:26.341283 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-779f467bc5-7chvb" 
event={"ID":"87d7b344-b58b-459a-89b3-7f09319c5a73","Type":"ContainerDied","Data":"e87b0ac08db170bd3e2b5c1656644187d0c0de7faab6f4116961babd726a799f"} Dec 06 05:59:26 crc kubenswrapper[4733]: I1206 05:59:26.341628 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-pp9vb" podUID="8f01ee5d-d8b9-4464-9ec9-ba71b0080b97" containerName="registry-server" containerID="cri-o://aeaa6f6d07f8d0c1e39936caeaa2f95c41bb76e4817843142e0d8f3adcc8b830" gracePeriod=2 Dec 06 05:59:26 crc kubenswrapper[4733]: I1206 05:59:26.533455 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-766ccc677c-vwwzt"] Dec 06 05:59:26 crc kubenswrapper[4733]: I1206 05:59:26.545144 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-766ccc677c-vwwzt"] Dec 06 05:59:27 crc kubenswrapper[4733]: I1206 05:59:27.013788 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pp9vb" Dec 06 05:59:27 crc kubenswrapper[4733]: I1206 05:59:27.130832 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f01ee5d-d8b9-4464-9ec9-ba71b0080b97-utilities\") pod \"8f01ee5d-d8b9-4464-9ec9-ba71b0080b97\" (UID: \"8f01ee5d-d8b9-4464-9ec9-ba71b0080b97\") " Dec 06 05:59:27 crc kubenswrapper[4733]: I1206 05:59:27.131250 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f01ee5d-d8b9-4464-9ec9-ba71b0080b97-catalog-content\") pod \"8f01ee5d-d8b9-4464-9ec9-ba71b0080b97\" (UID: \"8f01ee5d-d8b9-4464-9ec9-ba71b0080b97\") " Dec 06 05:59:27 crc kubenswrapper[4733]: I1206 05:59:27.131353 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btxr4\" (UniqueName: 
\"kubernetes.io/projected/8f01ee5d-d8b9-4464-9ec9-ba71b0080b97-kube-api-access-btxr4\") pod \"8f01ee5d-d8b9-4464-9ec9-ba71b0080b97\" (UID: \"8f01ee5d-d8b9-4464-9ec9-ba71b0080b97\") " Dec 06 05:59:27 crc kubenswrapper[4733]: I1206 05:59:27.132343 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f01ee5d-d8b9-4464-9ec9-ba71b0080b97-utilities" (OuterVolumeSpecName: "utilities") pod "8f01ee5d-d8b9-4464-9ec9-ba71b0080b97" (UID: "8f01ee5d-d8b9-4464-9ec9-ba71b0080b97"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 05:59:27 crc kubenswrapper[4733]: I1206 05:59:27.136519 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f01ee5d-d8b9-4464-9ec9-ba71b0080b97-kube-api-access-btxr4" (OuterVolumeSpecName: "kube-api-access-btxr4") pod "8f01ee5d-d8b9-4464-9ec9-ba71b0080b97" (UID: "8f01ee5d-d8b9-4464-9ec9-ba71b0080b97"). InnerVolumeSpecName "kube-api-access-btxr4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:59:27 crc kubenswrapper[4733]: I1206 05:59:27.212220 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f01ee5d-d8b9-4464-9ec9-ba71b0080b97-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8f01ee5d-d8b9-4464-9ec9-ba71b0080b97" (UID: "8f01ee5d-d8b9-4464-9ec9-ba71b0080b97"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 05:59:27 crc kubenswrapper[4733]: I1206 05:59:27.234507 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btxr4\" (UniqueName: \"kubernetes.io/projected/8f01ee5d-d8b9-4464-9ec9-ba71b0080b97-kube-api-access-btxr4\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:27 crc kubenswrapper[4733]: I1206 05:59:27.234546 4733 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f01ee5d-d8b9-4464-9ec9-ba71b0080b97-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:27 crc kubenswrapper[4733]: I1206 05:59:27.234557 4733 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f01ee5d-d8b9-4464-9ec9-ba71b0080b97-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:27 crc kubenswrapper[4733]: I1206 05:59:27.375422 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-779f467bc5-7chvb" event={"ID":"87d7b344-b58b-459a-89b3-7f09319c5a73","Type":"ContainerStarted","Data":"06b6e1c69b05a774cbc225126d6dd6c5233dc87cedf303d08b824602e2807067"} Dec 06 05:59:27 crc kubenswrapper[4733]: I1206 05:59:27.377646 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-779f467bc5-7chvb" Dec 06 05:59:27 crc kubenswrapper[4733]: I1206 05:59:27.382178 4733 generic.go:334] "Generic (PLEG): container finished" podID="8f01ee5d-d8b9-4464-9ec9-ba71b0080b97" containerID="aeaa6f6d07f8d0c1e39936caeaa2f95c41bb76e4817843142e0d8f3adcc8b830" exitCode=0 Dec 06 05:59:27 crc kubenswrapper[4733]: I1206 05:59:27.382229 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pp9vb" event={"ID":"8f01ee5d-d8b9-4464-9ec9-ba71b0080b97","Type":"ContainerDied","Data":"aeaa6f6d07f8d0c1e39936caeaa2f95c41bb76e4817843142e0d8f3adcc8b830"} Dec 06 05:59:27 crc kubenswrapper[4733]: I1206 05:59:27.382257 
4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pp9vb" event={"ID":"8f01ee5d-d8b9-4464-9ec9-ba71b0080b97","Type":"ContainerDied","Data":"c6646f2d31c535229b214f68244561a2173880fa854f54ee8be544bac0b1b5a1"} Dec 06 05:59:27 crc kubenswrapper[4733]: I1206 05:59:27.382275 4733 scope.go:117] "RemoveContainer" containerID="aeaa6f6d07f8d0c1e39936caeaa2f95c41bb76e4817843142e0d8f3adcc8b830" Dec 06 05:59:27 crc kubenswrapper[4733]: I1206 05:59:27.382925 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pp9vb" Dec 06 05:59:27 crc kubenswrapper[4733]: I1206 05:59:27.400996 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-779f467bc5-7chvb" podStartSLOduration=4.400981738 podStartE2EDuration="4.400981738s" podCreationTimestamp="2025-12-06 05:59:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:59:27.392955745 +0000 UTC m=+951.258166856" watchObservedRunningTime="2025-12-06 05:59:27.400981738 +0000 UTC m=+951.266192869" Dec 06 05:59:27 crc kubenswrapper[4733]: I1206 05:59:27.431674 4733 scope.go:117] "RemoveContainer" containerID="0d8efbd5743d628845032956c952c612f031e503cfc61df4d4374802c983529d" Dec 06 05:59:27 crc kubenswrapper[4733]: I1206 05:59:27.453685 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pp9vb"] Dec 06 05:59:27 crc kubenswrapper[4733]: I1206 05:59:27.462402 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-pp9vb"] Dec 06 05:59:27 crc kubenswrapper[4733]: I1206 05:59:27.490863 4733 scope.go:117] "RemoveContainer" containerID="af7ac0c227129e38a0e24fe43b27dfe99d1fd504450d31b46a60e371651c6c95" Dec 06 05:59:27 crc kubenswrapper[4733]: I1206 05:59:27.584998 4733 scope.go:117] 
"RemoveContainer" containerID="aeaa6f6d07f8d0c1e39936caeaa2f95c41bb76e4817843142e0d8f3adcc8b830" Dec 06 05:59:27 crc kubenswrapper[4733]: E1206 05:59:27.586044 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aeaa6f6d07f8d0c1e39936caeaa2f95c41bb76e4817843142e0d8f3adcc8b830\": container with ID starting with aeaa6f6d07f8d0c1e39936caeaa2f95c41bb76e4817843142e0d8f3adcc8b830 not found: ID does not exist" containerID="aeaa6f6d07f8d0c1e39936caeaa2f95c41bb76e4817843142e0d8f3adcc8b830" Dec 06 05:59:27 crc kubenswrapper[4733]: I1206 05:59:27.586097 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aeaa6f6d07f8d0c1e39936caeaa2f95c41bb76e4817843142e0d8f3adcc8b830"} err="failed to get container status \"aeaa6f6d07f8d0c1e39936caeaa2f95c41bb76e4817843142e0d8f3adcc8b830\": rpc error: code = NotFound desc = could not find container \"aeaa6f6d07f8d0c1e39936caeaa2f95c41bb76e4817843142e0d8f3adcc8b830\": container with ID starting with aeaa6f6d07f8d0c1e39936caeaa2f95c41bb76e4817843142e0d8f3adcc8b830 not found: ID does not exist" Dec 06 05:59:27 crc kubenswrapper[4733]: I1206 05:59:27.586128 4733 scope.go:117] "RemoveContainer" containerID="0d8efbd5743d628845032956c952c612f031e503cfc61df4d4374802c983529d" Dec 06 05:59:27 crc kubenswrapper[4733]: E1206 05:59:27.587759 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d8efbd5743d628845032956c952c612f031e503cfc61df4d4374802c983529d\": container with ID starting with 0d8efbd5743d628845032956c952c612f031e503cfc61df4d4374802c983529d not found: ID does not exist" containerID="0d8efbd5743d628845032956c952c612f031e503cfc61df4d4374802c983529d" Dec 06 05:59:27 crc kubenswrapper[4733]: I1206 05:59:27.587857 4733 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0d8efbd5743d628845032956c952c612f031e503cfc61df4d4374802c983529d"} err="failed to get container status \"0d8efbd5743d628845032956c952c612f031e503cfc61df4d4374802c983529d\": rpc error: code = NotFound desc = could not find container \"0d8efbd5743d628845032956c952c612f031e503cfc61df4d4374802c983529d\": container with ID starting with 0d8efbd5743d628845032956c952c612f031e503cfc61df4d4374802c983529d not found: ID does not exist" Dec 06 05:59:27 crc kubenswrapper[4733]: I1206 05:59:27.587930 4733 scope.go:117] "RemoveContainer" containerID="af7ac0c227129e38a0e24fe43b27dfe99d1fd504450d31b46a60e371651c6c95" Dec 06 05:59:27 crc kubenswrapper[4733]: E1206 05:59:27.588573 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af7ac0c227129e38a0e24fe43b27dfe99d1fd504450d31b46a60e371651c6c95\": container with ID starting with af7ac0c227129e38a0e24fe43b27dfe99d1fd504450d31b46a60e371651c6c95 not found: ID does not exist" containerID="af7ac0c227129e38a0e24fe43b27dfe99d1fd504450d31b46a60e371651c6c95" Dec 06 05:59:27 crc kubenswrapper[4733]: I1206 05:59:27.588662 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af7ac0c227129e38a0e24fe43b27dfe99d1fd504450d31b46a60e371651c6c95"} err="failed to get container status \"af7ac0c227129e38a0e24fe43b27dfe99d1fd504450d31b46a60e371651c6c95\": rpc error: code = NotFound desc = could not find container \"af7ac0c227129e38a0e24fe43b27dfe99d1fd504450d31b46a60e371651c6c95\": container with ID starting with af7ac0c227129e38a0e24fe43b27dfe99d1fd504450d31b46a60e371651c6c95 not found: ID does not exist" Dec 06 05:59:28 crc kubenswrapper[4733]: I1206 05:59:28.495153 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1054e20c-7d23-4252-8162-6d088f90a2bb" path="/var/lib/kubelet/pods/1054e20c-7d23-4252-8162-6d088f90a2bb/volumes" Dec 06 05:59:28 crc kubenswrapper[4733]: I1206 
05:59:28.495768 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f01ee5d-d8b9-4464-9ec9-ba71b0080b97" path="/var/lib/kubelet/pods/8f01ee5d-d8b9-4464-9ec9-ba71b0080b97/volumes" Dec 06 05:59:29 crc kubenswrapper[4733]: I1206 05:59:29.403894 4733 generic.go:334] "Generic (PLEG): container finished" podID="706be213-5f03-414a-bdeb-98af90de90f4" containerID="1d2f948c405e31df4b5efe4581472878fcb2ebb353a87bde24b07372abe5927e" exitCode=0 Dec 06 05:59:29 crc kubenswrapper[4733]: I1206 05:59:29.403979 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ck6x6" event={"ID":"706be213-5f03-414a-bdeb-98af90de90f4","Type":"ContainerDied","Data":"1d2f948c405e31df4b5efe4581472878fcb2ebb353a87bde24b07372abe5927e"} Dec 06 05:59:31 crc kubenswrapper[4733]: I1206 05:59:31.424790 4733 generic.go:334] "Generic (PLEG): container finished" podID="bc8e93b6-7230-41f1-98f5-18b252d0d724" containerID="5750aba6e8db5431cdd23b1d4e950f8f205015ea4f37fbeff6cd344a29a4bae1" exitCode=0 Dec 06 05:59:31 crc kubenswrapper[4733]: I1206 05:59:31.424873 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-zz9tl" event={"ID":"bc8e93b6-7230-41f1-98f5-18b252d0d724","Type":"ContainerDied","Data":"5750aba6e8db5431cdd23b1d4e950f8f205015ea4f37fbeff6cd344a29a4bae1"} Dec 06 05:59:31 crc kubenswrapper[4733]: I1206 05:59:31.990605 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nqsf6" Dec 06 05:59:32 crc kubenswrapper[4733]: I1206 05:59:32.036692 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nqsf6" Dec 06 05:59:32 crc kubenswrapper[4733]: I1206 05:59:32.225724 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nqsf6"] Dec 06 05:59:33 crc kubenswrapper[4733]: I1206 05:59:33.444105 4733 generic.go:334] "Generic (PLEG): container finished" 
podID="94621b3f-341c-4c06-8530-91d11c1ad8dc" containerID="a1e14f2c214fdf14361a7ec6ee8f1a2fa6578de72e8cb11775b194710d8b0ca5" exitCode=0 Dec 06 05:59:33 crc kubenswrapper[4733]: I1206 05:59:33.444187 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-cqh6m" event={"ID":"94621b3f-341c-4c06-8530-91d11c1ad8dc","Type":"ContainerDied","Data":"a1e14f2c214fdf14361a7ec6ee8f1a2fa6578de72e8cb11775b194710d8b0ca5"} Dec 06 05:59:33 crc kubenswrapper[4733]: I1206 05:59:33.444643 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nqsf6" podUID="8de7ee8b-f6b5-4014-a78f-bad2e92ddd2d" containerName="registry-server" containerID="cri-o://5dedf9f117dc352810d3da1ae4ee162c7e6dd9af333ee62950750576d5ef7f8c" gracePeriod=2 Dec 06 05:59:34 crc kubenswrapper[4733]: I1206 05:59:34.458067 4733 generic.go:334] "Generic (PLEG): container finished" podID="8de7ee8b-f6b5-4014-a78f-bad2e92ddd2d" containerID="5dedf9f117dc352810d3da1ae4ee162c7e6dd9af333ee62950750576d5ef7f8c" exitCode=0 Dec 06 05:59:34 crc kubenswrapper[4733]: I1206 05:59:34.458164 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nqsf6" event={"ID":"8de7ee8b-f6b5-4014-a78f-bad2e92ddd2d","Type":"ContainerDied","Data":"5dedf9f117dc352810d3da1ae4ee162c7e6dd9af333ee62950750576d5ef7f8c"} Dec 06 05:59:34 crc kubenswrapper[4733]: I1206 05:59:34.496292 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-779f467bc5-7chvb" Dec 06 05:59:34 crc kubenswrapper[4733]: I1206 05:59:34.549615 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8c486c6ff-fq7t7"] Dec 06 05:59:34 crc kubenswrapper[4733]: I1206 05:59:34.549838 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8c486c6ff-fq7t7" podUID="2fff7d32-f796-45d1-a13b-1b2286e593c3" containerName="dnsmasq-dns" 
containerID="cri-o://324f7103a1749cb08562ac993230e58fa371d88936dff071694084ec26c4f1d2" gracePeriod=10 Dec 06 05:59:35 crc kubenswrapper[4733]: I1206 05:59:35.476372 4733 generic.go:334] "Generic (PLEG): container finished" podID="2fff7d32-f796-45d1-a13b-1b2286e593c3" containerID="324f7103a1749cb08562ac993230e58fa371d88936dff071694084ec26c4f1d2" exitCode=0 Dec 06 05:59:35 crc kubenswrapper[4733]: I1206 05:59:35.476458 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8c486c6ff-fq7t7" event={"ID":"2fff7d32-f796-45d1-a13b-1b2286e593c3","Type":"ContainerDied","Data":"324f7103a1749cb08562ac993230e58fa371d88936dff071694084ec26c4f1d2"} Dec 06 05:59:36 crc kubenswrapper[4733]: I1206 05:59:36.006984 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ck6x6" Dec 06 05:59:36 crc kubenswrapper[4733]: I1206 05:59:36.016279 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-zz9tl" Dec 06 05:59:36 crc kubenswrapper[4733]: I1206 05:59:36.169361 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vt7t\" (UniqueName: \"kubernetes.io/projected/bc8e93b6-7230-41f1-98f5-18b252d0d724-kube-api-access-7vt7t\") pod \"bc8e93b6-7230-41f1-98f5-18b252d0d724\" (UID: \"bc8e93b6-7230-41f1-98f5-18b252d0d724\") " Dec 06 05:59:36 crc kubenswrapper[4733]: I1206 05:59:36.169434 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/706be213-5f03-414a-bdeb-98af90de90f4-credential-keys\") pod \"706be213-5f03-414a-bdeb-98af90de90f4\" (UID: \"706be213-5f03-414a-bdeb-98af90de90f4\") " Dec 06 05:59:36 crc kubenswrapper[4733]: I1206 05:59:36.169467 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9sj8\" (UniqueName: 
\"kubernetes.io/projected/706be213-5f03-414a-bdeb-98af90de90f4-kube-api-access-z9sj8\") pod \"706be213-5f03-414a-bdeb-98af90de90f4\" (UID: \"706be213-5f03-414a-bdeb-98af90de90f4\") " Dec 06 05:59:36 crc kubenswrapper[4733]: I1206 05:59:36.169495 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/706be213-5f03-414a-bdeb-98af90de90f4-scripts\") pod \"706be213-5f03-414a-bdeb-98af90de90f4\" (UID: \"706be213-5f03-414a-bdeb-98af90de90f4\") " Dec 06 05:59:36 crc kubenswrapper[4733]: I1206 05:59:36.169510 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8e93b6-7230-41f1-98f5-18b252d0d724-combined-ca-bundle\") pod \"bc8e93b6-7230-41f1-98f5-18b252d0d724\" (UID: \"bc8e93b6-7230-41f1-98f5-18b252d0d724\") " Dec 06 05:59:36 crc kubenswrapper[4733]: I1206 05:59:36.169540 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/706be213-5f03-414a-bdeb-98af90de90f4-fernet-keys\") pod \"706be213-5f03-414a-bdeb-98af90de90f4\" (UID: \"706be213-5f03-414a-bdeb-98af90de90f4\") " Dec 06 05:59:36 crc kubenswrapper[4733]: I1206 05:59:36.169563 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/706be213-5f03-414a-bdeb-98af90de90f4-config-data\") pod \"706be213-5f03-414a-bdeb-98af90de90f4\" (UID: \"706be213-5f03-414a-bdeb-98af90de90f4\") " Dec 06 05:59:36 crc kubenswrapper[4733]: I1206 05:59:36.169580 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/706be213-5f03-414a-bdeb-98af90de90f4-combined-ca-bundle\") pod \"706be213-5f03-414a-bdeb-98af90de90f4\" (UID: \"706be213-5f03-414a-bdeb-98af90de90f4\") " Dec 06 05:59:36 crc kubenswrapper[4733]: I1206 05:59:36.169625 4733 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc8e93b6-7230-41f1-98f5-18b252d0d724-config-data\") pod \"bc8e93b6-7230-41f1-98f5-18b252d0d724\" (UID: \"bc8e93b6-7230-41f1-98f5-18b252d0d724\") " Dec 06 05:59:36 crc kubenswrapper[4733]: I1206 05:59:36.169642 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bc8e93b6-7230-41f1-98f5-18b252d0d724-db-sync-config-data\") pod \"bc8e93b6-7230-41f1-98f5-18b252d0d724\" (UID: \"bc8e93b6-7230-41f1-98f5-18b252d0d724\") " Dec 06 05:59:36 crc kubenswrapper[4733]: I1206 05:59:36.203769 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc8e93b6-7230-41f1-98f5-18b252d0d724-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "bc8e93b6-7230-41f1-98f5-18b252d0d724" (UID: "bc8e93b6-7230-41f1-98f5-18b252d0d724"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:59:36 crc kubenswrapper[4733]: I1206 05:59:36.216761 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/706be213-5f03-414a-bdeb-98af90de90f4-scripts" (OuterVolumeSpecName: "scripts") pod "706be213-5f03-414a-bdeb-98af90de90f4" (UID: "706be213-5f03-414a-bdeb-98af90de90f4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:59:36 crc kubenswrapper[4733]: I1206 05:59:36.217773 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/706be213-5f03-414a-bdeb-98af90de90f4-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "706be213-5f03-414a-bdeb-98af90de90f4" (UID: "706be213-5f03-414a-bdeb-98af90de90f4"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:59:36 crc kubenswrapper[4733]: I1206 05:59:36.219556 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc8e93b6-7230-41f1-98f5-18b252d0d724-kube-api-access-7vt7t" (OuterVolumeSpecName: "kube-api-access-7vt7t") pod "bc8e93b6-7230-41f1-98f5-18b252d0d724" (UID: "bc8e93b6-7230-41f1-98f5-18b252d0d724"). InnerVolumeSpecName "kube-api-access-7vt7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:59:36 crc kubenswrapper[4733]: I1206 05:59:36.223646 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/706be213-5f03-414a-bdeb-98af90de90f4-kube-api-access-z9sj8" (OuterVolumeSpecName: "kube-api-access-z9sj8") pod "706be213-5f03-414a-bdeb-98af90de90f4" (UID: "706be213-5f03-414a-bdeb-98af90de90f4"). InnerVolumeSpecName "kube-api-access-z9sj8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:59:36 crc kubenswrapper[4733]: I1206 05:59:36.240701 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/706be213-5f03-414a-bdeb-98af90de90f4-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "706be213-5f03-414a-bdeb-98af90de90f4" (UID: "706be213-5f03-414a-bdeb-98af90de90f4"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:59:36 crc kubenswrapper[4733]: I1206 05:59:36.257448 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/706be213-5f03-414a-bdeb-98af90de90f4-config-data" (OuterVolumeSpecName: "config-data") pod "706be213-5f03-414a-bdeb-98af90de90f4" (UID: "706be213-5f03-414a-bdeb-98af90de90f4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:59:36 crc kubenswrapper[4733]: I1206 05:59:36.275230 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vt7t\" (UniqueName: \"kubernetes.io/projected/bc8e93b6-7230-41f1-98f5-18b252d0d724-kube-api-access-7vt7t\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:36 crc kubenswrapper[4733]: I1206 05:59:36.275355 4733 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/706be213-5f03-414a-bdeb-98af90de90f4-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:36 crc kubenswrapper[4733]: I1206 05:59:36.275443 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9sj8\" (UniqueName: \"kubernetes.io/projected/706be213-5f03-414a-bdeb-98af90de90f4-kube-api-access-z9sj8\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:36 crc kubenswrapper[4733]: I1206 05:59:36.275519 4733 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/706be213-5f03-414a-bdeb-98af90de90f4-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:36 crc kubenswrapper[4733]: I1206 05:59:36.275595 4733 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/706be213-5f03-414a-bdeb-98af90de90f4-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:36 crc kubenswrapper[4733]: I1206 05:59:36.275650 4733 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/706be213-5f03-414a-bdeb-98af90de90f4-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:36 crc kubenswrapper[4733]: I1206 05:59:36.275719 4733 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bc8e93b6-7230-41f1-98f5-18b252d0d724-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:36 crc kubenswrapper[4733]: I1206 05:59:36.295271 4733 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc8e93b6-7230-41f1-98f5-18b252d0d724-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bc8e93b6-7230-41f1-98f5-18b252d0d724" (UID: "bc8e93b6-7230-41f1-98f5-18b252d0d724"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:59:36 crc kubenswrapper[4733]: I1206 05:59:36.324154 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/706be213-5f03-414a-bdeb-98af90de90f4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "706be213-5f03-414a-bdeb-98af90de90f4" (UID: "706be213-5f03-414a-bdeb-98af90de90f4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:59:36 crc kubenswrapper[4733]: I1206 05:59:36.346395 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc8e93b6-7230-41f1-98f5-18b252d0d724-config-data" (OuterVolumeSpecName: "config-data") pod "bc8e93b6-7230-41f1-98f5-18b252d0d724" (UID: "bc8e93b6-7230-41f1-98f5-18b252d0d724"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:59:36 crc kubenswrapper[4733]: I1206 05:59:36.379529 4733 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8e93b6-7230-41f1-98f5-18b252d0d724-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:36 crc kubenswrapper[4733]: I1206 05:59:36.379557 4733 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/706be213-5f03-414a-bdeb-98af90de90f4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:36 crc kubenswrapper[4733]: I1206 05:59:36.379566 4733 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc8e93b6-7230-41f1-98f5-18b252d0d724-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:36 crc kubenswrapper[4733]: I1206 05:59:36.492292 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-zz9tl" Dec 06 05:59:36 crc kubenswrapper[4733]: I1206 05:59:36.494578 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-zz9tl" event={"ID":"bc8e93b6-7230-41f1-98f5-18b252d0d724","Type":"ContainerDied","Data":"7cfca0dcb17c44785b5ff7a5c510d8c81bf2ed4bd5ad2d1702dc4e5d0a69c9af"} Dec 06 05:59:36 crc kubenswrapper[4733]: I1206 05:59:36.494620 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7cfca0dcb17c44785b5ff7a5c510d8c81bf2ed4bd5ad2d1702dc4e5d0a69c9af" Dec 06 05:59:36 crc kubenswrapper[4733]: I1206 05:59:36.495916 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ck6x6" event={"ID":"706be213-5f03-414a-bdeb-98af90de90f4","Type":"ContainerDied","Data":"423a81d606ec6e867b74e7b758975d267201cff97ab41331b44d8e712972e2c0"} Dec 06 05:59:36 crc kubenswrapper[4733]: I1206 05:59:36.495960 4733 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="423a81d606ec6e867b74e7b758975d267201cff97ab41331b44d8e712972e2c0" Dec 06 05:59:36 crc kubenswrapper[4733]: I1206 05:59:36.496003 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ck6x6" Dec 06 05:59:37 crc kubenswrapper[4733]: I1206 05:59:37.105032 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-ck6x6"] Dec 06 05:59:37 crc kubenswrapper[4733]: I1206 05:59:37.112959 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-ck6x6"] Dec 06 05:59:37 crc kubenswrapper[4733]: I1206 05:59:37.192086 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-4twth"] Dec 06 05:59:37 crc kubenswrapper[4733]: E1206 05:59:37.192559 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f01ee5d-d8b9-4464-9ec9-ba71b0080b97" containerName="extract-content" Dec 06 05:59:37 crc kubenswrapper[4733]: I1206 05:59:37.192580 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f01ee5d-d8b9-4464-9ec9-ba71b0080b97" containerName="extract-content" Dec 06 05:59:37 crc kubenswrapper[4733]: E1206 05:59:37.192597 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1054e20c-7d23-4252-8162-6d088f90a2bb" containerName="init" Dec 06 05:59:37 crc kubenswrapper[4733]: I1206 05:59:37.192604 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="1054e20c-7d23-4252-8162-6d088f90a2bb" containerName="init" Dec 06 05:59:37 crc kubenswrapper[4733]: E1206 05:59:37.192617 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc8e93b6-7230-41f1-98f5-18b252d0d724" containerName="glance-db-sync" Dec 06 05:59:37 crc kubenswrapper[4733]: I1206 05:59:37.192623 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc8e93b6-7230-41f1-98f5-18b252d0d724" containerName="glance-db-sync" Dec 06 05:59:37 crc kubenswrapper[4733]: E1206 05:59:37.192640 4733 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f01ee5d-d8b9-4464-9ec9-ba71b0080b97" containerName="registry-server" Dec 06 05:59:37 crc kubenswrapper[4733]: I1206 05:59:37.192646 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f01ee5d-d8b9-4464-9ec9-ba71b0080b97" containerName="registry-server" Dec 06 05:59:37 crc kubenswrapper[4733]: E1206 05:59:37.192655 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="706be213-5f03-414a-bdeb-98af90de90f4" containerName="keystone-bootstrap" Dec 06 05:59:37 crc kubenswrapper[4733]: I1206 05:59:37.192661 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="706be213-5f03-414a-bdeb-98af90de90f4" containerName="keystone-bootstrap" Dec 06 05:59:37 crc kubenswrapper[4733]: E1206 05:59:37.192672 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f01ee5d-d8b9-4464-9ec9-ba71b0080b97" containerName="extract-utilities" Dec 06 05:59:37 crc kubenswrapper[4733]: I1206 05:59:37.192688 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f01ee5d-d8b9-4464-9ec9-ba71b0080b97" containerName="extract-utilities" Dec 06 05:59:37 crc kubenswrapper[4733]: I1206 05:59:37.192898 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="706be213-5f03-414a-bdeb-98af90de90f4" containerName="keystone-bootstrap" Dec 06 05:59:37 crc kubenswrapper[4733]: I1206 05:59:37.192913 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc8e93b6-7230-41f1-98f5-18b252d0d724" containerName="glance-db-sync" Dec 06 05:59:37 crc kubenswrapper[4733]: I1206 05:59:37.192923 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f01ee5d-d8b9-4464-9ec9-ba71b0080b97" containerName="registry-server" Dec 06 05:59:37 crc kubenswrapper[4733]: I1206 05:59:37.192934 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="1054e20c-7d23-4252-8162-6d088f90a2bb" containerName="init" Dec 06 05:59:37 crc kubenswrapper[4733]: I1206 05:59:37.193631 
4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-4twth" Dec 06 05:59:37 crc kubenswrapper[4733]: I1206 05:59:37.196074 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 06 05:59:37 crc kubenswrapper[4733]: I1206 05:59:37.196177 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 06 05:59:37 crc kubenswrapper[4733]: I1206 05:59:37.196424 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 06 05:59:37 crc kubenswrapper[4733]: I1206 05:59:37.196614 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 06 05:59:37 crc kubenswrapper[4733]: I1206 05:59:37.197528 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-d5nb6" Dec 06 05:59:37 crc kubenswrapper[4733]: I1206 05:59:37.203086 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-4twth"] Dec 06 05:59:37 crc kubenswrapper[4733]: I1206 05:59:37.305655 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6a886728-ea9b-485c-844d-964614315b0d-fernet-keys\") pod \"keystone-bootstrap-4twth\" (UID: \"6a886728-ea9b-485c-844d-964614315b0d\") " pod="openstack/keystone-bootstrap-4twth" Dec 06 05:59:37 crc kubenswrapper[4733]: I1206 05:59:37.305727 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a886728-ea9b-485c-844d-964614315b0d-scripts\") pod \"keystone-bootstrap-4twth\" (UID: \"6a886728-ea9b-485c-844d-964614315b0d\") " pod="openstack/keystone-bootstrap-4twth" Dec 06 05:59:37 crc kubenswrapper[4733]: I1206 05:59:37.305797 4733 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a886728-ea9b-485c-844d-964614315b0d-combined-ca-bundle\") pod \"keystone-bootstrap-4twth\" (UID: \"6a886728-ea9b-485c-844d-964614315b0d\") " pod="openstack/keystone-bootstrap-4twth" Dec 06 05:59:37 crc kubenswrapper[4733]: I1206 05:59:37.305819 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a886728-ea9b-485c-844d-964614315b0d-config-data\") pod \"keystone-bootstrap-4twth\" (UID: \"6a886728-ea9b-485c-844d-964614315b0d\") " pod="openstack/keystone-bootstrap-4twth" Dec 06 05:59:37 crc kubenswrapper[4733]: I1206 05:59:37.305838 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvsq8\" (UniqueName: \"kubernetes.io/projected/6a886728-ea9b-485c-844d-964614315b0d-kube-api-access-kvsq8\") pod \"keystone-bootstrap-4twth\" (UID: \"6a886728-ea9b-485c-844d-964614315b0d\") " pod="openstack/keystone-bootstrap-4twth" Dec 06 05:59:37 crc kubenswrapper[4733]: I1206 05:59:37.305937 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6a886728-ea9b-485c-844d-964614315b0d-credential-keys\") pod \"keystone-bootstrap-4twth\" (UID: \"6a886728-ea9b-485c-844d-964614315b0d\") " pod="openstack/keystone-bootstrap-4twth" Dec 06 05:59:37 crc kubenswrapper[4733]: I1206 05:59:37.396937 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-579bfdb9c5-65jql"] Dec 06 05:59:37 crc kubenswrapper[4733]: I1206 05:59:37.398387 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-579bfdb9c5-65jql" Dec 06 05:59:37 crc kubenswrapper[4733]: I1206 05:59:37.408681 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6a886728-ea9b-485c-844d-964614315b0d-credential-keys\") pod \"keystone-bootstrap-4twth\" (UID: \"6a886728-ea9b-485c-844d-964614315b0d\") " pod="openstack/keystone-bootstrap-4twth" Dec 06 05:59:37 crc kubenswrapper[4733]: I1206 05:59:37.408847 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6a886728-ea9b-485c-844d-964614315b0d-fernet-keys\") pod \"keystone-bootstrap-4twth\" (UID: \"6a886728-ea9b-485c-844d-964614315b0d\") " pod="openstack/keystone-bootstrap-4twth" Dec 06 05:59:37 crc kubenswrapper[4733]: I1206 05:59:37.408902 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a886728-ea9b-485c-844d-964614315b0d-scripts\") pod \"keystone-bootstrap-4twth\" (UID: \"6a886728-ea9b-485c-844d-964614315b0d\") " pod="openstack/keystone-bootstrap-4twth" Dec 06 05:59:37 crc kubenswrapper[4733]: I1206 05:59:37.408986 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a886728-ea9b-485c-844d-964614315b0d-combined-ca-bundle\") pod \"keystone-bootstrap-4twth\" (UID: \"6a886728-ea9b-485c-844d-964614315b0d\") " pod="openstack/keystone-bootstrap-4twth" Dec 06 05:59:37 crc kubenswrapper[4733]: I1206 05:59:37.409012 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a886728-ea9b-485c-844d-964614315b0d-config-data\") pod \"keystone-bootstrap-4twth\" (UID: \"6a886728-ea9b-485c-844d-964614315b0d\") " pod="openstack/keystone-bootstrap-4twth" Dec 06 05:59:37 crc kubenswrapper[4733]: I1206 
05:59:37.409036 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvsq8\" (UniqueName: \"kubernetes.io/projected/6a886728-ea9b-485c-844d-964614315b0d-kube-api-access-kvsq8\") pod \"keystone-bootstrap-4twth\" (UID: \"6a886728-ea9b-485c-844d-964614315b0d\") " pod="openstack/keystone-bootstrap-4twth" Dec 06 05:59:37 crc kubenswrapper[4733]: I1206 05:59:37.409340 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-579bfdb9c5-65jql"] Dec 06 05:59:37 crc kubenswrapper[4733]: I1206 05:59:37.417872 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a886728-ea9b-485c-844d-964614315b0d-scripts\") pod \"keystone-bootstrap-4twth\" (UID: \"6a886728-ea9b-485c-844d-964614315b0d\") " pod="openstack/keystone-bootstrap-4twth" Dec 06 05:59:37 crc kubenswrapper[4733]: I1206 05:59:37.417894 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a886728-ea9b-485c-844d-964614315b0d-config-data\") pod \"keystone-bootstrap-4twth\" (UID: \"6a886728-ea9b-485c-844d-964614315b0d\") " pod="openstack/keystone-bootstrap-4twth" Dec 06 05:59:37 crc kubenswrapper[4733]: I1206 05:59:37.418315 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a886728-ea9b-485c-844d-964614315b0d-combined-ca-bundle\") pod \"keystone-bootstrap-4twth\" (UID: \"6a886728-ea9b-485c-844d-964614315b0d\") " pod="openstack/keystone-bootstrap-4twth" Dec 06 05:59:37 crc kubenswrapper[4733]: I1206 05:59:37.419128 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6a886728-ea9b-485c-844d-964614315b0d-credential-keys\") pod \"keystone-bootstrap-4twth\" (UID: \"6a886728-ea9b-485c-844d-964614315b0d\") " pod="openstack/keystone-bootstrap-4twth" Dec 06 05:59:37 
crc kubenswrapper[4733]: I1206 05:59:37.422882 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6a886728-ea9b-485c-844d-964614315b0d-fernet-keys\") pod \"keystone-bootstrap-4twth\" (UID: \"6a886728-ea9b-485c-844d-964614315b0d\") " pod="openstack/keystone-bootstrap-4twth" Dec 06 05:59:37 crc kubenswrapper[4733]: I1206 05:59:37.427563 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvsq8\" (UniqueName: \"kubernetes.io/projected/6a886728-ea9b-485c-844d-964614315b0d-kube-api-access-kvsq8\") pod \"keystone-bootstrap-4twth\" (UID: \"6a886728-ea9b-485c-844d-964614315b0d\") " pod="openstack/keystone-bootstrap-4twth" Dec 06 05:59:37 crc kubenswrapper[4733]: I1206 05:59:37.480612 4733 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-8c486c6ff-fq7t7" podUID="2fff7d32-f796-45d1-a13b-1b2286e593c3" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.120:5353: connect: connection refused" Dec 06 05:59:37 crc kubenswrapper[4733]: I1206 05:59:37.510293 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7f723d96-b1b3-4395-a2d0-577550dad098-dns-swift-storage-0\") pod \"dnsmasq-dns-579bfdb9c5-65jql\" (UID: \"7f723d96-b1b3-4395-a2d0-577550dad098\") " pod="openstack/dnsmasq-dns-579bfdb9c5-65jql" Dec 06 05:59:37 crc kubenswrapper[4733]: I1206 05:59:37.510352 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7f723d96-b1b3-4395-a2d0-577550dad098-ovsdbserver-sb\") pod \"dnsmasq-dns-579bfdb9c5-65jql\" (UID: \"7f723d96-b1b3-4395-a2d0-577550dad098\") " pod="openstack/dnsmasq-dns-579bfdb9c5-65jql" Dec 06 05:59:37 crc kubenswrapper[4733]: I1206 05:59:37.510392 4733 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wdpj\" (UniqueName: \"kubernetes.io/projected/7f723d96-b1b3-4395-a2d0-577550dad098-kube-api-access-9wdpj\") pod \"dnsmasq-dns-579bfdb9c5-65jql\" (UID: \"7f723d96-b1b3-4395-a2d0-577550dad098\") " pod="openstack/dnsmasq-dns-579bfdb9c5-65jql" Dec 06 05:59:37 crc kubenswrapper[4733]: I1206 05:59:37.510472 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f723d96-b1b3-4395-a2d0-577550dad098-config\") pod \"dnsmasq-dns-579bfdb9c5-65jql\" (UID: \"7f723d96-b1b3-4395-a2d0-577550dad098\") " pod="openstack/dnsmasq-dns-579bfdb9c5-65jql" Dec 06 05:59:37 crc kubenswrapper[4733]: I1206 05:59:37.510497 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f723d96-b1b3-4395-a2d0-577550dad098-dns-svc\") pod \"dnsmasq-dns-579bfdb9c5-65jql\" (UID: \"7f723d96-b1b3-4395-a2d0-577550dad098\") " pod="openstack/dnsmasq-dns-579bfdb9c5-65jql" Dec 06 05:59:37 crc kubenswrapper[4733]: I1206 05:59:37.510531 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7f723d96-b1b3-4395-a2d0-577550dad098-ovsdbserver-nb\") pod \"dnsmasq-dns-579bfdb9c5-65jql\" (UID: \"7f723d96-b1b3-4395-a2d0-577550dad098\") " pod="openstack/dnsmasq-dns-579bfdb9c5-65jql" Dec 06 05:59:37 crc kubenswrapper[4733]: I1206 05:59:37.519701 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-4twth" Dec 06 05:59:37 crc kubenswrapper[4733]: I1206 05:59:37.612027 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7f723d96-b1b3-4395-a2d0-577550dad098-ovsdbserver-nb\") pod \"dnsmasq-dns-579bfdb9c5-65jql\" (UID: \"7f723d96-b1b3-4395-a2d0-577550dad098\") " pod="openstack/dnsmasq-dns-579bfdb9c5-65jql" Dec 06 05:59:37 crc kubenswrapper[4733]: I1206 05:59:37.612200 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7f723d96-b1b3-4395-a2d0-577550dad098-dns-swift-storage-0\") pod \"dnsmasq-dns-579bfdb9c5-65jql\" (UID: \"7f723d96-b1b3-4395-a2d0-577550dad098\") " pod="openstack/dnsmasq-dns-579bfdb9c5-65jql" Dec 06 05:59:37 crc kubenswrapper[4733]: I1206 05:59:37.612249 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7f723d96-b1b3-4395-a2d0-577550dad098-ovsdbserver-sb\") pod \"dnsmasq-dns-579bfdb9c5-65jql\" (UID: \"7f723d96-b1b3-4395-a2d0-577550dad098\") " pod="openstack/dnsmasq-dns-579bfdb9c5-65jql" Dec 06 05:59:37 crc kubenswrapper[4733]: I1206 05:59:37.612499 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wdpj\" (UniqueName: \"kubernetes.io/projected/7f723d96-b1b3-4395-a2d0-577550dad098-kube-api-access-9wdpj\") pod \"dnsmasq-dns-579bfdb9c5-65jql\" (UID: \"7f723d96-b1b3-4395-a2d0-577550dad098\") " pod="openstack/dnsmasq-dns-579bfdb9c5-65jql" Dec 06 05:59:37 crc kubenswrapper[4733]: I1206 05:59:37.612825 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f723d96-b1b3-4395-a2d0-577550dad098-config\") pod \"dnsmasq-dns-579bfdb9c5-65jql\" (UID: \"7f723d96-b1b3-4395-a2d0-577550dad098\") " 
pod="openstack/dnsmasq-dns-579bfdb9c5-65jql" Dec 06 05:59:37 crc kubenswrapper[4733]: I1206 05:59:37.612886 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f723d96-b1b3-4395-a2d0-577550dad098-dns-svc\") pod \"dnsmasq-dns-579bfdb9c5-65jql\" (UID: \"7f723d96-b1b3-4395-a2d0-577550dad098\") " pod="openstack/dnsmasq-dns-579bfdb9c5-65jql" Dec 06 05:59:37 crc kubenswrapper[4733]: I1206 05:59:37.612955 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7f723d96-b1b3-4395-a2d0-577550dad098-ovsdbserver-nb\") pod \"dnsmasq-dns-579bfdb9c5-65jql\" (UID: \"7f723d96-b1b3-4395-a2d0-577550dad098\") " pod="openstack/dnsmasq-dns-579bfdb9c5-65jql" Dec 06 05:59:37 crc kubenswrapper[4733]: I1206 05:59:37.613205 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7f723d96-b1b3-4395-a2d0-577550dad098-ovsdbserver-sb\") pod \"dnsmasq-dns-579bfdb9c5-65jql\" (UID: \"7f723d96-b1b3-4395-a2d0-577550dad098\") " pod="openstack/dnsmasq-dns-579bfdb9c5-65jql" Dec 06 05:59:37 crc kubenswrapper[4733]: I1206 05:59:37.613972 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f723d96-b1b3-4395-a2d0-577550dad098-dns-svc\") pod \"dnsmasq-dns-579bfdb9c5-65jql\" (UID: \"7f723d96-b1b3-4395-a2d0-577550dad098\") " pod="openstack/dnsmasq-dns-579bfdb9c5-65jql" Dec 06 05:59:37 crc kubenswrapper[4733]: I1206 05:59:37.614012 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7f723d96-b1b3-4395-a2d0-577550dad098-dns-swift-storage-0\") pod \"dnsmasq-dns-579bfdb9c5-65jql\" (UID: \"7f723d96-b1b3-4395-a2d0-577550dad098\") " pod="openstack/dnsmasq-dns-579bfdb9c5-65jql" Dec 06 05:59:37 crc kubenswrapper[4733]: I1206 
05:59:37.614053 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f723d96-b1b3-4395-a2d0-577550dad098-config\") pod \"dnsmasq-dns-579bfdb9c5-65jql\" (UID: \"7f723d96-b1b3-4395-a2d0-577550dad098\") " pod="openstack/dnsmasq-dns-579bfdb9c5-65jql" Dec 06 05:59:37 crc kubenswrapper[4733]: I1206 05:59:37.627782 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wdpj\" (UniqueName: \"kubernetes.io/projected/7f723d96-b1b3-4395-a2d0-577550dad098-kube-api-access-9wdpj\") pod \"dnsmasq-dns-579bfdb9c5-65jql\" (UID: \"7f723d96-b1b3-4395-a2d0-577550dad098\") " pod="openstack/dnsmasq-dns-579bfdb9c5-65jql" Dec 06 05:59:37 crc kubenswrapper[4733]: E1206 05:59:37.672044 4733 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc8e93b6_7230_41f1_98f5_18b252d0d724.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod706be213_5f03_414a_bdeb_98af90de90f4.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod706be213_5f03_414a_bdeb_98af90de90f4.slice/crio-423a81d606ec6e867b74e7b758975d267201cff97ab41331b44d8e712972e2c0\": RecentStats: unable to find data in memory cache]" Dec 06 05:59:37 crc kubenswrapper[4733]: I1206 05:59:37.802211 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-579bfdb9c5-65jql" Dec 06 05:59:38 crc kubenswrapper[4733]: I1206 05:59:38.316327 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 05:59:38 crc kubenswrapper[4733]: I1206 05:59:38.318664 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 06 05:59:38 crc kubenswrapper[4733]: I1206 05:59:38.321978 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 06 05:59:38 crc kubenswrapper[4733]: I1206 05:59:38.322196 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 06 05:59:38 crc kubenswrapper[4733]: I1206 05:59:38.322396 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-t2xs9" Dec 06 05:59:38 crc kubenswrapper[4733]: I1206 05:59:38.327937 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 05:59:38 crc kubenswrapper[4733]: I1206 05:59:38.432653 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92wd6\" (UniqueName: \"kubernetes.io/projected/a26939cd-f665-482c-b577-efab49fe0123-kube-api-access-92wd6\") pod \"glance-default-external-api-0\" (UID: \"a26939cd-f665-482c-b577-efab49fe0123\") " pod="openstack/glance-default-external-api-0" Dec 06 05:59:38 crc kubenswrapper[4733]: I1206 05:59:38.432814 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a26939cd-f665-482c-b577-efab49fe0123-logs\") pod \"glance-default-external-api-0\" (UID: \"a26939cd-f665-482c-b577-efab49fe0123\") " pod="openstack/glance-default-external-api-0" Dec 06 05:59:38 crc kubenswrapper[4733]: I1206 05:59:38.432842 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a26939cd-f665-482c-b577-efab49fe0123-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a26939cd-f665-482c-b577-efab49fe0123\") " pod="openstack/glance-default-external-api-0" Dec 06 05:59:38 crc 
kubenswrapper[4733]: I1206 05:59:38.432926 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"a26939cd-f665-482c-b577-efab49fe0123\") " pod="openstack/glance-default-external-api-0" Dec 06 05:59:38 crc kubenswrapper[4733]: I1206 05:59:38.432981 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a26939cd-f665-482c-b577-efab49fe0123-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a26939cd-f665-482c-b577-efab49fe0123\") " pod="openstack/glance-default-external-api-0" Dec 06 05:59:38 crc kubenswrapper[4733]: I1206 05:59:38.433055 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a26939cd-f665-482c-b577-efab49fe0123-config-data\") pod \"glance-default-external-api-0\" (UID: \"a26939cd-f665-482c-b577-efab49fe0123\") " pod="openstack/glance-default-external-api-0" Dec 06 05:59:38 crc kubenswrapper[4733]: I1206 05:59:38.433254 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a26939cd-f665-482c-b577-efab49fe0123-scripts\") pod \"glance-default-external-api-0\" (UID: \"a26939cd-f665-482c-b577-efab49fe0123\") " pod="openstack/glance-default-external-api-0" Dec 06 05:59:38 crc kubenswrapper[4733]: I1206 05:59:38.493906 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="706be213-5f03-414a-bdeb-98af90de90f4" path="/var/lib/kubelet/pods/706be213-5f03-414a-bdeb-98af90de90f4/volumes" Dec 06 05:59:38 crc kubenswrapper[4733]: I1206 05:59:38.512859 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 05:59:38 crc 
kubenswrapper[4733]: I1206 05:59:38.514531 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 06 05:59:38 crc kubenswrapper[4733]: I1206 05:59:38.517079 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 06 05:59:38 crc kubenswrapper[4733]: I1206 05:59:38.520685 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 05:59:38 crc kubenswrapper[4733]: I1206 05:59:38.534874 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a26939cd-f665-482c-b577-efab49fe0123-scripts\") pod \"glance-default-external-api-0\" (UID: \"a26939cd-f665-482c-b577-efab49fe0123\") " pod="openstack/glance-default-external-api-0" Dec 06 05:59:38 crc kubenswrapper[4733]: I1206 05:59:38.534938 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92wd6\" (UniqueName: \"kubernetes.io/projected/a26939cd-f665-482c-b577-efab49fe0123-kube-api-access-92wd6\") pod \"glance-default-external-api-0\" (UID: \"a26939cd-f665-482c-b577-efab49fe0123\") " pod="openstack/glance-default-external-api-0" Dec 06 05:59:38 crc kubenswrapper[4733]: I1206 05:59:38.534991 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a26939cd-f665-482c-b577-efab49fe0123-logs\") pod \"glance-default-external-api-0\" (UID: \"a26939cd-f665-482c-b577-efab49fe0123\") " pod="openstack/glance-default-external-api-0" Dec 06 05:59:38 crc kubenswrapper[4733]: I1206 05:59:38.535009 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a26939cd-f665-482c-b577-efab49fe0123-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: 
\"a26939cd-f665-482c-b577-efab49fe0123\") " pod="openstack/glance-default-external-api-0" Dec 06 05:59:38 crc kubenswrapper[4733]: I1206 05:59:38.535033 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"a26939cd-f665-482c-b577-efab49fe0123\") " pod="openstack/glance-default-external-api-0" Dec 06 05:59:38 crc kubenswrapper[4733]: I1206 05:59:38.535061 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a26939cd-f665-482c-b577-efab49fe0123-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a26939cd-f665-482c-b577-efab49fe0123\") " pod="openstack/glance-default-external-api-0" Dec 06 05:59:38 crc kubenswrapper[4733]: I1206 05:59:38.535093 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a26939cd-f665-482c-b577-efab49fe0123-config-data\") pod \"glance-default-external-api-0\" (UID: \"a26939cd-f665-482c-b577-efab49fe0123\") " pod="openstack/glance-default-external-api-0" Dec 06 05:59:38 crc kubenswrapper[4733]: I1206 05:59:38.536727 4733 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"a26939cd-f665-482c-b577-efab49fe0123\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0" Dec 06 05:59:38 crc kubenswrapper[4733]: I1206 05:59:38.538172 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a26939cd-f665-482c-b577-efab49fe0123-logs\") pod \"glance-default-external-api-0\" (UID: \"a26939cd-f665-482c-b577-efab49fe0123\") " pod="openstack/glance-default-external-api-0" Dec 06 
05:59:38 crc kubenswrapper[4733]: I1206 05:59:38.538712 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a26939cd-f665-482c-b577-efab49fe0123-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a26939cd-f665-482c-b577-efab49fe0123\") " pod="openstack/glance-default-external-api-0" Dec 06 05:59:38 crc kubenswrapper[4733]: I1206 05:59:38.542943 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a26939cd-f665-482c-b577-efab49fe0123-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a26939cd-f665-482c-b577-efab49fe0123\") " pod="openstack/glance-default-external-api-0" Dec 06 05:59:38 crc kubenswrapper[4733]: I1206 05:59:38.546724 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a26939cd-f665-482c-b577-efab49fe0123-scripts\") pod \"glance-default-external-api-0\" (UID: \"a26939cd-f665-482c-b577-efab49fe0123\") " pod="openstack/glance-default-external-api-0" Dec 06 05:59:38 crc kubenswrapper[4733]: I1206 05:59:38.551871 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a26939cd-f665-482c-b577-efab49fe0123-config-data\") pod \"glance-default-external-api-0\" (UID: \"a26939cd-f665-482c-b577-efab49fe0123\") " pod="openstack/glance-default-external-api-0" Dec 06 05:59:38 crc kubenswrapper[4733]: I1206 05:59:38.558089 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92wd6\" (UniqueName: \"kubernetes.io/projected/a26939cd-f665-482c-b577-efab49fe0123-kube-api-access-92wd6\") pod \"glance-default-external-api-0\" (UID: \"a26939cd-f665-482c-b577-efab49fe0123\") " pod="openstack/glance-default-external-api-0" Dec 06 05:59:38 crc kubenswrapper[4733]: I1206 05:59:38.583464 4733 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"a26939cd-f665-482c-b577-efab49fe0123\") " pod="openstack/glance-default-external-api-0" Dec 06 05:59:38 crc kubenswrapper[4733]: I1206 05:59:38.637353 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61d7ceb4-c6a6-407f-b208-693d62a8b76b-logs\") pod \"glance-default-internal-api-0\" (UID: \"61d7ceb4-c6a6-407f-b208-693d62a8b76b\") " pod="openstack/glance-default-internal-api-0" Dec 06 05:59:38 crc kubenswrapper[4733]: I1206 05:59:38.637418 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61d7ceb4-c6a6-407f-b208-693d62a8b76b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"61d7ceb4-c6a6-407f-b208-693d62a8b76b\") " pod="openstack/glance-default-internal-api-0" Dec 06 05:59:38 crc kubenswrapper[4733]: I1206 05:59:38.637451 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"61d7ceb4-c6a6-407f-b208-693d62a8b76b\") " pod="openstack/glance-default-internal-api-0" Dec 06 05:59:38 crc kubenswrapper[4733]: I1206 05:59:38.637479 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61d7ceb4-c6a6-407f-b208-693d62a8b76b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"61d7ceb4-c6a6-407f-b208-693d62a8b76b\") " pod="openstack/glance-default-internal-api-0" Dec 06 05:59:38 crc kubenswrapper[4733]: I1206 05:59:38.637574 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61d7ceb4-c6a6-407f-b208-693d62a8b76b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"61d7ceb4-c6a6-407f-b208-693d62a8b76b\") " pod="openstack/glance-default-internal-api-0" Dec 06 05:59:38 crc kubenswrapper[4733]: I1206 05:59:38.637600 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhx9k\" (UniqueName: \"kubernetes.io/projected/61d7ceb4-c6a6-407f-b208-693d62a8b76b-kube-api-access-dhx9k\") pod \"glance-default-internal-api-0\" (UID: \"61d7ceb4-c6a6-407f-b208-693d62a8b76b\") " pod="openstack/glance-default-internal-api-0" Dec 06 05:59:38 crc kubenswrapper[4733]: I1206 05:59:38.637625 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/61d7ceb4-c6a6-407f-b208-693d62a8b76b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"61d7ceb4-c6a6-407f-b208-693d62a8b76b\") " pod="openstack/glance-default-internal-api-0" Dec 06 05:59:38 crc kubenswrapper[4733]: I1206 05:59:38.670789 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 06 05:59:38 crc kubenswrapper[4733]: I1206 05:59:38.697097 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-cqh6m" Dec 06 05:59:38 crc kubenswrapper[4733]: I1206 05:59:38.740607 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61d7ceb4-c6a6-407f-b208-693d62a8b76b-logs\") pod \"glance-default-internal-api-0\" (UID: \"61d7ceb4-c6a6-407f-b208-693d62a8b76b\") " pod="openstack/glance-default-internal-api-0" Dec 06 05:59:38 crc kubenswrapper[4733]: I1206 05:59:38.740660 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61d7ceb4-c6a6-407f-b208-693d62a8b76b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"61d7ceb4-c6a6-407f-b208-693d62a8b76b\") " pod="openstack/glance-default-internal-api-0" Dec 06 05:59:38 crc kubenswrapper[4733]: I1206 05:59:38.740695 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"61d7ceb4-c6a6-407f-b208-693d62a8b76b\") " pod="openstack/glance-default-internal-api-0" Dec 06 05:59:38 crc kubenswrapper[4733]: I1206 05:59:38.740721 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61d7ceb4-c6a6-407f-b208-693d62a8b76b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"61d7ceb4-c6a6-407f-b208-693d62a8b76b\") " pod="openstack/glance-default-internal-api-0" Dec 06 05:59:38 crc kubenswrapper[4733]: I1206 05:59:38.740811 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61d7ceb4-c6a6-407f-b208-693d62a8b76b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"61d7ceb4-c6a6-407f-b208-693d62a8b76b\") " pod="openstack/glance-default-internal-api-0" Dec 06 05:59:38 crc kubenswrapper[4733]: I1206 
05:59:38.740839 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhx9k\" (UniqueName: \"kubernetes.io/projected/61d7ceb4-c6a6-407f-b208-693d62a8b76b-kube-api-access-dhx9k\") pod \"glance-default-internal-api-0\" (UID: \"61d7ceb4-c6a6-407f-b208-693d62a8b76b\") " pod="openstack/glance-default-internal-api-0" Dec 06 05:59:38 crc kubenswrapper[4733]: I1206 05:59:38.740863 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/61d7ceb4-c6a6-407f-b208-693d62a8b76b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"61d7ceb4-c6a6-407f-b208-693d62a8b76b\") " pod="openstack/glance-default-internal-api-0" Dec 06 05:59:38 crc kubenswrapper[4733]: I1206 05:59:38.741624 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/61d7ceb4-c6a6-407f-b208-693d62a8b76b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"61d7ceb4-c6a6-407f-b208-693d62a8b76b\") " pod="openstack/glance-default-internal-api-0" Dec 06 05:59:38 crc kubenswrapper[4733]: I1206 05:59:38.741856 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61d7ceb4-c6a6-407f-b208-693d62a8b76b-logs\") pod \"glance-default-internal-api-0\" (UID: \"61d7ceb4-c6a6-407f-b208-693d62a8b76b\") " pod="openstack/glance-default-internal-api-0" Dec 06 05:59:38 crc kubenswrapper[4733]: I1206 05:59:38.742814 4733 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"61d7ceb4-c6a6-407f-b208-693d62a8b76b\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-internal-api-0" Dec 06 05:59:38 crc kubenswrapper[4733]: I1206 05:59:38.752923 4733 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61d7ceb4-c6a6-407f-b208-693d62a8b76b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"61d7ceb4-c6a6-407f-b208-693d62a8b76b\") " pod="openstack/glance-default-internal-api-0" Dec 06 05:59:38 crc kubenswrapper[4733]: I1206 05:59:38.753383 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61d7ceb4-c6a6-407f-b208-693d62a8b76b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"61d7ceb4-c6a6-407f-b208-693d62a8b76b\") " pod="openstack/glance-default-internal-api-0" Dec 06 05:59:38 crc kubenswrapper[4733]: I1206 05:59:38.769131 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61d7ceb4-c6a6-407f-b208-693d62a8b76b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"61d7ceb4-c6a6-407f-b208-693d62a8b76b\") " pod="openstack/glance-default-internal-api-0" Dec 06 05:59:38 crc kubenswrapper[4733]: I1206 05:59:38.771875 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhx9k\" (UniqueName: \"kubernetes.io/projected/61d7ceb4-c6a6-407f-b208-693d62a8b76b-kube-api-access-dhx9k\") pod \"glance-default-internal-api-0\" (UID: \"61d7ceb4-c6a6-407f-b208-693d62a8b76b\") " pod="openstack/glance-default-internal-api-0" Dec 06 05:59:38 crc kubenswrapper[4733]: I1206 05:59:38.832526 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"61d7ceb4-c6a6-407f-b208-693d62a8b76b\") " pod="openstack/glance-default-internal-api-0" Dec 06 05:59:38 crc kubenswrapper[4733]: I1206 05:59:38.839026 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 06 05:59:38 crc kubenswrapper[4733]: I1206 05:59:38.841898 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7v6h\" (UniqueName: \"kubernetes.io/projected/94621b3f-341c-4c06-8530-91d11c1ad8dc-kube-api-access-m7v6h\") pod \"94621b3f-341c-4c06-8530-91d11c1ad8dc\" (UID: \"94621b3f-341c-4c06-8530-91d11c1ad8dc\") " Dec 06 05:59:38 crc kubenswrapper[4733]: I1206 05:59:38.841997 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94621b3f-341c-4c06-8530-91d11c1ad8dc-combined-ca-bundle\") pod \"94621b3f-341c-4c06-8530-91d11c1ad8dc\" (UID: \"94621b3f-341c-4c06-8530-91d11c1ad8dc\") " Dec 06 05:59:38 crc kubenswrapper[4733]: I1206 05:59:38.842076 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/94621b3f-341c-4c06-8530-91d11c1ad8dc-config\") pod \"94621b3f-341c-4c06-8530-91d11c1ad8dc\" (UID: \"94621b3f-341c-4c06-8530-91d11c1ad8dc\") " Dec 06 05:59:38 crc kubenswrapper[4733]: I1206 05:59:38.858196 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94621b3f-341c-4c06-8530-91d11c1ad8dc-kube-api-access-m7v6h" (OuterVolumeSpecName: "kube-api-access-m7v6h") pod "94621b3f-341c-4c06-8530-91d11c1ad8dc" (UID: "94621b3f-341c-4c06-8530-91d11c1ad8dc"). InnerVolumeSpecName "kube-api-access-m7v6h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:59:38 crc kubenswrapper[4733]: I1206 05:59:38.893614 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94621b3f-341c-4c06-8530-91d11c1ad8dc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "94621b3f-341c-4c06-8530-91d11c1ad8dc" (UID: "94621b3f-341c-4c06-8530-91d11c1ad8dc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:59:38 crc kubenswrapper[4733]: I1206 05:59:38.893684 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94621b3f-341c-4c06-8530-91d11c1ad8dc-config" (OuterVolumeSpecName: "config") pod "94621b3f-341c-4c06-8530-91d11c1ad8dc" (UID: "94621b3f-341c-4c06-8530-91d11c1ad8dc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:59:38 crc kubenswrapper[4733]: I1206 05:59:38.944322 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7v6h\" (UniqueName: \"kubernetes.io/projected/94621b3f-341c-4c06-8530-91d11c1ad8dc-kube-api-access-m7v6h\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:38 crc kubenswrapper[4733]: I1206 05:59:38.944352 4733 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94621b3f-341c-4c06-8530-91d11c1ad8dc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:38 crc kubenswrapper[4733]: I1206 05:59:38.944364 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/94621b3f-341c-4c06-8530-91d11c1ad8dc-config\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:39 crc kubenswrapper[4733]: I1206 05:59:39.537875 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-cqh6m" event={"ID":"94621b3f-341c-4c06-8530-91d11c1ad8dc","Type":"ContainerDied","Data":"e525a336c0b0e4af843c396f1522cca3cd6de913875de4767e64a1e48be03737"} Dec 06 05:59:39 crc kubenswrapper[4733]: I1206 05:59:39.537914 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e525a336c0b0e4af843c396f1522cca3cd6de913875de4767e64a1e48be03737" Dec 06 05:59:39 crc kubenswrapper[4733]: I1206 05:59:39.537991 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-cqh6m" Dec 06 05:59:39 crc kubenswrapper[4733]: I1206 05:59:39.562222 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 05:59:39 crc kubenswrapper[4733]: I1206 05:59:39.617050 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 05:59:39 crc kubenswrapper[4733]: I1206 05:59:39.894163 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-579bfdb9c5-65jql"] Dec 06 05:59:39 crc kubenswrapper[4733]: I1206 05:59:39.922568 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c769b985f-kk46z"] Dec 06 05:59:39 crc kubenswrapper[4733]: E1206 05:59:39.923030 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94621b3f-341c-4c06-8530-91d11c1ad8dc" containerName="neutron-db-sync" Dec 06 05:59:39 crc kubenswrapper[4733]: I1206 05:59:39.923051 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="94621b3f-341c-4c06-8530-91d11c1ad8dc" containerName="neutron-db-sync" Dec 06 05:59:39 crc kubenswrapper[4733]: I1206 05:59:39.923289 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="94621b3f-341c-4c06-8530-91d11c1ad8dc" containerName="neutron-db-sync" Dec 06 05:59:39 crc kubenswrapper[4733]: I1206 05:59:39.924226 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c769b985f-kk46z" Dec 06 05:59:39 crc kubenswrapper[4733]: I1206 05:59:39.933751 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c769b985f-kk46z"] Dec 06 05:59:40 crc kubenswrapper[4733]: I1206 05:59:40.045376 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-544f978b4d-7s676"] Dec 06 05:59:40 crc kubenswrapper[4733]: I1206 05:59:40.046777 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-544f978b4d-7s676" Dec 06 05:59:40 crc kubenswrapper[4733]: I1206 05:59:40.052931 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Dec 06 05:59:40 crc kubenswrapper[4733]: I1206 05:59:40.053143 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-hhb4n" Dec 06 05:59:40 crc kubenswrapper[4733]: I1206 05:59:40.053382 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 06 05:59:40 crc kubenswrapper[4733]: I1206 05:59:40.055923 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 06 05:59:40 crc kubenswrapper[4733]: I1206 05:59:40.059146 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-544f978b4d-7s676"] Dec 06 05:59:40 crc kubenswrapper[4733]: I1206 05:59:40.069391 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/af780532-6391-4d7f-93d4-fb966d5e0434-ovsdbserver-nb\") pod \"dnsmasq-dns-5c769b985f-kk46z\" (UID: \"af780532-6391-4d7f-93d4-fb966d5e0434\") " pod="openstack/dnsmasq-dns-5c769b985f-kk46z" Dec 06 05:59:40 crc kubenswrapper[4733]: I1206 05:59:40.069442 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af780532-6391-4d7f-93d4-fb966d5e0434-dns-svc\") pod \"dnsmasq-dns-5c769b985f-kk46z\" (UID: \"af780532-6391-4d7f-93d4-fb966d5e0434\") " pod="openstack/dnsmasq-dns-5c769b985f-kk46z" Dec 06 05:59:40 crc kubenswrapper[4733]: I1206 05:59:40.069498 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/af780532-6391-4d7f-93d4-fb966d5e0434-dns-swift-storage-0\") pod 
\"dnsmasq-dns-5c769b985f-kk46z\" (UID: \"af780532-6391-4d7f-93d4-fb966d5e0434\") " pod="openstack/dnsmasq-dns-5c769b985f-kk46z" Dec 06 05:59:40 crc kubenswrapper[4733]: I1206 05:59:40.069607 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/af780532-6391-4d7f-93d4-fb966d5e0434-ovsdbserver-sb\") pod \"dnsmasq-dns-5c769b985f-kk46z\" (UID: \"af780532-6391-4d7f-93d4-fb966d5e0434\") " pod="openstack/dnsmasq-dns-5c769b985f-kk46z" Dec 06 05:59:40 crc kubenswrapper[4733]: I1206 05:59:40.069653 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af780532-6391-4d7f-93d4-fb966d5e0434-config\") pod \"dnsmasq-dns-5c769b985f-kk46z\" (UID: \"af780532-6391-4d7f-93d4-fb966d5e0434\") " pod="openstack/dnsmasq-dns-5c769b985f-kk46z" Dec 06 05:59:40 crc kubenswrapper[4733]: I1206 05:59:40.069697 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2hxq\" (UniqueName: \"kubernetes.io/projected/af780532-6391-4d7f-93d4-fb966d5e0434-kube-api-access-x2hxq\") pod \"dnsmasq-dns-5c769b985f-kk46z\" (UID: \"af780532-6391-4d7f-93d4-fb966d5e0434\") " pod="openstack/dnsmasq-dns-5c769b985f-kk46z" Dec 06 05:59:40 crc kubenswrapper[4733]: I1206 05:59:40.170842 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2552f10d-828b-4996-a292-32499f7d24cf-ovndb-tls-certs\") pod \"neutron-544f978b4d-7s676\" (UID: \"2552f10d-828b-4996-a292-32499f7d24cf\") " pod="openstack/neutron-544f978b4d-7s676" Dec 06 05:59:40 crc kubenswrapper[4733]: I1206 05:59:40.170891 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af780532-6391-4d7f-93d4-fb966d5e0434-config\") pod 
\"dnsmasq-dns-5c769b985f-kk46z\" (UID: \"af780532-6391-4d7f-93d4-fb966d5e0434\") " pod="openstack/dnsmasq-dns-5c769b985f-kk46z" Dec 06 05:59:40 crc kubenswrapper[4733]: I1206 05:59:40.170927 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2hxq\" (UniqueName: \"kubernetes.io/projected/af780532-6391-4d7f-93d4-fb966d5e0434-kube-api-access-x2hxq\") pod \"dnsmasq-dns-5c769b985f-kk46z\" (UID: \"af780532-6391-4d7f-93d4-fb966d5e0434\") " pod="openstack/dnsmasq-dns-5c769b985f-kk46z" Dec 06 05:59:40 crc kubenswrapper[4733]: I1206 05:59:40.170956 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2552f10d-828b-4996-a292-32499f7d24cf-combined-ca-bundle\") pod \"neutron-544f978b4d-7s676\" (UID: \"2552f10d-828b-4996-a292-32499f7d24cf\") " pod="openstack/neutron-544f978b4d-7s676" Dec 06 05:59:40 crc kubenswrapper[4733]: I1206 05:59:40.170978 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2552f10d-828b-4996-a292-32499f7d24cf-config\") pod \"neutron-544f978b4d-7s676\" (UID: \"2552f10d-828b-4996-a292-32499f7d24cf\") " pod="openstack/neutron-544f978b4d-7s676" Dec 06 05:59:40 crc kubenswrapper[4733]: I1206 05:59:40.171016 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/af780532-6391-4d7f-93d4-fb966d5e0434-ovsdbserver-nb\") pod \"dnsmasq-dns-5c769b985f-kk46z\" (UID: \"af780532-6391-4d7f-93d4-fb966d5e0434\") " pod="openstack/dnsmasq-dns-5c769b985f-kk46z" Dec 06 05:59:40 crc kubenswrapper[4733]: I1206 05:59:40.171034 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af780532-6391-4d7f-93d4-fb966d5e0434-dns-svc\") pod \"dnsmasq-dns-5c769b985f-kk46z\" (UID: 
\"af780532-6391-4d7f-93d4-fb966d5e0434\") " pod="openstack/dnsmasq-dns-5c769b985f-kk46z" Dec 06 05:59:40 crc kubenswrapper[4733]: I1206 05:59:40.171061 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/af780532-6391-4d7f-93d4-fb966d5e0434-dns-swift-storage-0\") pod \"dnsmasq-dns-5c769b985f-kk46z\" (UID: \"af780532-6391-4d7f-93d4-fb966d5e0434\") " pod="openstack/dnsmasq-dns-5c769b985f-kk46z" Dec 06 05:59:40 crc kubenswrapper[4733]: I1206 05:59:40.171094 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2552f10d-828b-4996-a292-32499f7d24cf-httpd-config\") pod \"neutron-544f978b4d-7s676\" (UID: \"2552f10d-828b-4996-a292-32499f7d24cf\") " pod="openstack/neutron-544f978b4d-7s676" Dec 06 05:59:40 crc kubenswrapper[4733]: I1206 05:59:40.171113 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l265j\" (UniqueName: \"kubernetes.io/projected/2552f10d-828b-4996-a292-32499f7d24cf-kube-api-access-l265j\") pod \"neutron-544f978b4d-7s676\" (UID: \"2552f10d-828b-4996-a292-32499f7d24cf\") " pod="openstack/neutron-544f978b4d-7s676" Dec 06 05:59:40 crc kubenswrapper[4733]: I1206 05:59:40.171163 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/af780532-6391-4d7f-93d4-fb966d5e0434-ovsdbserver-sb\") pod \"dnsmasq-dns-5c769b985f-kk46z\" (UID: \"af780532-6391-4d7f-93d4-fb966d5e0434\") " pod="openstack/dnsmasq-dns-5c769b985f-kk46z" Dec 06 05:59:40 crc kubenswrapper[4733]: I1206 05:59:40.171962 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/af780532-6391-4d7f-93d4-fb966d5e0434-ovsdbserver-sb\") pod \"dnsmasq-dns-5c769b985f-kk46z\" (UID: 
\"af780532-6391-4d7f-93d4-fb966d5e0434\") " pod="openstack/dnsmasq-dns-5c769b985f-kk46z" Dec 06 05:59:40 crc kubenswrapper[4733]: I1206 05:59:40.172504 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af780532-6391-4d7f-93d4-fb966d5e0434-config\") pod \"dnsmasq-dns-5c769b985f-kk46z\" (UID: \"af780532-6391-4d7f-93d4-fb966d5e0434\") " pod="openstack/dnsmasq-dns-5c769b985f-kk46z" Dec 06 05:59:40 crc kubenswrapper[4733]: I1206 05:59:40.173317 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/af780532-6391-4d7f-93d4-fb966d5e0434-dns-swift-storage-0\") pod \"dnsmasq-dns-5c769b985f-kk46z\" (UID: \"af780532-6391-4d7f-93d4-fb966d5e0434\") " pod="openstack/dnsmasq-dns-5c769b985f-kk46z" Dec 06 05:59:40 crc kubenswrapper[4733]: I1206 05:59:40.173795 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af780532-6391-4d7f-93d4-fb966d5e0434-dns-svc\") pod \"dnsmasq-dns-5c769b985f-kk46z\" (UID: \"af780532-6391-4d7f-93d4-fb966d5e0434\") " pod="openstack/dnsmasq-dns-5c769b985f-kk46z" Dec 06 05:59:40 crc kubenswrapper[4733]: I1206 05:59:40.174324 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/af780532-6391-4d7f-93d4-fb966d5e0434-ovsdbserver-nb\") pod \"dnsmasq-dns-5c769b985f-kk46z\" (UID: \"af780532-6391-4d7f-93d4-fb966d5e0434\") " pod="openstack/dnsmasq-dns-5c769b985f-kk46z" Dec 06 05:59:40 crc kubenswrapper[4733]: I1206 05:59:40.188749 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2hxq\" (UniqueName: \"kubernetes.io/projected/af780532-6391-4d7f-93d4-fb966d5e0434-kube-api-access-x2hxq\") pod \"dnsmasq-dns-5c769b985f-kk46z\" (UID: \"af780532-6391-4d7f-93d4-fb966d5e0434\") " pod="openstack/dnsmasq-dns-5c769b985f-kk46z" Dec 06 
05:59:40 crc kubenswrapper[4733]: I1206 05:59:40.246154 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c769b985f-kk46z" Dec 06 05:59:40 crc kubenswrapper[4733]: I1206 05:59:40.273357 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2552f10d-828b-4996-a292-32499f7d24cf-combined-ca-bundle\") pod \"neutron-544f978b4d-7s676\" (UID: \"2552f10d-828b-4996-a292-32499f7d24cf\") " pod="openstack/neutron-544f978b4d-7s676" Dec 06 05:59:40 crc kubenswrapper[4733]: I1206 05:59:40.273397 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2552f10d-828b-4996-a292-32499f7d24cf-config\") pod \"neutron-544f978b4d-7s676\" (UID: \"2552f10d-828b-4996-a292-32499f7d24cf\") " pod="openstack/neutron-544f978b4d-7s676" Dec 06 05:59:40 crc kubenswrapper[4733]: I1206 05:59:40.273476 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2552f10d-828b-4996-a292-32499f7d24cf-httpd-config\") pod \"neutron-544f978b4d-7s676\" (UID: \"2552f10d-828b-4996-a292-32499f7d24cf\") " pod="openstack/neutron-544f978b4d-7s676" Dec 06 05:59:40 crc kubenswrapper[4733]: I1206 05:59:40.273505 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l265j\" (UniqueName: \"kubernetes.io/projected/2552f10d-828b-4996-a292-32499f7d24cf-kube-api-access-l265j\") pod \"neutron-544f978b4d-7s676\" (UID: \"2552f10d-828b-4996-a292-32499f7d24cf\") " pod="openstack/neutron-544f978b4d-7s676" Dec 06 05:59:40 crc kubenswrapper[4733]: I1206 05:59:40.273556 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2552f10d-828b-4996-a292-32499f7d24cf-ovndb-tls-certs\") pod \"neutron-544f978b4d-7s676\" (UID: 
\"2552f10d-828b-4996-a292-32499f7d24cf\") " pod="openstack/neutron-544f978b4d-7s676" Dec 06 05:59:40 crc kubenswrapper[4733]: I1206 05:59:40.278657 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2552f10d-828b-4996-a292-32499f7d24cf-httpd-config\") pod \"neutron-544f978b4d-7s676\" (UID: \"2552f10d-828b-4996-a292-32499f7d24cf\") " pod="openstack/neutron-544f978b4d-7s676" Dec 06 05:59:40 crc kubenswrapper[4733]: I1206 05:59:40.278830 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/2552f10d-828b-4996-a292-32499f7d24cf-config\") pod \"neutron-544f978b4d-7s676\" (UID: \"2552f10d-828b-4996-a292-32499f7d24cf\") " pod="openstack/neutron-544f978b4d-7s676" Dec 06 05:59:40 crc kubenswrapper[4733]: I1206 05:59:40.279111 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2552f10d-828b-4996-a292-32499f7d24cf-ovndb-tls-certs\") pod \"neutron-544f978b4d-7s676\" (UID: \"2552f10d-828b-4996-a292-32499f7d24cf\") " pod="openstack/neutron-544f978b4d-7s676" Dec 06 05:59:40 crc kubenswrapper[4733]: I1206 05:59:40.279719 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2552f10d-828b-4996-a292-32499f7d24cf-combined-ca-bundle\") pod \"neutron-544f978b4d-7s676\" (UID: \"2552f10d-828b-4996-a292-32499f7d24cf\") " pod="openstack/neutron-544f978b4d-7s676" Dec 06 05:59:40 crc kubenswrapper[4733]: I1206 05:59:40.289755 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l265j\" (UniqueName: \"kubernetes.io/projected/2552f10d-828b-4996-a292-32499f7d24cf-kube-api-access-l265j\") pod \"neutron-544f978b4d-7s676\" (UID: \"2552f10d-828b-4996-a292-32499f7d24cf\") " pod="openstack/neutron-544f978b4d-7s676" Dec 06 05:59:40 crc kubenswrapper[4733]: I1206 
05:59:40.361933 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-544f978b4d-7s676" Dec 06 05:59:41 crc kubenswrapper[4733]: E1206 05:59:41.946344 4733 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5dedf9f117dc352810d3da1ae4ee162c7e6dd9af333ee62950750576d5ef7f8c is running failed: container process not found" containerID="5dedf9f117dc352810d3da1ae4ee162c7e6dd9af333ee62950750576d5ef7f8c" cmd=["grpc_health_probe","-addr=:50051"] Dec 06 05:59:41 crc kubenswrapper[4733]: E1206 05:59:41.947494 4733 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5dedf9f117dc352810d3da1ae4ee162c7e6dd9af333ee62950750576d5ef7f8c is running failed: container process not found" containerID="5dedf9f117dc352810d3da1ae4ee162c7e6dd9af333ee62950750576d5ef7f8c" cmd=["grpc_health_probe","-addr=:50051"] Dec 06 05:59:41 crc kubenswrapper[4733]: E1206 05:59:41.947831 4733 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5dedf9f117dc352810d3da1ae4ee162c7e6dd9af333ee62950750576d5ef7f8c is running failed: container process not found" containerID="5dedf9f117dc352810d3da1ae4ee162c7e6dd9af333ee62950750576d5ef7f8c" cmd=["grpc_health_probe","-addr=:50051"] Dec 06 05:59:41 crc kubenswrapper[4733]: E1206 05:59:41.947854 4733 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5dedf9f117dc352810d3da1ae4ee162c7e6dd9af333ee62950750576d5ef7f8c is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-nqsf6" podUID="8de7ee8b-f6b5-4014-a78f-bad2e92ddd2d" containerName="registry-server" Dec 06 05:59:43 crc kubenswrapper[4733]: I1206 
05:59:43.388595 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7dbbf764c5-qntcx"] Dec 06 05:59:43 crc kubenswrapper[4733]: I1206 05:59:43.390177 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7dbbf764c5-qntcx" Dec 06 05:59:43 crc kubenswrapper[4733]: I1206 05:59:43.393850 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Dec 06 05:59:43 crc kubenswrapper[4733]: I1206 05:59:43.394123 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Dec 06 05:59:43 crc kubenswrapper[4733]: I1206 05:59:43.401932 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7dbbf764c5-qntcx"] Dec 06 05:59:43 crc kubenswrapper[4733]: I1206 05:59:43.559630 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/96238eea-ea50-4c05-a33c-ae44b8c7a055-ovndb-tls-certs\") pod \"neutron-7dbbf764c5-qntcx\" (UID: \"96238eea-ea50-4c05-a33c-ae44b8c7a055\") " pod="openstack/neutron-7dbbf764c5-qntcx" Dec 06 05:59:43 crc kubenswrapper[4733]: I1206 05:59:43.559769 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96238eea-ea50-4c05-a33c-ae44b8c7a055-combined-ca-bundle\") pod \"neutron-7dbbf764c5-qntcx\" (UID: \"96238eea-ea50-4c05-a33c-ae44b8c7a055\") " pod="openstack/neutron-7dbbf764c5-qntcx" Dec 06 05:59:43 crc kubenswrapper[4733]: I1206 05:59:43.559808 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/96238eea-ea50-4c05-a33c-ae44b8c7a055-config\") pod \"neutron-7dbbf764c5-qntcx\" (UID: \"96238eea-ea50-4c05-a33c-ae44b8c7a055\") " pod="openstack/neutron-7dbbf764c5-qntcx" Dec 06 05:59:43 crc 
kubenswrapper[4733]: I1206 05:59:43.559884 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fg6r4\" (UniqueName: \"kubernetes.io/projected/96238eea-ea50-4c05-a33c-ae44b8c7a055-kube-api-access-fg6r4\") pod \"neutron-7dbbf764c5-qntcx\" (UID: \"96238eea-ea50-4c05-a33c-ae44b8c7a055\") " pod="openstack/neutron-7dbbf764c5-qntcx" Dec 06 05:59:43 crc kubenswrapper[4733]: I1206 05:59:43.559950 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/96238eea-ea50-4c05-a33c-ae44b8c7a055-public-tls-certs\") pod \"neutron-7dbbf764c5-qntcx\" (UID: \"96238eea-ea50-4c05-a33c-ae44b8c7a055\") " pod="openstack/neutron-7dbbf764c5-qntcx" Dec 06 05:59:43 crc kubenswrapper[4733]: I1206 05:59:43.560023 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/96238eea-ea50-4c05-a33c-ae44b8c7a055-internal-tls-certs\") pod \"neutron-7dbbf764c5-qntcx\" (UID: \"96238eea-ea50-4c05-a33c-ae44b8c7a055\") " pod="openstack/neutron-7dbbf764c5-qntcx" Dec 06 05:59:43 crc kubenswrapper[4733]: I1206 05:59:43.560145 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/96238eea-ea50-4c05-a33c-ae44b8c7a055-httpd-config\") pod \"neutron-7dbbf764c5-qntcx\" (UID: \"96238eea-ea50-4c05-a33c-ae44b8c7a055\") " pod="openstack/neutron-7dbbf764c5-qntcx" Dec 06 05:59:43 crc kubenswrapper[4733]: I1206 05:59:43.661049 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/96238eea-ea50-4c05-a33c-ae44b8c7a055-ovndb-tls-certs\") pod \"neutron-7dbbf764c5-qntcx\" (UID: \"96238eea-ea50-4c05-a33c-ae44b8c7a055\") " pod="openstack/neutron-7dbbf764c5-qntcx" Dec 06 05:59:43 crc 
kubenswrapper[4733]: I1206 05:59:43.661125 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96238eea-ea50-4c05-a33c-ae44b8c7a055-combined-ca-bundle\") pod \"neutron-7dbbf764c5-qntcx\" (UID: \"96238eea-ea50-4c05-a33c-ae44b8c7a055\") " pod="openstack/neutron-7dbbf764c5-qntcx" Dec 06 05:59:43 crc kubenswrapper[4733]: I1206 05:59:43.661164 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/96238eea-ea50-4c05-a33c-ae44b8c7a055-config\") pod \"neutron-7dbbf764c5-qntcx\" (UID: \"96238eea-ea50-4c05-a33c-ae44b8c7a055\") " pod="openstack/neutron-7dbbf764c5-qntcx" Dec 06 05:59:43 crc kubenswrapper[4733]: I1206 05:59:43.661217 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fg6r4\" (UniqueName: \"kubernetes.io/projected/96238eea-ea50-4c05-a33c-ae44b8c7a055-kube-api-access-fg6r4\") pod \"neutron-7dbbf764c5-qntcx\" (UID: \"96238eea-ea50-4c05-a33c-ae44b8c7a055\") " pod="openstack/neutron-7dbbf764c5-qntcx" Dec 06 05:59:43 crc kubenswrapper[4733]: I1206 05:59:43.661254 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/96238eea-ea50-4c05-a33c-ae44b8c7a055-public-tls-certs\") pod \"neutron-7dbbf764c5-qntcx\" (UID: \"96238eea-ea50-4c05-a33c-ae44b8c7a055\") " pod="openstack/neutron-7dbbf764c5-qntcx" Dec 06 05:59:43 crc kubenswrapper[4733]: I1206 05:59:43.661292 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/96238eea-ea50-4c05-a33c-ae44b8c7a055-internal-tls-certs\") pod \"neutron-7dbbf764c5-qntcx\" (UID: \"96238eea-ea50-4c05-a33c-ae44b8c7a055\") " pod="openstack/neutron-7dbbf764c5-qntcx" Dec 06 05:59:43 crc kubenswrapper[4733]: I1206 05:59:43.661346 4733 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/96238eea-ea50-4c05-a33c-ae44b8c7a055-httpd-config\") pod \"neutron-7dbbf764c5-qntcx\" (UID: \"96238eea-ea50-4c05-a33c-ae44b8c7a055\") " pod="openstack/neutron-7dbbf764c5-qntcx" Dec 06 05:59:43 crc kubenswrapper[4733]: I1206 05:59:43.669052 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/96238eea-ea50-4c05-a33c-ae44b8c7a055-httpd-config\") pod \"neutron-7dbbf764c5-qntcx\" (UID: \"96238eea-ea50-4c05-a33c-ae44b8c7a055\") " pod="openstack/neutron-7dbbf764c5-qntcx" Dec 06 05:59:43 crc kubenswrapper[4733]: I1206 05:59:43.669172 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/96238eea-ea50-4c05-a33c-ae44b8c7a055-internal-tls-certs\") pod \"neutron-7dbbf764c5-qntcx\" (UID: \"96238eea-ea50-4c05-a33c-ae44b8c7a055\") " pod="openstack/neutron-7dbbf764c5-qntcx" Dec 06 05:59:43 crc kubenswrapper[4733]: I1206 05:59:43.669172 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96238eea-ea50-4c05-a33c-ae44b8c7a055-combined-ca-bundle\") pod \"neutron-7dbbf764c5-qntcx\" (UID: \"96238eea-ea50-4c05-a33c-ae44b8c7a055\") " pod="openstack/neutron-7dbbf764c5-qntcx" Dec 06 05:59:43 crc kubenswrapper[4733]: I1206 05:59:43.669329 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/96238eea-ea50-4c05-a33c-ae44b8c7a055-ovndb-tls-certs\") pod \"neutron-7dbbf764c5-qntcx\" (UID: \"96238eea-ea50-4c05-a33c-ae44b8c7a055\") " pod="openstack/neutron-7dbbf764c5-qntcx" Dec 06 05:59:43 crc kubenswrapper[4733]: I1206 05:59:43.669821 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/96238eea-ea50-4c05-a33c-ae44b8c7a055-config\") pod \"neutron-7dbbf764c5-qntcx\" (UID: \"96238eea-ea50-4c05-a33c-ae44b8c7a055\") " pod="openstack/neutron-7dbbf764c5-qntcx" Dec 06 05:59:43 crc kubenswrapper[4733]: I1206 05:59:43.669869 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/96238eea-ea50-4c05-a33c-ae44b8c7a055-public-tls-certs\") pod \"neutron-7dbbf764c5-qntcx\" (UID: \"96238eea-ea50-4c05-a33c-ae44b8c7a055\") " pod="openstack/neutron-7dbbf764c5-qntcx" Dec 06 05:59:43 crc kubenswrapper[4733]: I1206 05:59:43.676881 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fg6r4\" (UniqueName: \"kubernetes.io/projected/96238eea-ea50-4c05-a33c-ae44b8c7a055-kube-api-access-fg6r4\") pod \"neutron-7dbbf764c5-qntcx\" (UID: \"96238eea-ea50-4c05-a33c-ae44b8c7a055\") " pod="openstack/neutron-7dbbf764c5-qntcx" Dec 06 05:59:43 crc kubenswrapper[4733]: I1206 05:59:43.720708 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7dbbf764c5-qntcx" Dec 06 05:59:46 crc kubenswrapper[4733]: I1206 05:59:46.584617 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8c486c6ff-fq7t7" Dec 06 05:59:46 crc kubenswrapper[4733]: I1206 05:59:46.588448 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nqsf6" Dec 06 05:59:46 crc kubenswrapper[4733]: I1206 05:59:46.615843 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nqsf6" event={"ID":"8de7ee8b-f6b5-4014-a78f-bad2e92ddd2d","Type":"ContainerDied","Data":"042a2410bc91b79069c6295a93f2441cd2a7521ae881ac085b12fa139d691a2c"} Dec 06 05:59:46 crc kubenswrapper[4733]: I1206 05:59:46.615917 4733 scope.go:117] "RemoveContainer" containerID="5dedf9f117dc352810d3da1ae4ee162c7e6dd9af333ee62950750576d5ef7f8c" Dec 06 05:59:46 crc kubenswrapper[4733]: I1206 05:59:46.616094 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nqsf6" Dec 06 05:59:46 crc kubenswrapper[4733]: I1206 05:59:46.620999 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8c486c6ff-fq7t7" event={"ID":"2fff7d32-f796-45d1-a13b-1b2286e593c3","Type":"ContainerDied","Data":"a96735a2a975970b6e1a1c1e61e0305541aaa1419d360ba5031f88c5e2d16585"} Dec 06 05:59:46 crc kubenswrapper[4733]: I1206 05:59:46.621096 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8c486c6ff-fq7t7" Dec 06 05:59:46 crc kubenswrapper[4733]: I1206 05:59:46.727614 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8de7ee8b-f6b5-4014-a78f-bad2e92ddd2d-utilities\") pod \"8de7ee8b-f6b5-4014-a78f-bad2e92ddd2d\" (UID: \"8de7ee8b-f6b5-4014-a78f-bad2e92ddd2d\") " Dec 06 05:59:46 crc kubenswrapper[4733]: I1206 05:59:46.727701 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cv766\" (UniqueName: \"kubernetes.io/projected/8de7ee8b-f6b5-4014-a78f-bad2e92ddd2d-kube-api-access-cv766\") pod \"8de7ee8b-f6b5-4014-a78f-bad2e92ddd2d\" (UID: \"8de7ee8b-f6b5-4014-a78f-bad2e92ddd2d\") " Dec 06 05:59:46 crc kubenswrapper[4733]: I1206 05:59:46.727786 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqgd7\" (UniqueName: \"kubernetes.io/projected/2fff7d32-f796-45d1-a13b-1b2286e593c3-kube-api-access-wqgd7\") pod \"2fff7d32-f796-45d1-a13b-1b2286e593c3\" (UID: \"2fff7d32-f796-45d1-a13b-1b2286e593c3\") " Dec 06 05:59:46 crc kubenswrapper[4733]: I1206 05:59:46.727832 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2fff7d32-f796-45d1-a13b-1b2286e593c3-ovsdbserver-nb\") pod \"2fff7d32-f796-45d1-a13b-1b2286e593c3\" (UID: \"2fff7d32-f796-45d1-a13b-1b2286e593c3\") " Dec 06 05:59:46 crc kubenswrapper[4733]: I1206 05:59:46.727917 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8de7ee8b-f6b5-4014-a78f-bad2e92ddd2d-catalog-content\") pod \"8de7ee8b-f6b5-4014-a78f-bad2e92ddd2d\" (UID: \"8de7ee8b-f6b5-4014-a78f-bad2e92ddd2d\") " Dec 06 05:59:46 crc kubenswrapper[4733]: I1206 05:59:46.727985 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2fff7d32-f796-45d1-a13b-1b2286e593c3-dns-swift-storage-0\") pod \"2fff7d32-f796-45d1-a13b-1b2286e593c3\" (UID: \"2fff7d32-f796-45d1-a13b-1b2286e593c3\") " Dec 06 05:59:46 crc kubenswrapper[4733]: I1206 05:59:46.728155 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2fff7d32-f796-45d1-a13b-1b2286e593c3-ovsdbserver-sb\") pod \"2fff7d32-f796-45d1-a13b-1b2286e593c3\" (UID: \"2fff7d32-f796-45d1-a13b-1b2286e593c3\") " Dec 06 05:59:46 crc kubenswrapper[4733]: I1206 05:59:46.728215 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fff7d32-f796-45d1-a13b-1b2286e593c3-config\") pod \"2fff7d32-f796-45d1-a13b-1b2286e593c3\" (UID: \"2fff7d32-f796-45d1-a13b-1b2286e593c3\") " Dec 06 05:59:46 crc kubenswrapper[4733]: I1206 05:59:46.728244 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2fff7d32-f796-45d1-a13b-1b2286e593c3-dns-svc\") pod \"2fff7d32-f796-45d1-a13b-1b2286e593c3\" (UID: \"2fff7d32-f796-45d1-a13b-1b2286e593c3\") " Dec 06 05:59:46 crc kubenswrapper[4733]: I1206 05:59:46.729244 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8de7ee8b-f6b5-4014-a78f-bad2e92ddd2d-utilities" (OuterVolumeSpecName: "utilities") pod "8de7ee8b-f6b5-4014-a78f-bad2e92ddd2d" (UID: "8de7ee8b-f6b5-4014-a78f-bad2e92ddd2d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 05:59:46 crc kubenswrapper[4733]: I1206 05:59:46.743781 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fff7d32-f796-45d1-a13b-1b2286e593c3-kube-api-access-wqgd7" (OuterVolumeSpecName: "kube-api-access-wqgd7") pod "2fff7d32-f796-45d1-a13b-1b2286e593c3" (UID: "2fff7d32-f796-45d1-a13b-1b2286e593c3"). InnerVolumeSpecName "kube-api-access-wqgd7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:59:46 crc kubenswrapper[4733]: I1206 05:59:46.743847 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8de7ee8b-f6b5-4014-a78f-bad2e92ddd2d-kube-api-access-cv766" (OuterVolumeSpecName: "kube-api-access-cv766") pod "8de7ee8b-f6b5-4014-a78f-bad2e92ddd2d" (UID: "8de7ee8b-f6b5-4014-a78f-bad2e92ddd2d"). InnerVolumeSpecName "kube-api-access-cv766". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:59:46 crc kubenswrapper[4733]: I1206 05:59:46.763156 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fff7d32-f796-45d1-a13b-1b2286e593c3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2fff7d32-f796-45d1-a13b-1b2286e593c3" (UID: "2fff7d32-f796-45d1-a13b-1b2286e593c3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:59:46 crc kubenswrapper[4733]: I1206 05:59:46.765802 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fff7d32-f796-45d1-a13b-1b2286e593c3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2fff7d32-f796-45d1-a13b-1b2286e593c3" (UID: "2fff7d32-f796-45d1-a13b-1b2286e593c3"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:59:46 crc kubenswrapper[4733]: I1206 05:59:46.768767 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fff7d32-f796-45d1-a13b-1b2286e593c3-config" (OuterVolumeSpecName: "config") pod "2fff7d32-f796-45d1-a13b-1b2286e593c3" (UID: "2fff7d32-f796-45d1-a13b-1b2286e593c3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:59:46 crc kubenswrapper[4733]: I1206 05:59:46.776679 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fff7d32-f796-45d1-a13b-1b2286e593c3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2fff7d32-f796-45d1-a13b-1b2286e593c3" (UID: "2fff7d32-f796-45d1-a13b-1b2286e593c3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:59:46 crc kubenswrapper[4733]: I1206 05:59:46.776878 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fff7d32-f796-45d1-a13b-1b2286e593c3-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2fff7d32-f796-45d1-a13b-1b2286e593c3" (UID: "2fff7d32-f796-45d1-a13b-1b2286e593c3"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:59:46 crc kubenswrapper[4733]: I1206 05:59:46.810142 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8de7ee8b-f6b5-4014-a78f-bad2e92ddd2d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8de7ee8b-f6b5-4014-a78f-bad2e92ddd2d" (UID: "8de7ee8b-f6b5-4014-a78f-bad2e92ddd2d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 05:59:46 crc kubenswrapper[4733]: I1206 05:59:46.831665 4733 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2fff7d32-f796-45d1-a13b-1b2286e593c3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:46 crc kubenswrapper[4733]: I1206 05:59:46.831698 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fff7d32-f796-45d1-a13b-1b2286e593c3-config\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:46 crc kubenswrapper[4733]: I1206 05:59:46.831710 4733 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2fff7d32-f796-45d1-a13b-1b2286e593c3-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:46 crc kubenswrapper[4733]: I1206 05:59:46.831722 4733 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8de7ee8b-f6b5-4014-a78f-bad2e92ddd2d-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:46 crc kubenswrapper[4733]: I1206 05:59:46.831733 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cv766\" (UniqueName: \"kubernetes.io/projected/8de7ee8b-f6b5-4014-a78f-bad2e92ddd2d-kube-api-access-cv766\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:46 crc kubenswrapper[4733]: I1206 05:59:46.831746 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqgd7\" (UniqueName: \"kubernetes.io/projected/2fff7d32-f796-45d1-a13b-1b2286e593c3-kube-api-access-wqgd7\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:46 crc kubenswrapper[4733]: I1206 05:59:46.831755 4733 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2fff7d32-f796-45d1-a13b-1b2286e593c3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:46 crc kubenswrapper[4733]: I1206 05:59:46.831766 4733 
reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8de7ee8b-f6b5-4014-a78f-bad2e92ddd2d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:46 crc kubenswrapper[4733]: I1206 05:59:46.831775 4733 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2fff7d32-f796-45d1-a13b-1b2286e593c3-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:46 crc kubenswrapper[4733]: I1206 05:59:46.967967 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nqsf6"] Dec 06 05:59:46 crc kubenswrapper[4733]: I1206 05:59:46.974950 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nqsf6"] Dec 06 05:59:46 crc kubenswrapper[4733]: I1206 05:59:46.979868 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8c486c6ff-fq7t7"] Dec 06 05:59:46 crc kubenswrapper[4733]: I1206 05:59:46.984619 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8c486c6ff-fq7t7"] Dec 06 05:59:47 crc kubenswrapper[4733]: I1206 05:59:47.476602 4733 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-8c486c6ff-fq7t7" podUID="2fff7d32-f796-45d1-a13b-1b2286e593c3" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.120:5353: i/o timeout" Dec 06 05:59:47 crc kubenswrapper[4733]: I1206 05:59:47.516664 4733 scope.go:117] "RemoveContainer" containerID="9b88afa520e6a18af4521bad6826cba138fef0975cd25ea70754dbc3b42dbf7a" Dec 06 05:59:47 crc kubenswrapper[4733]: I1206 05:59:47.713789 4733 scope.go:117] "RemoveContainer" containerID="2db9451a4cc2af862f2b764579a655f227056d2413cb46a0f18424193874dcad" Dec 06 05:59:47 crc kubenswrapper[4733]: I1206 05:59:47.747493 4733 scope.go:117] "RemoveContainer" containerID="324f7103a1749cb08562ac993230e58fa371d88936dff071694084ec26c4f1d2" Dec 06 
05:59:47 crc kubenswrapper[4733]: I1206 05:59:47.777482 4733 scope.go:117] "RemoveContainer" containerID="272c4dd152d4b3a9076680418d460ea77947bd49bcd7bad05b5bbb147d575c09" Dec 06 05:59:47 crc kubenswrapper[4733]: I1206 05:59:47.899317 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-579bfdb9c5-65jql"] Dec 06 05:59:47 crc kubenswrapper[4733]: E1206 05:59:47.916651 4733 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod706be213_5f03_414a_bdeb_98af90de90f4.slice/crio-423a81d606ec6e867b74e7b758975d267201cff97ab41331b44d8e712972e2c0\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod706be213_5f03_414a_bdeb_98af90de90f4.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc8e93b6_7230_41f1_98f5_18b252d0d724.slice\": RecentStats: unable to find data in memory cache]" Dec 06 05:59:48 crc kubenswrapper[4733]: I1206 05:59:48.053612 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c769b985f-kk46z"] Dec 06 05:59:48 crc kubenswrapper[4733]: I1206 05:59:48.071952 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-4twth"] Dec 06 05:59:48 crc kubenswrapper[4733]: I1206 05:59:48.119677 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-544f978b4d-7s676"] Dec 06 05:59:48 crc kubenswrapper[4733]: I1206 05:59:48.261853 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 05:59:48 crc kubenswrapper[4733]: W1206 05:59:48.273264 4733 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda26939cd_f665_482c_b577_efab49fe0123.slice/crio-c7be0d61bf81df36a11662339ebe6d8efa94d15f83c2b0b263861ae0bcd91d30 WatchSource:0}: Error finding container c7be0d61bf81df36a11662339ebe6d8efa94d15f83c2b0b263861ae0bcd91d30: Status 404 returned error can't find the container with id c7be0d61bf81df36a11662339ebe6d8efa94d15f83c2b0b263861ae0bcd91d30 Dec 06 05:59:48 crc kubenswrapper[4733]: I1206 05:59:48.331328 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7dbbf764c5-qntcx"] Dec 06 05:59:48 crc kubenswrapper[4733]: W1206 05:59:48.419926 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96238eea_ea50_4c05_a33c_ae44b8c7a055.slice/crio-8aa8486e5b205d3ca019f6950afab34ac875007caa41792a4079091ea314bb3a WatchSource:0}: Error finding container 8aa8486e5b205d3ca019f6950afab34ac875007caa41792a4079091ea314bb3a: Status 404 returned error can't find the container with id 8aa8486e5b205d3ca019f6950afab34ac875007caa41792a4079091ea314bb3a Dec 06 05:59:48 crc kubenswrapper[4733]: I1206 05:59:48.501362 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fff7d32-f796-45d1-a13b-1b2286e593c3" path="/var/lib/kubelet/pods/2fff7d32-f796-45d1-a13b-1b2286e593c3/volumes" Dec 06 05:59:48 crc kubenswrapper[4733]: I1206 05:59:48.501957 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8de7ee8b-f6b5-4014-a78f-bad2e92ddd2d" path="/var/lib/kubelet/pods/8de7ee8b-f6b5-4014-a78f-bad2e92ddd2d/volumes" Dec 06 05:59:48 crc kubenswrapper[4733]: I1206 05:59:48.670083 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-vfkdm" event={"ID":"3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c","Type":"ContainerStarted","Data":"5507ac46044992ff14b7680c2d1b081f93275189c934a95c96b480bc08b4e154"} Dec 06 05:59:48 crc kubenswrapper[4733]: I1206 05:59:48.674496 4733 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-n8dgk" event={"ID":"363efbb6-18f2-440b-bffd-f64dee6a3af7","Type":"ContainerStarted","Data":"d1f5a6ed6b68bb0115a0cd1a5a5f9ab57234e6da84cef20890527aef598c0df9"} Dec 06 05:59:48 crc kubenswrapper[4733]: I1206 05:59:48.678627 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-8ffc7" event={"ID":"707d8771-4a40-42e7-b4bd-c2a9090126f0","Type":"ContainerStarted","Data":"2d7bf11943f17abd016cd025b1a22748e5ae393cd0598a7b1659f8b3a6fa6f3c"} Dec 06 05:59:48 crc kubenswrapper[4733]: I1206 05:59:48.683992 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-vfkdm" podStartSLOduration=2.994011961 podStartE2EDuration="25.6839753s" podCreationTimestamp="2025-12-06 05:59:23 +0000 UTC" firstStartedPulling="2025-12-06 05:59:24.891591839 +0000 UTC m=+948.756802951" lastFinishedPulling="2025-12-06 05:59:47.581555179 +0000 UTC m=+971.446766290" observedRunningTime="2025-12-06 05:59:48.681868268 +0000 UTC m=+972.547079379" watchObservedRunningTime="2025-12-06 05:59:48.6839753 +0000 UTC m=+972.549186411" Dec 06 05:59:48 crc kubenswrapper[4733]: I1206 05:59:48.685715 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"361a8a1d-f083-427f-a625-eca6a714b768","Type":"ContainerStarted","Data":"f47832e7bb750734a6e152882f9c1bb2fccb93da65bca8c763810c078cebc189"} Dec 06 05:59:48 crc kubenswrapper[4733]: I1206 05:59:48.690066 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4twth" event={"ID":"6a886728-ea9b-485c-844d-964614315b0d","Type":"ContainerStarted","Data":"f3c5a5a99dadaafae87ae1fdd011992e264df1cfa144803ecf8b4458d2785aad"} Dec 06 05:59:48 crc kubenswrapper[4733]: I1206 05:59:48.690133 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4twth" 
event={"ID":"6a886728-ea9b-485c-844d-964614315b0d","Type":"ContainerStarted","Data":"db219d663b591d70eb8e58d2b83b123b68aae8961a80c2a2a4c37d858b022227"} Dec 06 05:59:48 crc kubenswrapper[4733]: I1206 05:59:48.697446 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-544f978b4d-7s676" event={"ID":"2552f10d-828b-4996-a292-32499f7d24cf","Type":"ContainerStarted","Data":"15b1b8fdcbe45ed49e6b8a618a7c981bb508b621da5f820c159554b8feea9d7f"} Dec 06 05:59:48 crc kubenswrapper[4733]: I1206 05:59:48.697495 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-544f978b4d-7s676" event={"ID":"2552f10d-828b-4996-a292-32499f7d24cf","Type":"ContainerStarted","Data":"a55cada98f98ee4f3783ee6ea5cecf4af2ecd70dadf320423655a4be9f340b54"} Dec 06 05:59:48 crc kubenswrapper[4733]: I1206 05:59:48.699700 4733 generic.go:334] "Generic (PLEG): container finished" podID="af780532-6391-4d7f-93d4-fb966d5e0434" containerID="5bec6fc6e2d2d2a6862b47fcefbe04567629145567dfd5a8f3621cf8ee008b02" exitCode=0 Dec 06 05:59:48 crc kubenswrapper[4733]: I1206 05:59:48.699783 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c769b985f-kk46z" event={"ID":"af780532-6391-4d7f-93d4-fb966d5e0434","Type":"ContainerDied","Data":"5bec6fc6e2d2d2a6862b47fcefbe04567629145567dfd5a8f3621cf8ee008b02"} Dec 06 05:59:48 crc kubenswrapper[4733]: I1206 05:59:48.699806 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c769b985f-kk46z" event={"ID":"af780532-6391-4d7f-93d4-fb966d5e0434","Type":"ContainerStarted","Data":"f38ff97edf3c3dde287bf672753b7071afffbe6d49f94a7b916062b39e9eea71"} Dec 06 05:59:48 crc kubenswrapper[4733]: I1206 05:59:48.706368 4733 generic.go:334] "Generic (PLEG): container finished" podID="7f723d96-b1b3-4395-a2d0-577550dad098" containerID="0531460f6116793d7c88cf3406fcc49bb08bf49dd2c7f1d992d02afebdf8cf41" exitCode=0 Dec 06 05:59:48 crc kubenswrapper[4733]: I1206 05:59:48.706445 4733 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-579bfdb9c5-65jql" event={"ID":"7f723d96-b1b3-4395-a2d0-577550dad098","Type":"ContainerDied","Data":"0531460f6116793d7c88cf3406fcc49bb08bf49dd2c7f1d992d02afebdf8cf41"} Dec 06 05:59:48 crc kubenswrapper[4733]: I1206 05:59:48.706469 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-579bfdb9c5-65jql" event={"ID":"7f723d96-b1b3-4395-a2d0-577550dad098","Type":"ContainerStarted","Data":"c7e916356385746e05ec92dfc66b1db2434424e7f612638a42930c0c707aa7aa"} Dec 06 05:59:48 crc kubenswrapper[4733]: I1206 05:59:48.706796 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-n8dgk" podStartSLOduration=3.4579381 podStartE2EDuration="25.706781034s" podCreationTimestamp="2025-12-06 05:59:23 +0000 UTC" firstStartedPulling="2025-12-06 05:59:25.157633706 +0000 UTC m=+949.022844817" lastFinishedPulling="2025-12-06 05:59:47.406476641 +0000 UTC m=+971.271687751" observedRunningTime="2025-12-06 05:59:48.694201589 +0000 UTC m=+972.559412701" watchObservedRunningTime="2025-12-06 05:59:48.706781034 +0000 UTC m=+972.571992146" Dec 06 05:59:48 crc kubenswrapper[4733]: I1206 05:59:48.714774 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-8ffc7" podStartSLOduration=4.382601505 podStartE2EDuration="25.714763605s" podCreationTimestamp="2025-12-06 05:59:23 +0000 UTC" firstStartedPulling="2025-12-06 05:59:25.154518679 +0000 UTC m=+949.019729790" lastFinishedPulling="2025-12-06 05:59:46.486680788 +0000 UTC m=+970.351891890" observedRunningTime="2025-12-06 05:59:48.712979229 +0000 UTC m=+972.578190340" watchObservedRunningTime="2025-12-06 05:59:48.714763605 +0000 UTC m=+972.579974716" Dec 06 05:59:48 crc kubenswrapper[4733]: I1206 05:59:48.715032 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7dbbf764c5-qntcx" 
event={"ID":"96238eea-ea50-4c05-a33c-ae44b8c7a055","Type":"ContainerStarted","Data":"8aa8486e5b205d3ca019f6950afab34ac875007caa41792a4079091ea314bb3a"} Dec 06 05:59:48 crc kubenswrapper[4733]: I1206 05:59:48.718766 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a26939cd-f665-482c-b577-efab49fe0123","Type":"ContainerStarted","Data":"c7be0d61bf81df36a11662339ebe6d8efa94d15f83c2b0b263861ae0bcd91d30"} Dec 06 05:59:48 crc kubenswrapper[4733]: I1206 05:59:48.727502 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-4twth" podStartSLOduration=11.727395167 podStartE2EDuration="11.727395167s" podCreationTimestamp="2025-12-06 05:59:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:59:48.725278456 +0000 UTC m=+972.590489568" watchObservedRunningTime="2025-12-06 05:59:48.727395167 +0000 UTC m=+972.592606278" Dec 06 05:59:49 crc kubenswrapper[4733]: I1206 05:59:49.020483 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-579bfdb9c5-65jql" Dec 06 05:59:49 crc kubenswrapper[4733]: I1206 05:59:49.138839 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 05:59:49 crc kubenswrapper[4733]: I1206 05:59:49.190136 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7f723d96-b1b3-4395-a2d0-577550dad098-ovsdbserver-nb\") pod \"7f723d96-b1b3-4395-a2d0-577550dad098\" (UID: \"7f723d96-b1b3-4395-a2d0-577550dad098\") " Dec 06 05:59:49 crc kubenswrapper[4733]: I1206 05:59:49.190176 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7f723d96-b1b3-4395-a2d0-577550dad098-dns-swift-storage-0\") pod \"7f723d96-b1b3-4395-a2d0-577550dad098\" (UID: \"7f723d96-b1b3-4395-a2d0-577550dad098\") " Dec 06 05:59:49 crc kubenswrapper[4733]: I1206 05:59:49.190355 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9wdpj\" (UniqueName: \"kubernetes.io/projected/7f723d96-b1b3-4395-a2d0-577550dad098-kube-api-access-9wdpj\") pod \"7f723d96-b1b3-4395-a2d0-577550dad098\" (UID: \"7f723d96-b1b3-4395-a2d0-577550dad098\") " Dec 06 05:59:49 crc kubenswrapper[4733]: I1206 05:59:49.190390 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f723d96-b1b3-4395-a2d0-577550dad098-config\") pod \"7f723d96-b1b3-4395-a2d0-577550dad098\" (UID: \"7f723d96-b1b3-4395-a2d0-577550dad098\") " Dec 06 05:59:49 crc kubenswrapper[4733]: I1206 05:59:49.190926 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f723d96-b1b3-4395-a2d0-577550dad098-dns-svc\") pod \"7f723d96-b1b3-4395-a2d0-577550dad098\" (UID: \"7f723d96-b1b3-4395-a2d0-577550dad098\") " 
Dec 06 05:59:49 crc kubenswrapper[4733]: I1206 05:59:49.190960 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7f723d96-b1b3-4395-a2d0-577550dad098-ovsdbserver-sb\") pod \"7f723d96-b1b3-4395-a2d0-577550dad098\" (UID: \"7f723d96-b1b3-4395-a2d0-577550dad098\") " Dec 06 05:59:49 crc kubenswrapper[4733]: I1206 05:59:49.202444 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f723d96-b1b3-4395-a2d0-577550dad098-kube-api-access-9wdpj" (OuterVolumeSpecName: "kube-api-access-9wdpj") pod "7f723d96-b1b3-4395-a2d0-577550dad098" (UID: "7f723d96-b1b3-4395-a2d0-577550dad098"). InnerVolumeSpecName "kube-api-access-9wdpj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:59:49 crc kubenswrapper[4733]: I1206 05:59:49.216927 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f723d96-b1b3-4395-a2d0-577550dad098-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7f723d96-b1b3-4395-a2d0-577550dad098" (UID: "7f723d96-b1b3-4395-a2d0-577550dad098"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:59:49 crc kubenswrapper[4733]: I1206 05:59:49.235460 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f723d96-b1b3-4395-a2d0-577550dad098-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7f723d96-b1b3-4395-a2d0-577550dad098" (UID: "7f723d96-b1b3-4395-a2d0-577550dad098"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:59:49 crc kubenswrapper[4733]: I1206 05:59:49.237819 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f723d96-b1b3-4395-a2d0-577550dad098-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7f723d96-b1b3-4395-a2d0-577550dad098" (UID: "7f723d96-b1b3-4395-a2d0-577550dad098"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:59:49 crc kubenswrapper[4733]: I1206 05:59:49.238317 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f723d96-b1b3-4395-a2d0-577550dad098-config" (OuterVolumeSpecName: "config") pod "7f723d96-b1b3-4395-a2d0-577550dad098" (UID: "7f723d96-b1b3-4395-a2d0-577550dad098"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:59:49 crc kubenswrapper[4733]: I1206 05:59:49.239117 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f723d96-b1b3-4395-a2d0-577550dad098-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7f723d96-b1b3-4395-a2d0-577550dad098" (UID: "7f723d96-b1b3-4395-a2d0-577550dad098"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:59:49 crc kubenswrapper[4733]: I1206 05:59:49.293159 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9wdpj\" (UniqueName: \"kubernetes.io/projected/7f723d96-b1b3-4395-a2d0-577550dad098-kube-api-access-9wdpj\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:49 crc kubenswrapper[4733]: I1206 05:59:49.293194 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f723d96-b1b3-4395-a2d0-577550dad098-config\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:49 crc kubenswrapper[4733]: I1206 05:59:49.293204 4733 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f723d96-b1b3-4395-a2d0-577550dad098-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:49 crc kubenswrapper[4733]: I1206 05:59:49.293277 4733 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7f723d96-b1b3-4395-a2d0-577550dad098-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:49 crc kubenswrapper[4733]: I1206 05:59:49.293289 4733 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7f723d96-b1b3-4395-a2d0-577550dad098-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:49 crc kubenswrapper[4733]: I1206 05:59:49.293314 4733 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7f723d96-b1b3-4395-a2d0-577550dad098-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:49 crc kubenswrapper[4733]: I1206 05:59:49.732570 4733 generic.go:334] "Generic (PLEG): container finished" podID="363efbb6-18f2-440b-bffd-f64dee6a3af7" containerID="d1f5a6ed6b68bb0115a0cd1a5a5f9ab57234e6da84cef20890527aef598c0df9" exitCode=0 Dec 06 05:59:49 crc kubenswrapper[4733]: I1206 05:59:49.732660 4733 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-n8dgk" event={"ID":"363efbb6-18f2-440b-bffd-f64dee6a3af7","Type":"ContainerDied","Data":"d1f5a6ed6b68bb0115a0cd1a5a5f9ab57234e6da84cef20890527aef598c0df9"} Dec 06 05:59:49 crc kubenswrapper[4733]: I1206 05:59:49.734837 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7dbbf764c5-qntcx" event={"ID":"96238eea-ea50-4c05-a33c-ae44b8c7a055","Type":"ContainerStarted","Data":"7791f96a6dbf6dd275b15ab53dede1bf38f5c1f60dd6d72e1a450d5b83bfaccf"} Dec 06 05:59:49 crc kubenswrapper[4733]: I1206 05:59:49.734915 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7dbbf764c5-qntcx" event={"ID":"96238eea-ea50-4c05-a33c-ae44b8c7a055","Type":"ContainerStarted","Data":"f152a87a87ad4dceee404f25183f0d7d287504fde5897f322a317c9ef995bf20"} Dec 06 05:59:49 crc kubenswrapper[4733]: I1206 05:59:49.734947 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7dbbf764c5-qntcx" Dec 06 05:59:49 crc kubenswrapper[4733]: I1206 05:59:49.736883 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c769b985f-kk46z" event={"ID":"af780532-6391-4d7f-93d4-fb966d5e0434","Type":"ContainerStarted","Data":"58f4d631151449a9097eac46cc919892a61cf8d2df5c179d8d7d15516896c2cb"} Dec 06 05:59:49 crc kubenswrapper[4733]: I1206 05:59:49.737021 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c769b985f-kk46z" Dec 06 05:59:49 crc kubenswrapper[4733]: I1206 05:59:49.739025 4733 generic.go:334] "Generic (PLEG): container finished" podID="707d8771-4a40-42e7-b4bd-c2a9090126f0" containerID="2d7bf11943f17abd016cd025b1a22748e5ae393cd0598a7b1659f8b3a6fa6f3c" exitCode=0 Dec 06 05:59:49 crc kubenswrapper[4733]: I1206 05:59:49.739074 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-8ffc7" 
event={"ID":"707d8771-4a40-42e7-b4bd-c2a9090126f0","Type":"ContainerDied","Data":"2d7bf11943f17abd016cd025b1a22748e5ae393cd0598a7b1659f8b3a6fa6f3c"} Dec 06 05:59:49 crc kubenswrapper[4733]: I1206 05:59:49.741735 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-544f978b4d-7s676" event={"ID":"2552f10d-828b-4996-a292-32499f7d24cf","Type":"ContainerStarted","Data":"aac218172634004d843b3bfdba9e75f9bcab81037e8b068bd921180157eb91c3"} Dec 06 05:59:49 crc kubenswrapper[4733]: I1206 05:59:49.741870 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-544f978b4d-7s676" Dec 06 05:59:49 crc kubenswrapper[4733]: I1206 05:59:49.746554 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a26939cd-f665-482c-b577-efab49fe0123","Type":"ContainerStarted","Data":"02a5dbc852583f7f46e16efda9fc4232fc3e68d2d9b768cb0fed32e82cb6dc6e"} Dec 06 05:59:49 crc kubenswrapper[4733]: I1206 05:59:49.746586 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a26939cd-f665-482c-b577-efab49fe0123","Type":"ContainerStarted","Data":"0b3c2cf15eac6795a7f23ce6f2547fd9a09695892cb668a3a7b47a51fe5dc03d"} Dec 06 05:59:49 crc kubenswrapper[4733]: I1206 05:59:49.746687 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="a26939cd-f665-482c-b577-efab49fe0123" containerName="glance-log" containerID="cri-o://0b3c2cf15eac6795a7f23ce6f2547fd9a09695892cb668a3a7b47a51fe5dc03d" gracePeriod=30 Dec 06 05:59:49 crc kubenswrapper[4733]: I1206 05:59:49.746913 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="a26939cd-f665-482c-b577-efab49fe0123" containerName="glance-httpd" containerID="cri-o://02a5dbc852583f7f46e16efda9fc4232fc3e68d2d9b768cb0fed32e82cb6dc6e" gracePeriod=30 Dec 06 
05:59:49 crc kubenswrapper[4733]: I1206 05:59:49.755224 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-579bfdb9c5-65jql" event={"ID":"7f723d96-b1b3-4395-a2d0-577550dad098","Type":"ContainerDied","Data":"c7e916356385746e05ec92dfc66b1db2434424e7f612638a42930c0c707aa7aa"} Dec 06 05:59:49 crc kubenswrapper[4733]: I1206 05:59:49.755265 4733 scope.go:117] "RemoveContainer" containerID="0531460f6116793d7c88cf3406fcc49bb08bf49dd2c7f1d992d02afebdf8cf41" Dec 06 05:59:49 crc kubenswrapper[4733]: I1206 05:59:49.755422 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-579bfdb9c5-65jql" Dec 06 05:59:49 crc kubenswrapper[4733]: I1206 05:59:49.770641 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c769b985f-kk46z" podStartSLOduration=10.77062765 podStartE2EDuration="10.77062765s" podCreationTimestamp="2025-12-06 05:59:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:59:49.762409998 +0000 UTC m=+973.627621108" watchObservedRunningTime="2025-12-06 05:59:49.77062765 +0000 UTC m=+973.635838760" Dec 06 05:59:49 crc kubenswrapper[4733]: I1206 05:59:49.789404 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-544f978b4d-7s676" podStartSLOduration=9.789390682 podStartE2EDuration="9.789390682s" podCreationTimestamp="2025-12-06 05:59:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:59:49.779237691 +0000 UTC m=+973.644448802" watchObservedRunningTime="2025-12-06 05:59:49.789390682 +0000 UTC m=+973.654601794" Dec 06 05:59:49 crc kubenswrapper[4733]: I1206 05:59:49.789725 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"61d7ceb4-c6a6-407f-b208-693d62a8b76b","Type":"ContainerStarted","Data":"71323a7b6f7b73beb72c56131f80d6ccf950f993fbdc3b0237f46280d00b8029"} Dec 06 05:59:49 crc kubenswrapper[4733]: I1206 05:59:49.790260 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"61d7ceb4-c6a6-407f-b208-693d62a8b76b","Type":"ContainerStarted","Data":"ede8779286e464c2720099671184095528dd519d6dc428300e6e0ed284ac5d96"} Dec 06 05:59:49 crc kubenswrapper[4733]: I1206 05:59:49.821638 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=12.821621991 podStartE2EDuration="12.821621991s" podCreationTimestamp="2025-12-06 05:59:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:59:49.800223833 +0000 UTC m=+973.665434945" watchObservedRunningTime="2025-12-06 05:59:49.821621991 +0000 UTC m=+973.686833102" Dec 06 05:59:49 crc kubenswrapper[4733]: I1206 05:59:49.822410 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7dbbf764c5-qntcx" podStartSLOduration=6.822404412 podStartE2EDuration="6.822404412s" podCreationTimestamp="2025-12-06 05:59:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:59:49.820701459 +0000 UTC m=+973.685912571" watchObservedRunningTime="2025-12-06 05:59:49.822404412 +0000 UTC m=+973.687615523" Dec 06 05:59:49 crc kubenswrapper[4733]: I1206 05:59:49.903734 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-579bfdb9c5-65jql"] Dec 06 05:59:49 crc kubenswrapper[4733]: I1206 05:59:49.906012 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-579bfdb9c5-65jql"] Dec 06 05:59:50 crc kubenswrapper[4733]: I1206 05:59:50.498320 4733 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f723d96-b1b3-4395-a2d0-577550dad098" path="/var/lib/kubelet/pods/7f723d96-b1b3-4395-a2d0-577550dad098/volumes" Dec 06 05:59:50 crc kubenswrapper[4733]: I1206 05:59:50.652376 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 06 05:59:50 crc kubenswrapper[4733]: I1206 05:59:50.720269 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"a26939cd-f665-482c-b577-efab49fe0123\" (UID: \"a26939cd-f665-482c-b577-efab49fe0123\") " Dec 06 05:59:50 crc kubenswrapper[4733]: I1206 05:59:50.720360 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a26939cd-f665-482c-b577-efab49fe0123-httpd-run\") pod \"a26939cd-f665-482c-b577-efab49fe0123\" (UID: \"a26939cd-f665-482c-b577-efab49fe0123\") " Dec 06 05:59:50 crc kubenswrapper[4733]: I1206 05:59:50.720440 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92wd6\" (UniqueName: \"kubernetes.io/projected/a26939cd-f665-482c-b577-efab49fe0123-kube-api-access-92wd6\") pod \"a26939cd-f665-482c-b577-efab49fe0123\" (UID: \"a26939cd-f665-482c-b577-efab49fe0123\") " Dec 06 05:59:50 crc kubenswrapper[4733]: I1206 05:59:50.720463 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a26939cd-f665-482c-b577-efab49fe0123-logs\") pod \"a26939cd-f665-482c-b577-efab49fe0123\" (UID: \"a26939cd-f665-482c-b577-efab49fe0123\") " Dec 06 05:59:50 crc kubenswrapper[4733]: I1206 05:59:50.720495 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a26939cd-f665-482c-b577-efab49fe0123-scripts\") pod 
\"a26939cd-f665-482c-b577-efab49fe0123\" (UID: \"a26939cd-f665-482c-b577-efab49fe0123\") " Dec 06 05:59:50 crc kubenswrapper[4733]: I1206 05:59:50.720531 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a26939cd-f665-482c-b577-efab49fe0123-config-data\") pod \"a26939cd-f665-482c-b577-efab49fe0123\" (UID: \"a26939cd-f665-482c-b577-efab49fe0123\") " Dec 06 05:59:50 crc kubenswrapper[4733]: I1206 05:59:50.721160 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a26939cd-f665-482c-b577-efab49fe0123-combined-ca-bundle\") pod \"a26939cd-f665-482c-b577-efab49fe0123\" (UID: \"a26939cd-f665-482c-b577-efab49fe0123\") " Dec 06 05:59:50 crc kubenswrapper[4733]: I1206 05:59:50.720745 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a26939cd-f665-482c-b577-efab49fe0123-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a26939cd-f665-482c-b577-efab49fe0123" (UID: "a26939cd-f665-482c-b577-efab49fe0123"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 05:59:50 crc kubenswrapper[4733]: I1206 05:59:50.721260 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a26939cd-f665-482c-b577-efab49fe0123-logs" (OuterVolumeSpecName: "logs") pod "a26939cd-f665-482c-b577-efab49fe0123" (UID: "a26939cd-f665-482c-b577-efab49fe0123"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 05:59:50 crc kubenswrapper[4733]: I1206 05:59:50.721577 4733 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a26939cd-f665-482c-b577-efab49fe0123-logs\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:50 crc kubenswrapper[4733]: I1206 05:59:50.721598 4733 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a26939cd-f665-482c-b577-efab49fe0123-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:50 crc kubenswrapper[4733]: I1206 05:59:50.727739 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a26939cd-f665-482c-b577-efab49fe0123-scripts" (OuterVolumeSpecName: "scripts") pod "a26939cd-f665-482c-b577-efab49fe0123" (UID: "a26939cd-f665-482c-b577-efab49fe0123"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:59:50 crc kubenswrapper[4733]: I1206 05:59:50.729795 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a26939cd-f665-482c-b577-efab49fe0123-kube-api-access-92wd6" (OuterVolumeSpecName: "kube-api-access-92wd6") pod "a26939cd-f665-482c-b577-efab49fe0123" (UID: "a26939cd-f665-482c-b577-efab49fe0123"). InnerVolumeSpecName "kube-api-access-92wd6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:59:50 crc kubenswrapper[4733]: I1206 05:59:50.745393 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "a26939cd-f665-482c-b577-efab49fe0123" (UID: "a26939cd-f665-482c-b577-efab49fe0123"). InnerVolumeSpecName "local-storage06-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 06 05:59:50 crc kubenswrapper[4733]: I1206 05:59:50.748597 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a26939cd-f665-482c-b577-efab49fe0123-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a26939cd-f665-482c-b577-efab49fe0123" (UID: "a26939cd-f665-482c-b577-efab49fe0123"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:59:50 crc kubenswrapper[4733]: I1206 05:59:50.777019 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a26939cd-f665-482c-b577-efab49fe0123-config-data" (OuterVolumeSpecName: "config-data") pod "a26939cd-f665-482c-b577-efab49fe0123" (UID: "a26939cd-f665-482c-b577-efab49fe0123"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:59:50 crc kubenswrapper[4733]: I1206 05:59:50.800536 4733 generic.go:334] "Generic (PLEG): container finished" podID="a26939cd-f665-482c-b577-efab49fe0123" containerID="02a5dbc852583f7f46e16efda9fc4232fc3e68d2d9b768cb0fed32e82cb6dc6e" exitCode=0 Dec 06 05:59:50 crc kubenswrapper[4733]: I1206 05:59:50.800583 4733 generic.go:334] "Generic (PLEG): container finished" podID="a26939cd-f665-482c-b577-efab49fe0123" containerID="0b3c2cf15eac6795a7f23ce6f2547fd9a09695892cb668a3a7b47a51fe5dc03d" exitCode=143 Dec 06 05:59:50 crc kubenswrapper[4733]: I1206 05:59:50.800651 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a26939cd-f665-482c-b577-efab49fe0123","Type":"ContainerDied","Data":"02a5dbc852583f7f46e16efda9fc4232fc3e68d2d9b768cb0fed32e82cb6dc6e"} Dec 06 05:59:50 crc kubenswrapper[4733]: I1206 05:59:50.800684 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"a26939cd-f665-482c-b577-efab49fe0123","Type":"ContainerDied","Data":"0b3c2cf15eac6795a7f23ce6f2547fd9a09695892cb668a3a7b47a51fe5dc03d"} Dec 06 05:59:50 crc kubenswrapper[4733]: I1206 05:59:50.800694 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a26939cd-f665-482c-b577-efab49fe0123","Type":"ContainerDied","Data":"c7be0d61bf81df36a11662339ebe6d8efa94d15f83c2b0b263861ae0bcd91d30"} Dec 06 05:59:50 crc kubenswrapper[4733]: I1206 05:59:50.800713 4733 scope.go:117] "RemoveContainer" containerID="02a5dbc852583f7f46e16efda9fc4232fc3e68d2d9b768cb0fed32e82cb6dc6e" Dec 06 05:59:50 crc kubenswrapper[4733]: I1206 05:59:50.801098 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 06 05:59:50 crc kubenswrapper[4733]: I1206 05:59:50.808674 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"61d7ceb4-c6a6-407f-b208-693d62a8b76b","Type":"ContainerStarted","Data":"f9c4ef11713b71ef4ea347a3c18e05cea58167801bf0713fca23a9eb05a6dbe0"} Dec 06 05:59:50 crc kubenswrapper[4733]: I1206 05:59:50.808768 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="61d7ceb4-c6a6-407f-b208-693d62a8b76b" containerName="glance-log" containerID="cri-o://71323a7b6f7b73beb72c56131f80d6ccf950f993fbdc3b0237f46280d00b8029" gracePeriod=30 Dec 06 05:59:50 crc kubenswrapper[4733]: I1206 05:59:50.808849 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="61d7ceb4-c6a6-407f-b208-693d62a8b76b" containerName="glance-httpd" containerID="cri-o://f9c4ef11713b71ef4ea347a3c18e05cea58167801bf0713fca23a9eb05a6dbe0" gracePeriod=30 Dec 06 05:59:50 crc kubenswrapper[4733]: I1206 05:59:50.818616 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"361a8a1d-f083-427f-a625-eca6a714b768","Type":"ContainerStarted","Data":"6effb9b38a055bb68c240785469d2c8a97add4ebf1489376116d3b5682ca1b19"} Dec 06 05:59:50 crc kubenswrapper[4733]: I1206 05:59:50.825931 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92wd6\" (UniqueName: \"kubernetes.io/projected/a26939cd-f665-482c-b577-efab49fe0123-kube-api-access-92wd6\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:50 crc kubenswrapper[4733]: I1206 05:59:50.825962 4733 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a26939cd-f665-482c-b577-efab49fe0123-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:50 crc kubenswrapper[4733]: I1206 05:59:50.825972 4733 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a26939cd-f665-482c-b577-efab49fe0123-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:50 crc kubenswrapper[4733]: I1206 05:59:50.825984 4733 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a26939cd-f665-482c-b577-efab49fe0123-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:50 crc kubenswrapper[4733]: I1206 05:59:50.832632 4733 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Dec 06 05:59:50 crc kubenswrapper[4733]: I1206 05:59:50.834384 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=13.834353965 podStartE2EDuration="13.834353965s" podCreationTimestamp="2025-12-06 05:59:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:59:50.832239388 +0000 UTC m=+974.697450500" 
watchObservedRunningTime="2025-12-06 05:59:50.834353965 +0000 UTC m=+974.699565086" Dec 06 05:59:50 crc kubenswrapper[4733]: I1206 05:59:50.860486 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 05:59:50 crc kubenswrapper[4733]: I1206 05:59:50.875315 4733 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Dec 06 05:59:50 crc kubenswrapper[4733]: I1206 05:59:50.875593 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 05:59:50 crc kubenswrapper[4733]: I1206 05:59:50.897654 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 05:59:50 crc kubenswrapper[4733]: E1206 05:59:50.899291 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8de7ee8b-f6b5-4014-a78f-bad2e92ddd2d" containerName="registry-server" Dec 06 05:59:50 crc kubenswrapper[4733]: I1206 05:59:50.899346 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="8de7ee8b-f6b5-4014-a78f-bad2e92ddd2d" containerName="registry-server" Dec 06 05:59:50 crc kubenswrapper[4733]: E1206 05:59:50.899413 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f723d96-b1b3-4395-a2d0-577550dad098" containerName="init" Dec 06 05:59:50 crc kubenswrapper[4733]: I1206 05:59:50.899428 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f723d96-b1b3-4395-a2d0-577550dad098" containerName="init" Dec 06 05:59:50 crc kubenswrapper[4733]: E1206 05:59:50.899441 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a26939cd-f665-482c-b577-efab49fe0123" containerName="glance-httpd" Dec 06 05:59:50 crc kubenswrapper[4733]: I1206 05:59:50.899447 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="a26939cd-f665-482c-b577-efab49fe0123" containerName="glance-httpd" Dec 06 05:59:50 crc 
kubenswrapper[4733]: E1206 05:59:50.899459 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fff7d32-f796-45d1-a13b-1b2286e593c3" containerName="init" Dec 06 05:59:50 crc kubenswrapper[4733]: I1206 05:59:50.899483 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fff7d32-f796-45d1-a13b-1b2286e593c3" containerName="init" Dec 06 05:59:50 crc kubenswrapper[4733]: E1206 05:59:50.899495 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fff7d32-f796-45d1-a13b-1b2286e593c3" containerName="dnsmasq-dns" Dec 06 05:59:50 crc kubenswrapper[4733]: I1206 05:59:50.899503 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fff7d32-f796-45d1-a13b-1b2286e593c3" containerName="dnsmasq-dns" Dec 06 05:59:50 crc kubenswrapper[4733]: E1206 05:59:50.899512 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8de7ee8b-f6b5-4014-a78f-bad2e92ddd2d" containerName="extract-utilities" Dec 06 05:59:50 crc kubenswrapper[4733]: I1206 05:59:50.899517 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="8de7ee8b-f6b5-4014-a78f-bad2e92ddd2d" containerName="extract-utilities" Dec 06 05:59:50 crc kubenswrapper[4733]: E1206 05:59:50.899529 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8de7ee8b-f6b5-4014-a78f-bad2e92ddd2d" containerName="extract-content" Dec 06 05:59:50 crc kubenswrapper[4733]: I1206 05:59:50.899534 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="8de7ee8b-f6b5-4014-a78f-bad2e92ddd2d" containerName="extract-content" Dec 06 05:59:50 crc kubenswrapper[4733]: E1206 05:59:50.899581 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a26939cd-f665-482c-b577-efab49fe0123" containerName="glance-log" Dec 06 05:59:50 crc kubenswrapper[4733]: I1206 05:59:50.899588 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="a26939cd-f665-482c-b577-efab49fe0123" containerName="glance-log" Dec 06 05:59:50 crc kubenswrapper[4733]: I1206 05:59:50.899919 4733 
memory_manager.go:354] "RemoveStaleState removing state" podUID="7f723d96-b1b3-4395-a2d0-577550dad098" containerName="init" Dec 06 05:59:50 crc kubenswrapper[4733]: I1206 05:59:50.899939 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="a26939cd-f665-482c-b577-efab49fe0123" containerName="glance-httpd" Dec 06 05:59:50 crc kubenswrapper[4733]: I1206 05:59:50.899949 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fff7d32-f796-45d1-a13b-1b2286e593c3" containerName="dnsmasq-dns" Dec 06 05:59:50 crc kubenswrapper[4733]: I1206 05:59:50.899978 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="8de7ee8b-f6b5-4014-a78f-bad2e92ddd2d" containerName="registry-server" Dec 06 05:59:50 crc kubenswrapper[4733]: I1206 05:59:50.899989 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="a26939cd-f665-482c-b577-efab49fe0123" containerName="glance-log" Dec 06 05:59:50 crc kubenswrapper[4733]: I1206 05:59:50.901292 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 06 05:59:50 crc kubenswrapper[4733]: I1206 05:59:50.902525 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 05:59:50 crc kubenswrapper[4733]: I1206 05:59:50.908697 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 06 05:59:50 crc kubenswrapper[4733]: I1206 05:59:50.908706 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 06 05:59:50 crc kubenswrapper[4733]: I1206 05:59:50.938163 4733 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:51 crc kubenswrapper[4733]: I1206 05:59:51.039285 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dfba234-4e13-4f22-96cf-7f945f11d36e-config-data\") pod \"glance-default-external-api-0\" (UID: \"1dfba234-4e13-4f22-96cf-7f945f11d36e\") " pod="openstack/glance-default-external-api-0" Dec 06 05:59:51 crc kubenswrapper[4733]: I1206 05:59:51.039356 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"1dfba234-4e13-4f22-96cf-7f945f11d36e\") " pod="openstack/glance-default-external-api-0" Dec 06 05:59:51 crc kubenswrapper[4733]: I1206 05:59:51.039387 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1dfba234-4e13-4f22-96cf-7f945f11d36e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"1dfba234-4e13-4f22-96cf-7f945f11d36e\") " 
pod="openstack/glance-default-external-api-0" Dec 06 05:59:51 crc kubenswrapper[4733]: I1206 05:59:51.039410 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1dfba234-4e13-4f22-96cf-7f945f11d36e-logs\") pod \"glance-default-external-api-0\" (UID: \"1dfba234-4e13-4f22-96cf-7f945f11d36e\") " pod="openstack/glance-default-external-api-0" Dec 06 05:59:51 crc kubenswrapper[4733]: I1206 05:59:51.039451 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dfba234-4e13-4f22-96cf-7f945f11d36e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1dfba234-4e13-4f22-96cf-7f945f11d36e\") " pod="openstack/glance-default-external-api-0" Dec 06 05:59:51 crc kubenswrapper[4733]: I1206 05:59:51.039485 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1dfba234-4e13-4f22-96cf-7f945f11d36e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1dfba234-4e13-4f22-96cf-7f945f11d36e\") " pod="openstack/glance-default-external-api-0" Dec 06 05:59:51 crc kubenswrapper[4733]: I1206 05:59:51.039503 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdhrk\" (UniqueName: \"kubernetes.io/projected/1dfba234-4e13-4f22-96cf-7f945f11d36e-kube-api-access-mdhrk\") pod \"glance-default-external-api-0\" (UID: \"1dfba234-4e13-4f22-96cf-7f945f11d36e\") " pod="openstack/glance-default-external-api-0" Dec 06 05:59:51 crc kubenswrapper[4733]: I1206 05:59:51.039521 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1dfba234-4e13-4f22-96cf-7f945f11d36e-scripts\") pod \"glance-default-external-api-0\" (UID: 
\"1dfba234-4e13-4f22-96cf-7f945f11d36e\") " pod="openstack/glance-default-external-api-0" Dec 06 05:59:51 crc kubenswrapper[4733]: I1206 05:59:51.140518 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1dfba234-4e13-4f22-96cf-7f945f11d36e-logs\") pod \"glance-default-external-api-0\" (UID: \"1dfba234-4e13-4f22-96cf-7f945f11d36e\") " pod="openstack/glance-default-external-api-0" Dec 06 05:59:51 crc kubenswrapper[4733]: I1206 05:59:51.140562 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dfba234-4e13-4f22-96cf-7f945f11d36e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1dfba234-4e13-4f22-96cf-7f945f11d36e\") " pod="openstack/glance-default-external-api-0" Dec 06 05:59:51 crc kubenswrapper[4733]: I1206 05:59:51.140609 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1dfba234-4e13-4f22-96cf-7f945f11d36e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1dfba234-4e13-4f22-96cf-7f945f11d36e\") " pod="openstack/glance-default-external-api-0" Dec 06 05:59:51 crc kubenswrapper[4733]: I1206 05:59:51.140626 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdhrk\" (UniqueName: \"kubernetes.io/projected/1dfba234-4e13-4f22-96cf-7f945f11d36e-kube-api-access-mdhrk\") pod \"glance-default-external-api-0\" (UID: \"1dfba234-4e13-4f22-96cf-7f945f11d36e\") " pod="openstack/glance-default-external-api-0" Dec 06 05:59:51 crc kubenswrapper[4733]: I1206 05:59:51.140643 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1dfba234-4e13-4f22-96cf-7f945f11d36e-scripts\") pod \"glance-default-external-api-0\" (UID: \"1dfba234-4e13-4f22-96cf-7f945f11d36e\") " 
pod="openstack/glance-default-external-api-0" Dec 06 05:59:51 crc kubenswrapper[4733]: I1206 05:59:51.140696 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dfba234-4e13-4f22-96cf-7f945f11d36e-config-data\") pod \"glance-default-external-api-0\" (UID: \"1dfba234-4e13-4f22-96cf-7f945f11d36e\") " pod="openstack/glance-default-external-api-0" Dec 06 05:59:51 crc kubenswrapper[4733]: I1206 05:59:51.140738 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"1dfba234-4e13-4f22-96cf-7f945f11d36e\") " pod="openstack/glance-default-external-api-0" Dec 06 05:59:51 crc kubenswrapper[4733]: I1206 05:59:51.140767 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1dfba234-4e13-4f22-96cf-7f945f11d36e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"1dfba234-4e13-4f22-96cf-7f945f11d36e\") " pod="openstack/glance-default-external-api-0" Dec 06 05:59:51 crc kubenswrapper[4733]: I1206 05:59:51.142130 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1dfba234-4e13-4f22-96cf-7f945f11d36e-logs\") pod \"glance-default-external-api-0\" (UID: \"1dfba234-4e13-4f22-96cf-7f945f11d36e\") " pod="openstack/glance-default-external-api-0" Dec 06 05:59:51 crc kubenswrapper[4733]: I1206 05:59:51.148581 4733 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"1dfba234-4e13-4f22-96cf-7f945f11d36e\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0" Dec 06 05:59:51 crc kubenswrapper[4733]: 
I1206 05:59:51.148742 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1dfba234-4e13-4f22-96cf-7f945f11d36e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1dfba234-4e13-4f22-96cf-7f945f11d36e\") " pod="openstack/glance-default-external-api-0" Dec 06 05:59:51 crc kubenswrapper[4733]: I1206 05:59:51.149372 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dfba234-4e13-4f22-96cf-7f945f11d36e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1dfba234-4e13-4f22-96cf-7f945f11d36e\") " pod="openstack/glance-default-external-api-0" Dec 06 05:59:51 crc kubenswrapper[4733]: I1206 05:59:51.149995 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1dfba234-4e13-4f22-96cf-7f945f11d36e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"1dfba234-4e13-4f22-96cf-7f945f11d36e\") " pod="openstack/glance-default-external-api-0" Dec 06 05:59:51 crc kubenswrapper[4733]: I1206 05:59:51.151534 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1dfba234-4e13-4f22-96cf-7f945f11d36e-scripts\") pod \"glance-default-external-api-0\" (UID: \"1dfba234-4e13-4f22-96cf-7f945f11d36e\") " pod="openstack/glance-default-external-api-0" Dec 06 05:59:51 crc kubenswrapper[4733]: I1206 05:59:51.152096 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dfba234-4e13-4f22-96cf-7f945f11d36e-config-data\") pod \"glance-default-external-api-0\" (UID: \"1dfba234-4e13-4f22-96cf-7f945f11d36e\") " pod="openstack/glance-default-external-api-0" Dec 06 05:59:51 crc kubenswrapper[4733]: I1206 05:59:51.159204 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-mdhrk\" (UniqueName: \"kubernetes.io/projected/1dfba234-4e13-4f22-96cf-7f945f11d36e-kube-api-access-mdhrk\") pod \"glance-default-external-api-0\" (UID: \"1dfba234-4e13-4f22-96cf-7f945f11d36e\") " pod="openstack/glance-default-external-api-0" Dec 06 05:59:51 crc kubenswrapper[4733]: I1206 05:59:51.176466 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"1dfba234-4e13-4f22-96cf-7f945f11d36e\") " pod="openstack/glance-default-external-api-0" Dec 06 05:59:51 crc kubenswrapper[4733]: I1206 05:59:51.240786 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 06 05:59:51 crc kubenswrapper[4733]: I1206 05:59:51.826984 4733 generic.go:334] "Generic (PLEG): container finished" podID="3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c" containerID="5507ac46044992ff14b7680c2d1b081f93275189c934a95c96b480bc08b4e154" exitCode=0 Dec 06 05:59:51 crc kubenswrapper[4733]: I1206 05:59:51.827171 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-vfkdm" event={"ID":"3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c","Type":"ContainerDied","Data":"5507ac46044992ff14b7680c2d1b081f93275189c934a95c96b480bc08b4e154"} Dec 06 05:59:51 crc kubenswrapper[4733]: I1206 05:59:51.829827 4733 generic.go:334] "Generic (PLEG): container finished" podID="61d7ceb4-c6a6-407f-b208-693d62a8b76b" containerID="f9c4ef11713b71ef4ea347a3c18e05cea58167801bf0713fca23a9eb05a6dbe0" exitCode=0 Dec 06 05:59:51 crc kubenswrapper[4733]: I1206 05:59:51.829910 4733 generic.go:334] "Generic (PLEG): container finished" podID="61d7ceb4-c6a6-407f-b208-693d62a8b76b" containerID="71323a7b6f7b73beb72c56131f80d6ccf950f993fbdc3b0237f46280d00b8029" exitCode=143 Dec 06 05:59:51 crc kubenswrapper[4733]: I1206 05:59:51.829975 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"61d7ceb4-c6a6-407f-b208-693d62a8b76b","Type":"ContainerDied","Data":"f9c4ef11713b71ef4ea347a3c18e05cea58167801bf0713fca23a9eb05a6dbe0"} Dec 06 05:59:51 crc kubenswrapper[4733]: I1206 05:59:51.830052 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"61d7ceb4-c6a6-407f-b208-693d62a8b76b","Type":"ContainerDied","Data":"71323a7b6f7b73beb72c56131f80d6ccf950f993fbdc3b0237f46280d00b8029"} Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.205445 4733 scope.go:117] "RemoveContainer" containerID="0b3c2cf15eac6795a7f23ce6f2547fd9a09695892cb668a3a7b47a51fe5dc03d" Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.338959 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-8ffc7" Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.343894 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-n8dgk" Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.399285 4733 scope.go:117] "RemoveContainer" containerID="02a5dbc852583f7f46e16efda9fc4232fc3e68d2d9b768cb0fed32e82cb6dc6e" Dec 06 05:59:52 crc kubenswrapper[4733]: E1206 05:59:52.401049 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02a5dbc852583f7f46e16efda9fc4232fc3e68d2d9b768cb0fed32e82cb6dc6e\": container with ID starting with 02a5dbc852583f7f46e16efda9fc4232fc3e68d2d9b768cb0fed32e82cb6dc6e not found: ID does not exist" containerID="02a5dbc852583f7f46e16efda9fc4232fc3e68d2d9b768cb0fed32e82cb6dc6e" Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.401089 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02a5dbc852583f7f46e16efda9fc4232fc3e68d2d9b768cb0fed32e82cb6dc6e"} err="failed to get container status 
\"02a5dbc852583f7f46e16efda9fc4232fc3e68d2d9b768cb0fed32e82cb6dc6e\": rpc error: code = NotFound desc = could not find container \"02a5dbc852583f7f46e16efda9fc4232fc3e68d2d9b768cb0fed32e82cb6dc6e\": container with ID starting with 02a5dbc852583f7f46e16efda9fc4232fc3e68d2d9b768cb0fed32e82cb6dc6e not found: ID does not exist" Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.401110 4733 scope.go:117] "RemoveContainer" containerID="0b3c2cf15eac6795a7f23ce6f2547fd9a09695892cb668a3a7b47a51fe5dc03d" Dec 06 05:59:52 crc kubenswrapper[4733]: E1206 05:59:52.401426 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b3c2cf15eac6795a7f23ce6f2547fd9a09695892cb668a3a7b47a51fe5dc03d\": container with ID starting with 0b3c2cf15eac6795a7f23ce6f2547fd9a09695892cb668a3a7b47a51fe5dc03d not found: ID does not exist" containerID="0b3c2cf15eac6795a7f23ce6f2547fd9a09695892cb668a3a7b47a51fe5dc03d" Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.401450 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b3c2cf15eac6795a7f23ce6f2547fd9a09695892cb668a3a7b47a51fe5dc03d"} err="failed to get container status \"0b3c2cf15eac6795a7f23ce6f2547fd9a09695892cb668a3a7b47a51fe5dc03d\": rpc error: code = NotFound desc = could not find container \"0b3c2cf15eac6795a7f23ce6f2547fd9a09695892cb668a3a7b47a51fe5dc03d\": container with ID starting with 0b3c2cf15eac6795a7f23ce6f2547fd9a09695892cb668a3a7b47a51fe5dc03d not found: ID does not exist" Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.401466 4733 scope.go:117] "RemoveContainer" containerID="02a5dbc852583f7f46e16efda9fc4232fc3e68d2d9b768cb0fed32e82cb6dc6e" Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.401719 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02a5dbc852583f7f46e16efda9fc4232fc3e68d2d9b768cb0fed32e82cb6dc6e"} err="failed to get 
container status \"02a5dbc852583f7f46e16efda9fc4232fc3e68d2d9b768cb0fed32e82cb6dc6e\": rpc error: code = NotFound desc = could not find container \"02a5dbc852583f7f46e16efda9fc4232fc3e68d2d9b768cb0fed32e82cb6dc6e\": container with ID starting with 02a5dbc852583f7f46e16efda9fc4232fc3e68d2d9b768cb0fed32e82cb6dc6e not found: ID does not exist" Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.401734 4733 scope.go:117] "RemoveContainer" containerID="0b3c2cf15eac6795a7f23ce6f2547fd9a09695892cb668a3a7b47a51fe5dc03d" Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.401970 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b3c2cf15eac6795a7f23ce6f2547fd9a09695892cb668a3a7b47a51fe5dc03d"} err="failed to get container status \"0b3c2cf15eac6795a7f23ce6f2547fd9a09695892cb668a3a7b47a51fe5dc03d\": rpc error: code = NotFound desc = could not find container \"0b3c2cf15eac6795a7f23ce6f2547fd9a09695892cb668a3a7b47a51fe5dc03d\": container with ID starting with 0b3c2cf15eac6795a7f23ce6f2547fd9a09695892cb668a3a7b47a51fe5dc03d not found: ID does not exist" Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.475214 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6xzl\" (UniqueName: \"kubernetes.io/projected/363efbb6-18f2-440b-bffd-f64dee6a3af7-kube-api-access-l6xzl\") pod \"363efbb6-18f2-440b-bffd-f64dee6a3af7\" (UID: \"363efbb6-18f2-440b-bffd-f64dee6a3af7\") " Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.475272 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/363efbb6-18f2-440b-bffd-f64dee6a3af7-combined-ca-bundle\") pod \"363efbb6-18f2-440b-bffd-f64dee6a3af7\" (UID: \"363efbb6-18f2-440b-bffd-f64dee6a3af7\") " Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.475500 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/363efbb6-18f2-440b-bffd-f64dee6a3af7-logs\") pod \"363efbb6-18f2-440b-bffd-f64dee6a3af7\" (UID: \"363efbb6-18f2-440b-bffd-f64dee6a3af7\") " Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.475536 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/363efbb6-18f2-440b-bffd-f64dee6a3af7-config-data\") pod \"363efbb6-18f2-440b-bffd-f64dee6a3af7\" (UID: \"363efbb6-18f2-440b-bffd-f64dee6a3af7\") " Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.475586 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/707d8771-4a40-42e7-b4bd-c2a9090126f0-db-sync-config-data\") pod \"707d8771-4a40-42e7-b4bd-c2a9090126f0\" (UID: \"707d8771-4a40-42e7-b4bd-c2a9090126f0\") " Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.475655 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/363efbb6-18f2-440b-bffd-f64dee6a3af7-scripts\") pod \"363efbb6-18f2-440b-bffd-f64dee6a3af7\" (UID: \"363efbb6-18f2-440b-bffd-f64dee6a3af7\") " Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.475681 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5j44l\" (UniqueName: \"kubernetes.io/projected/707d8771-4a40-42e7-b4bd-c2a9090126f0-kube-api-access-5j44l\") pod \"707d8771-4a40-42e7-b4bd-c2a9090126f0\" (UID: \"707d8771-4a40-42e7-b4bd-c2a9090126f0\") " Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.475702 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/707d8771-4a40-42e7-b4bd-c2a9090126f0-combined-ca-bundle\") pod \"707d8771-4a40-42e7-b4bd-c2a9090126f0\" (UID: \"707d8771-4a40-42e7-b4bd-c2a9090126f0\") " Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 
05:59:52.477380 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/363efbb6-18f2-440b-bffd-f64dee6a3af7-logs" (OuterVolumeSpecName: "logs") pod "363efbb6-18f2-440b-bffd-f64dee6a3af7" (UID: "363efbb6-18f2-440b-bffd-f64dee6a3af7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.481068 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/363efbb6-18f2-440b-bffd-f64dee6a3af7-kube-api-access-l6xzl" (OuterVolumeSpecName: "kube-api-access-l6xzl") pod "363efbb6-18f2-440b-bffd-f64dee6a3af7" (UID: "363efbb6-18f2-440b-bffd-f64dee6a3af7"). InnerVolumeSpecName "kube-api-access-l6xzl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.483128 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/363efbb6-18f2-440b-bffd-f64dee6a3af7-scripts" (OuterVolumeSpecName: "scripts") pod "363efbb6-18f2-440b-bffd-f64dee6a3af7" (UID: "363efbb6-18f2-440b-bffd-f64dee6a3af7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.484141 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/707d8771-4a40-42e7-b4bd-c2a9090126f0-kube-api-access-5j44l" (OuterVolumeSpecName: "kube-api-access-5j44l") pod "707d8771-4a40-42e7-b4bd-c2a9090126f0" (UID: "707d8771-4a40-42e7-b4bd-c2a9090126f0"). InnerVolumeSpecName "kube-api-access-5j44l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.490493 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/707d8771-4a40-42e7-b4bd-c2a9090126f0-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "707d8771-4a40-42e7-b4bd-c2a9090126f0" (UID: "707d8771-4a40-42e7-b4bd-c2a9090126f0"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.496791 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a26939cd-f665-482c-b577-efab49fe0123" path="/var/lib/kubelet/pods/a26939cd-f665-482c-b577-efab49fe0123/volumes" Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.501319 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/363efbb6-18f2-440b-bffd-f64dee6a3af7-config-data" (OuterVolumeSpecName: "config-data") pod "363efbb6-18f2-440b-bffd-f64dee6a3af7" (UID: "363efbb6-18f2-440b-bffd-f64dee6a3af7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.501430 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/707d8771-4a40-42e7-b4bd-c2a9090126f0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "707d8771-4a40-42e7-b4bd-c2a9090126f0" (UID: "707d8771-4a40-42e7-b4bd-c2a9090126f0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.504831 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/363efbb6-18f2-440b-bffd-f64dee6a3af7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "363efbb6-18f2-440b-bffd-f64dee6a3af7" (UID: "363efbb6-18f2-440b-bffd-f64dee6a3af7"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.577880 4733 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/363efbb6-18f2-440b-bffd-f64dee6a3af7-logs\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.577912 4733 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/363efbb6-18f2-440b-bffd-f64dee6a3af7-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.577926 4733 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/707d8771-4a40-42e7-b4bd-c2a9090126f0-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.577937 4733 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/363efbb6-18f2-440b-bffd-f64dee6a3af7-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.577946 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5j44l\" (UniqueName: \"kubernetes.io/projected/707d8771-4a40-42e7-b4bd-c2a9090126f0-kube-api-access-5j44l\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.577956 4733 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/707d8771-4a40-42e7-b4bd-c2a9090126f0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.577965 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6xzl\" (UniqueName: \"kubernetes.io/projected/363efbb6-18f2-440b-bffd-f64dee6a3af7-kube-api-access-l6xzl\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:52 crc 
kubenswrapper[4733]: I1206 05:59:52.577973 4733 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/363efbb6-18f2-440b-bffd-f64dee6a3af7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.666985 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.780043 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61d7ceb4-c6a6-407f-b208-693d62a8b76b-combined-ca-bundle\") pod \"61d7ceb4-c6a6-407f-b208-693d62a8b76b\" (UID: \"61d7ceb4-c6a6-407f-b208-693d62a8b76b\") " Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.780112 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhx9k\" (UniqueName: \"kubernetes.io/projected/61d7ceb4-c6a6-407f-b208-693d62a8b76b-kube-api-access-dhx9k\") pod \"61d7ceb4-c6a6-407f-b208-693d62a8b76b\" (UID: \"61d7ceb4-c6a6-407f-b208-693d62a8b76b\") " Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.780143 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61d7ceb4-c6a6-407f-b208-693d62a8b76b-scripts\") pod \"61d7ceb4-c6a6-407f-b208-693d62a8b76b\" (UID: \"61d7ceb4-c6a6-407f-b208-693d62a8b76b\") " Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.780225 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61d7ceb4-c6a6-407f-b208-693d62a8b76b-logs\") pod \"61d7ceb4-c6a6-407f-b208-693d62a8b76b\" (UID: \"61d7ceb4-c6a6-407f-b208-693d62a8b76b\") " Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.780244 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"61d7ceb4-c6a6-407f-b208-693d62a8b76b\" (UID: \"61d7ceb4-c6a6-407f-b208-693d62a8b76b\") " Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.780271 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61d7ceb4-c6a6-407f-b208-693d62a8b76b-config-data\") pod \"61d7ceb4-c6a6-407f-b208-693d62a8b76b\" (UID: \"61d7ceb4-c6a6-407f-b208-693d62a8b76b\") " Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.780323 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/61d7ceb4-c6a6-407f-b208-693d62a8b76b-httpd-run\") pod \"61d7ceb4-c6a6-407f-b208-693d62a8b76b\" (UID: \"61d7ceb4-c6a6-407f-b208-693d62a8b76b\") " Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.781270 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61d7ceb4-c6a6-407f-b208-693d62a8b76b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "61d7ceb4-c6a6-407f-b208-693d62a8b76b" (UID: "61d7ceb4-c6a6-407f-b208-693d62a8b76b"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.781779 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61d7ceb4-c6a6-407f-b208-693d62a8b76b-logs" (OuterVolumeSpecName: "logs") pod "61d7ceb4-c6a6-407f-b208-693d62a8b76b" (UID: "61d7ceb4-c6a6-407f-b208-693d62a8b76b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.785338 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61d7ceb4-c6a6-407f-b208-693d62a8b76b-scripts" (OuterVolumeSpecName: "scripts") pod "61d7ceb4-c6a6-407f-b208-693d62a8b76b" (UID: "61d7ceb4-c6a6-407f-b208-693d62a8b76b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.788387 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61d7ceb4-c6a6-407f-b208-693d62a8b76b-kube-api-access-dhx9k" (OuterVolumeSpecName: "kube-api-access-dhx9k") pod "61d7ceb4-c6a6-407f-b208-693d62a8b76b" (UID: "61d7ceb4-c6a6-407f-b208-693d62a8b76b"). InnerVolumeSpecName "kube-api-access-dhx9k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.803863 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61d7ceb4-c6a6-407f-b208-693d62a8b76b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "61d7ceb4-c6a6-407f-b208-693d62a8b76b" (UID: "61d7ceb4-c6a6-407f-b208-693d62a8b76b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.809756 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "61d7ceb4-c6a6-407f-b208-693d62a8b76b" (UID: "61d7ceb4-c6a6-407f-b208-693d62a8b76b"). InnerVolumeSpecName "local-storage03-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.833739 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61d7ceb4-c6a6-407f-b208-693d62a8b76b-config-data" (OuterVolumeSpecName: "config-data") pod "61d7ceb4-c6a6-407f-b208-693d62a8b76b" (UID: "61d7ceb4-c6a6-407f-b208-693d62a8b76b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.842197 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-n8dgk" event={"ID":"363efbb6-18f2-440b-bffd-f64dee6a3af7","Type":"ContainerDied","Data":"f4ae4ee516bd9415e499cb46b8de6d5fcc536a21baf3a3cafb412e4a3c595ebb"} Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.842269 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4ae4ee516bd9415e499cb46b8de6d5fcc536a21baf3a3cafb412e4a3c595ebb" Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.842398 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-n8dgk" Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.849880 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-8ffc7" event={"ID":"707d8771-4a40-42e7-b4bd-c2a9090126f0","Type":"ContainerDied","Data":"85f1174ecebd0c56c24697b3227d6fe1f19f8d7be36c2bfff2ec63d85bb77da7"} Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.849922 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85f1174ecebd0c56c24697b3227d6fe1f19f8d7be36c2bfff2ec63d85bb77da7" Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.849995 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-8ffc7" Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.868769 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.868985 4733 generic.go:334] "Generic (PLEG): container finished" podID="6a886728-ea9b-485c-844d-964614315b0d" containerID="f3c5a5a99dadaafae87ae1fdd011992e264df1cfa144803ecf8b4458d2785aad" exitCode=0 Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.869047 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4twth" event={"ID":"6a886728-ea9b-485c-844d-964614315b0d","Type":"ContainerDied","Data":"f3c5a5a99dadaafae87ae1fdd011992e264df1cfa144803ecf8b4458d2785aad"} Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.873686 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"61d7ceb4-c6a6-407f-b208-693d62a8b76b","Type":"ContainerDied","Data":"ede8779286e464c2720099671184095528dd519d6dc428300e6e0ed284ac5d96"} Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.873748 4733 scope.go:117] "RemoveContainer" containerID="f9c4ef11713b71ef4ea347a3c18e05cea58167801bf0713fca23a9eb05a6dbe0" Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.873768 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.882261 4733 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61d7ceb4-c6a6-407f-b208-693d62a8b76b-logs\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.882321 4733 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.882336 4733 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61d7ceb4-c6a6-407f-b208-693d62a8b76b-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.882348 4733 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/61d7ceb4-c6a6-407f-b208-693d62a8b76b-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.882358 4733 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61d7ceb4-c6a6-407f-b208-693d62a8b76b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.882368 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhx9k\" (UniqueName: \"kubernetes.io/projected/61d7ceb4-c6a6-407f-b208-693d62a8b76b-kube-api-access-dhx9k\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.882378 4733 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61d7ceb4-c6a6-407f-b208-693d62a8b76b-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.913747 4733 operation_generator.go:917] 
UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.916059 4733 scope.go:117] "RemoveContainer" containerID="71323a7b6f7b73beb72c56131f80d6ccf950f993fbdc3b0237f46280d00b8029" Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.930001 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.950102 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.959410 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 05:59:52 crc kubenswrapper[4733]: E1206 05:59:52.959912 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61d7ceb4-c6a6-407f-b208-693d62a8b76b" containerName="glance-httpd" Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.959927 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="61d7ceb4-c6a6-407f-b208-693d62a8b76b" containerName="glance-httpd" Dec 06 05:59:52 crc kubenswrapper[4733]: E1206 05:59:52.959970 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61d7ceb4-c6a6-407f-b208-693d62a8b76b" containerName="glance-log" Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.959976 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="61d7ceb4-c6a6-407f-b208-693d62a8b76b" containerName="glance-log" Dec 06 05:59:52 crc kubenswrapper[4733]: E1206 05:59:52.959986 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="363efbb6-18f2-440b-bffd-f64dee6a3af7" containerName="placement-db-sync" Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.959993 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="363efbb6-18f2-440b-bffd-f64dee6a3af7" containerName="placement-db-sync" Dec 06 05:59:52 
crc kubenswrapper[4733]: E1206 05:59:52.960009 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="707d8771-4a40-42e7-b4bd-c2a9090126f0" containerName="barbican-db-sync" Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.960015 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="707d8771-4a40-42e7-b4bd-c2a9090126f0" containerName="barbican-db-sync" Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.960193 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="707d8771-4a40-42e7-b4bd-c2a9090126f0" containerName="barbican-db-sync" Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.960209 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="61d7ceb4-c6a6-407f-b208-693d62a8b76b" containerName="glance-log" Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.960224 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="363efbb6-18f2-440b-bffd-f64dee6a3af7" containerName="placement-db-sync" Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.960240 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="61d7ceb4-c6a6-407f-b208-693d62a8b76b" containerName="glance-httpd" Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.961182 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.963254 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.963262 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.972918 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 05:59:52 crc kubenswrapper[4733]: I1206 05:59:52.985588 4733 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.088191 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0063321-2625-4e9d-a536-38104f7d5879-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a0063321-2625-4e9d-a536-38104f7d5879\") " pod="openstack/glance-default-internal-api-0" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.088362 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0063321-2625-4e9d-a536-38104f7d5879-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a0063321-2625-4e9d-a536-38104f7d5879\") " pod="openstack/glance-default-internal-api-0" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.088401 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0063321-2625-4e9d-a536-38104f7d5879-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: 
\"a0063321-2625-4e9d-a536-38104f7d5879\") " pod="openstack/glance-default-internal-api-0" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.088508 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"a0063321-2625-4e9d-a536-38104f7d5879\") " pod="openstack/glance-default-internal-api-0" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.088679 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a0063321-2625-4e9d-a536-38104f7d5879-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a0063321-2625-4e9d-a536-38104f7d5879\") " pod="openstack/glance-default-internal-api-0" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.088852 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtvdt\" (UniqueName: \"kubernetes.io/projected/a0063321-2625-4e9d-a536-38104f7d5879-kube-api-access-vtvdt\") pod \"glance-default-internal-api-0\" (UID: \"a0063321-2625-4e9d-a536-38104f7d5879\") " pod="openstack/glance-default-internal-api-0" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.088889 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0063321-2625-4e9d-a536-38104f7d5879-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a0063321-2625-4e9d-a536-38104f7d5879\") " pod="openstack/glance-default-internal-api-0" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.088952 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0063321-2625-4e9d-a536-38104f7d5879-logs\") pod \"glance-default-internal-api-0\" 
(UID: \"a0063321-2625-4e9d-a536-38104f7d5879\") " pod="openstack/glance-default-internal-api-0" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.190502 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0063321-2625-4e9d-a536-38104f7d5879-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a0063321-2625-4e9d-a536-38104f7d5879\") " pod="openstack/glance-default-internal-api-0" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.190538 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0063321-2625-4e9d-a536-38104f7d5879-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a0063321-2625-4e9d-a536-38104f7d5879\") " pod="openstack/glance-default-internal-api-0" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.190613 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"a0063321-2625-4e9d-a536-38104f7d5879\") " pod="openstack/glance-default-internal-api-0" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.190645 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a0063321-2625-4e9d-a536-38104f7d5879-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a0063321-2625-4e9d-a536-38104f7d5879\") " pod="openstack/glance-default-internal-api-0" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.191043 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtvdt\" (UniqueName: \"kubernetes.io/projected/a0063321-2625-4e9d-a536-38104f7d5879-kube-api-access-vtvdt\") pod \"glance-default-internal-api-0\" (UID: \"a0063321-2625-4e9d-a536-38104f7d5879\") " 
pod="openstack/glance-default-internal-api-0" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.191098 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0063321-2625-4e9d-a536-38104f7d5879-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a0063321-2625-4e9d-a536-38104f7d5879\") " pod="openstack/glance-default-internal-api-0" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.191146 4733 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"a0063321-2625-4e9d-a536-38104f7d5879\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-internal-api-0" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.192554 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0063321-2625-4e9d-a536-38104f7d5879-logs\") pod \"glance-default-internal-api-0\" (UID: \"a0063321-2625-4e9d-a536-38104f7d5879\") " pod="openstack/glance-default-internal-api-0" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.192769 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0063321-2625-4e9d-a536-38104f7d5879-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a0063321-2625-4e9d-a536-38104f7d5879\") " pod="openstack/glance-default-internal-api-0" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.193668 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a0063321-2625-4e9d-a536-38104f7d5879-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a0063321-2625-4e9d-a536-38104f7d5879\") " pod="openstack/glance-default-internal-api-0" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 
05:59:53.193782 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0063321-2625-4e9d-a536-38104f7d5879-logs\") pod \"glance-default-internal-api-0\" (UID: \"a0063321-2625-4e9d-a536-38104f7d5879\") " pod="openstack/glance-default-internal-api-0" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.204228 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0063321-2625-4e9d-a536-38104f7d5879-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a0063321-2625-4e9d-a536-38104f7d5879\") " pod="openstack/glance-default-internal-api-0" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.204245 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0063321-2625-4e9d-a536-38104f7d5879-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a0063321-2625-4e9d-a536-38104f7d5879\") " pod="openstack/glance-default-internal-api-0" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.204780 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0063321-2625-4e9d-a536-38104f7d5879-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a0063321-2625-4e9d-a536-38104f7d5879\") " pod="openstack/glance-default-internal-api-0" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.205721 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0063321-2625-4e9d-a536-38104f7d5879-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a0063321-2625-4e9d-a536-38104f7d5879\") " pod="openstack/glance-default-internal-api-0" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.207237 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtvdt\" 
(UniqueName: \"kubernetes.io/projected/a0063321-2625-4e9d-a536-38104f7d5879-kube-api-access-vtvdt\") pod \"glance-default-internal-api-0\" (UID: \"a0063321-2625-4e9d-a536-38104f7d5879\") " pod="openstack/glance-default-internal-api-0" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.217297 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-vfkdm" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.221425 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"a0063321-2625-4e9d-a536-38104f7d5879\") " pod="openstack/glance-default-internal-api-0" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.289156 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.296268 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c-config-data\") pod \"3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c\" (UID: \"3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c\") " Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.296327 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c-scripts\") pod \"3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c\" (UID: \"3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c\") " Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.296511 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c-db-sync-config-data\") pod \"3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c\" (UID: 
\"3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c\") " Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.296555 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c-etc-machine-id\") pod \"3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c\" (UID: \"3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c\") " Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.296603 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9fs4\" (UniqueName: \"kubernetes.io/projected/3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c-kube-api-access-c9fs4\") pod \"3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c\" (UID: \"3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c\") " Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.296726 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c-combined-ca-bundle\") pod \"3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c\" (UID: \"3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c\") " Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.297683 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c" (UID: "3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.303032 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c" (UID: "3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.303062 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c-kube-api-access-c9fs4" (OuterVolumeSpecName: "kube-api-access-c9fs4") pod "3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c" (UID: "3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c"). InnerVolumeSpecName "kube-api-access-c9fs4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.304550 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c-scripts" (OuterVolumeSpecName: "scripts") pod "3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c" (UID: "3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.319472 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c" (UID: "3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.335974 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c-config-data" (OuterVolumeSpecName: "config-data") pod "3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c" (UID: "3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.398962 4733 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.398999 4733 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.399010 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9fs4\" (UniqueName: \"kubernetes.io/projected/3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c-kube-api-access-c9fs4\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.399025 4733 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.399036 4733 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.399046 4733 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.500576 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-5c84494675-5wvrl"] Dec 06 05:59:53 crc kubenswrapper[4733]: E1206 05:59:53.512624 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c" 
containerName="cinder-db-sync" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.512651 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c" containerName="cinder-db-sync" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.512901 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c" containerName="cinder-db-sync" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.513972 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5c84494675-5wvrl" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.517255 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.517921 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-l4sr9" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.518609 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.525756 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5c84494675-5wvrl"] Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.574448 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-8696d9b56-5s4w8"] Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.576046 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-8696d9b56-5s4w8" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.589940 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-hbvtr" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.596370 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.596845 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Dec 06 05:59:53 crc kubenswrapper[4733]: W1206 05:59:53.597009 4733 reflector.go:561] object-"openstack"/"cert-placement-public-svc": failed to list *v1.Secret: secrets "cert-placement-public-svc" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Dec 06 05:59:53 crc kubenswrapper[4733]: E1206 05:59:53.597043 4733 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"cert-placement-public-svc\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"cert-placement-public-svc\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.597597 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.609235 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a1b0724-0e18-475b-9f9f-c96bf13e371a-config-data\") pod \"barbican-worker-5c84494675-5wvrl\" (UID: \"0a1b0724-0e18-475b-9f9f-c96bf13e371a\") " pod="openstack/barbican-worker-5c84494675-5wvrl" Dec 06 05:59:53 crc 
kubenswrapper[4733]: I1206 05:59:53.609430 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m59cl\" (UniqueName: \"kubernetes.io/projected/0a1b0724-0e18-475b-9f9f-c96bf13e371a-kube-api-access-m59cl\") pod \"barbican-worker-5c84494675-5wvrl\" (UID: \"0a1b0724-0e18-475b-9f9f-c96bf13e371a\") " pod="openstack/barbican-worker-5c84494675-5wvrl" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.609467 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0a1b0724-0e18-475b-9f9f-c96bf13e371a-config-data-custom\") pod \"barbican-worker-5c84494675-5wvrl\" (UID: \"0a1b0724-0e18-475b-9f9f-c96bf13e371a\") " pod="openstack/barbican-worker-5c84494675-5wvrl" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.609492 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a1b0724-0e18-475b-9f9f-c96bf13e371a-logs\") pod \"barbican-worker-5c84494675-5wvrl\" (UID: \"0a1b0724-0e18-475b-9f9f-c96bf13e371a\") " pod="openstack/barbican-worker-5c84494675-5wvrl" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.609536 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a1b0724-0e18-475b-9f9f-c96bf13e371a-combined-ca-bundle\") pod \"barbican-worker-5c84494675-5wvrl\" (UID: \"0a1b0724-0e18-475b-9f9f-c96bf13e371a\") " pod="openstack/barbican-worker-5c84494675-5wvrl" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.609953 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-6cb949947d-nv4s5"] Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.611735 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-6cb949947d-nv4s5" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.617606 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.631327 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c769b985f-kk46z"] Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.631611 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c769b985f-kk46z" podUID="af780532-6391-4d7f-93d4-fb966d5e0434" containerName="dnsmasq-dns" containerID="cri-o://58f4d631151449a9097eac46cc919892a61cf8d2df5c179d8d7d15516896c2cb" gracePeriod=10 Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.636924 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c769b985f-kk46z" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.711849 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e839961-eb72-4d81-baf8-b49f103a8ca0-internal-tls-certs\") pod \"placement-8696d9b56-5s4w8\" (UID: \"2e839961-eb72-4d81-baf8-b49f103a8ca0\") " pod="openstack/placement-8696d9b56-5s4w8" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.711908 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e839961-eb72-4d81-baf8-b49f103a8ca0-combined-ca-bundle\") pod \"placement-8696d9b56-5s4w8\" (UID: \"2e839961-eb72-4d81-baf8-b49f103a8ca0\") " pod="openstack/placement-8696d9b56-5s4w8" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.711977 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c874c7fc-ab63-41e8-8e5d-921aa5f09e9e-combined-ca-bundle\") pod \"barbican-keystone-listener-6cb949947d-nv4s5\" (UID: \"c874c7fc-ab63-41e8-8e5d-921aa5f09e9e\") " pod="openstack/barbican-keystone-listener-6cb949947d-nv4s5" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.712037 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e839961-eb72-4d81-baf8-b49f103a8ca0-logs\") pod \"placement-8696d9b56-5s4w8\" (UID: \"2e839961-eb72-4d81-baf8-b49f103a8ca0\") " pod="openstack/placement-8696d9b56-5s4w8" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.712065 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m59cl\" (UniqueName: \"kubernetes.io/projected/0a1b0724-0e18-475b-9f9f-c96bf13e371a-kube-api-access-m59cl\") pod \"barbican-worker-5c84494675-5wvrl\" (UID: \"0a1b0724-0e18-475b-9f9f-c96bf13e371a\") " pod="openstack/barbican-worker-5c84494675-5wvrl" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.712091 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0a1b0724-0e18-475b-9f9f-c96bf13e371a-config-data-custom\") pod \"barbican-worker-5c84494675-5wvrl\" (UID: \"0a1b0724-0e18-475b-9f9f-c96bf13e371a\") " pod="openstack/barbican-worker-5c84494675-5wvrl" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.712113 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c874c7fc-ab63-41e8-8e5d-921aa5f09e9e-config-data-custom\") pod \"barbican-keystone-listener-6cb949947d-nv4s5\" (UID: \"c874c7fc-ab63-41e8-8e5d-921aa5f09e9e\") " pod="openstack/barbican-keystone-listener-6cb949947d-nv4s5" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.712136 4733 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a1b0724-0e18-475b-9f9f-c96bf13e371a-logs\") pod \"barbican-worker-5c84494675-5wvrl\" (UID: \"0a1b0724-0e18-475b-9f9f-c96bf13e371a\") " pod="openstack/barbican-worker-5c84494675-5wvrl" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.712178 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a1b0724-0e18-475b-9f9f-c96bf13e371a-combined-ca-bundle\") pod \"barbican-worker-5c84494675-5wvrl\" (UID: \"0a1b0724-0e18-475b-9f9f-c96bf13e371a\") " pod="openstack/barbican-worker-5c84494675-5wvrl" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.712198 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mglbq\" (UniqueName: \"kubernetes.io/projected/c874c7fc-ab63-41e8-8e5d-921aa5f09e9e-kube-api-access-mglbq\") pod \"barbican-keystone-listener-6cb949947d-nv4s5\" (UID: \"c874c7fc-ab63-41e8-8e5d-921aa5f09e9e\") " pod="openstack/barbican-keystone-listener-6cb949947d-nv4s5" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.712241 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5fkd\" (UniqueName: \"kubernetes.io/projected/2e839961-eb72-4d81-baf8-b49f103a8ca0-kube-api-access-v5fkd\") pod \"placement-8696d9b56-5s4w8\" (UID: \"2e839961-eb72-4d81-baf8-b49f103a8ca0\") " pod="openstack/placement-8696d9b56-5s4w8" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.712266 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c874c7fc-ab63-41e8-8e5d-921aa5f09e9e-logs\") pod \"barbican-keystone-listener-6cb949947d-nv4s5\" (UID: \"c874c7fc-ab63-41e8-8e5d-921aa5f09e9e\") " pod="openstack/barbican-keystone-listener-6cb949947d-nv4s5" Dec 06 05:59:53 crc 
kubenswrapper[4733]: I1206 05:59:53.712337 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e839961-eb72-4d81-baf8-b49f103a8ca0-scripts\") pod \"placement-8696d9b56-5s4w8\" (UID: \"2e839961-eb72-4d81-baf8-b49f103a8ca0\") " pod="openstack/placement-8696d9b56-5s4w8" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.712361 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c874c7fc-ab63-41e8-8e5d-921aa5f09e9e-config-data\") pod \"barbican-keystone-listener-6cb949947d-nv4s5\" (UID: \"c874c7fc-ab63-41e8-8e5d-921aa5f09e9e\") " pod="openstack/barbican-keystone-listener-6cb949947d-nv4s5" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.712392 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e839961-eb72-4d81-baf8-b49f103a8ca0-config-data\") pod \"placement-8696d9b56-5s4w8\" (UID: \"2e839961-eb72-4d81-baf8-b49f103a8ca0\") " pod="openstack/placement-8696d9b56-5s4w8" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.712421 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e839961-eb72-4d81-baf8-b49f103a8ca0-public-tls-certs\") pod \"placement-8696d9b56-5s4w8\" (UID: \"2e839961-eb72-4d81-baf8-b49f103a8ca0\") " pod="openstack/placement-8696d9b56-5s4w8" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.712440 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a1b0724-0e18-475b-9f9f-c96bf13e371a-config-data\") pod \"barbican-worker-5c84494675-5wvrl\" (UID: \"0a1b0724-0e18-475b-9f9f-c96bf13e371a\") " pod="openstack/barbican-worker-5c84494675-5wvrl" Dec 06 05:59:53 crc 
kubenswrapper[4733]: I1206 05:59:53.716696 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a1b0724-0e18-475b-9f9f-c96bf13e371a-logs\") pod \"barbican-worker-5c84494675-5wvrl\" (UID: \"0a1b0724-0e18-475b-9f9f-c96bf13e371a\") " pod="openstack/barbican-worker-5c84494675-5wvrl" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.734889 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0a1b0724-0e18-475b-9f9f-c96bf13e371a-config-data-custom\") pod \"barbican-worker-5c84494675-5wvrl\" (UID: \"0a1b0724-0e18-475b-9f9f-c96bf13e371a\") " pod="openstack/barbican-worker-5c84494675-5wvrl" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.740273 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a1b0724-0e18-475b-9f9f-c96bf13e371a-config-data\") pod \"barbican-worker-5c84494675-5wvrl\" (UID: \"0a1b0724-0e18-475b-9f9f-c96bf13e371a\") " pod="openstack/barbican-worker-5c84494675-5wvrl" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.740825 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a1b0724-0e18-475b-9f9f-c96bf13e371a-combined-ca-bundle\") pod \"barbican-worker-5c84494675-5wvrl\" (UID: \"0a1b0724-0e18-475b-9f9f-c96bf13e371a\") " pod="openstack/barbican-worker-5c84494675-5wvrl" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.829656 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e839961-eb72-4d81-baf8-b49f103a8ca0-logs\") pod \"placement-8696d9b56-5s4w8\" (UID: \"2e839961-eb72-4d81-baf8-b49f103a8ca0\") " pod="openstack/placement-8696d9b56-5s4w8" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.829768 4733 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c874c7fc-ab63-41e8-8e5d-921aa5f09e9e-config-data-custom\") pod \"barbican-keystone-listener-6cb949947d-nv4s5\" (UID: \"c874c7fc-ab63-41e8-8e5d-921aa5f09e9e\") " pod="openstack/barbican-keystone-listener-6cb949947d-nv4s5" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.829858 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mglbq\" (UniqueName: \"kubernetes.io/projected/c874c7fc-ab63-41e8-8e5d-921aa5f09e9e-kube-api-access-mglbq\") pod \"barbican-keystone-listener-6cb949947d-nv4s5\" (UID: \"c874c7fc-ab63-41e8-8e5d-921aa5f09e9e\") " pod="openstack/barbican-keystone-listener-6cb949947d-nv4s5" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.829932 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5fkd\" (UniqueName: \"kubernetes.io/projected/2e839961-eb72-4d81-baf8-b49f103a8ca0-kube-api-access-v5fkd\") pod \"placement-8696d9b56-5s4w8\" (UID: \"2e839961-eb72-4d81-baf8-b49f103a8ca0\") " pod="openstack/placement-8696d9b56-5s4w8" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.829976 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c874c7fc-ab63-41e8-8e5d-921aa5f09e9e-logs\") pod \"barbican-keystone-listener-6cb949947d-nv4s5\" (UID: \"c874c7fc-ab63-41e8-8e5d-921aa5f09e9e\") " pod="openstack/barbican-keystone-listener-6cb949947d-nv4s5" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.830061 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e839961-eb72-4d81-baf8-b49f103a8ca0-scripts\") pod \"placement-8696d9b56-5s4w8\" (UID: \"2e839961-eb72-4d81-baf8-b49f103a8ca0\") " pod="openstack/placement-8696d9b56-5s4w8" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.830110 4733 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c874c7fc-ab63-41e8-8e5d-921aa5f09e9e-config-data\") pod \"barbican-keystone-listener-6cb949947d-nv4s5\" (UID: \"c874c7fc-ab63-41e8-8e5d-921aa5f09e9e\") " pod="openstack/barbican-keystone-listener-6cb949947d-nv4s5" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.830155 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e839961-eb72-4d81-baf8-b49f103a8ca0-config-data\") pod \"placement-8696d9b56-5s4w8\" (UID: \"2e839961-eb72-4d81-baf8-b49f103a8ca0\") " pod="openstack/placement-8696d9b56-5s4w8" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.830183 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e839961-eb72-4d81-baf8-b49f103a8ca0-public-tls-certs\") pod \"placement-8696d9b56-5s4w8\" (UID: \"2e839961-eb72-4d81-baf8-b49f103a8ca0\") " pod="openstack/placement-8696d9b56-5s4w8" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.830244 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e839961-eb72-4d81-baf8-b49f103a8ca0-internal-tls-certs\") pod \"placement-8696d9b56-5s4w8\" (UID: \"2e839961-eb72-4d81-baf8-b49f103a8ca0\") " pod="openstack/placement-8696d9b56-5s4w8" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.830284 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e839961-eb72-4d81-baf8-b49f103a8ca0-combined-ca-bundle\") pod \"placement-8696d9b56-5s4w8\" (UID: \"2e839961-eb72-4d81-baf8-b49f103a8ca0\") " pod="openstack/placement-8696d9b56-5s4w8" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.830470 4733 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c874c7fc-ab63-41e8-8e5d-921aa5f09e9e-combined-ca-bundle\") pod \"barbican-keystone-listener-6cb949947d-nv4s5\" (UID: \"c874c7fc-ab63-41e8-8e5d-921aa5f09e9e\") " pod="openstack/barbican-keystone-listener-6cb949947d-nv4s5" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.840704 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e839961-eb72-4d81-baf8-b49f103a8ca0-logs\") pod \"placement-8696d9b56-5s4w8\" (UID: \"2e839961-eb72-4d81-baf8-b49f103a8ca0\") " pod="openstack/placement-8696d9b56-5s4w8" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.844952 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c874c7fc-ab63-41e8-8e5d-921aa5f09e9e-combined-ca-bundle\") pod \"barbican-keystone-listener-6cb949947d-nv4s5\" (UID: \"c874c7fc-ab63-41e8-8e5d-921aa5f09e9e\") " pod="openstack/barbican-keystone-listener-6cb949947d-nv4s5" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.845575 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c874c7fc-ab63-41e8-8e5d-921aa5f09e9e-logs\") pod \"barbican-keystone-listener-6cb949947d-nv4s5\" (UID: \"c874c7fc-ab63-41e8-8e5d-921aa5f09e9e\") " pod="openstack/barbican-keystone-listener-6cb949947d-nv4s5" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.857167 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e839961-eb72-4d81-baf8-b49f103a8ca0-config-data\") pod \"placement-8696d9b56-5s4w8\" (UID: \"2e839961-eb72-4d81-baf8-b49f103a8ca0\") " pod="openstack/placement-8696d9b56-5s4w8" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.858915 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e839961-eb72-4d81-baf8-b49f103a8ca0-combined-ca-bundle\") pod \"placement-8696d9b56-5s4w8\" (UID: \"2e839961-eb72-4d81-baf8-b49f103a8ca0\") " pod="openstack/placement-8696d9b56-5s4w8" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.870777 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e839961-eb72-4d81-baf8-b49f103a8ca0-scripts\") pod \"placement-8696d9b56-5s4w8\" (UID: \"2e839961-eb72-4d81-baf8-b49f103a8ca0\") " pod="openstack/placement-8696d9b56-5s4w8" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.892472 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e839961-eb72-4d81-baf8-b49f103a8ca0-internal-tls-certs\") pod \"placement-8696d9b56-5s4w8\" (UID: \"2e839961-eb72-4d81-baf8-b49f103a8ca0\") " pod="openstack/placement-8696d9b56-5s4w8" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.920223 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6cb949947d-nv4s5"] Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.930252 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c874c7fc-ab63-41e8-8e5d-921aa5f09e9e-config-data-custom\") pod \"barbican-keystone-listener-6cb949947d-nv4s5\" (UID: \"c874c7fc-ab63-41e8-8e5d-921aa5f09e9e\") " pod="openstack/barbican-keystone-listener-6cb949947d-nv4s5" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.931616 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c874c7fc-ab63-41e8-8e5d-921aa5f09e9e-config-data\") pod \"barbican-keystone-listener-6cb949947d-nv4s5\" (UID: \"c874c7fc-ab63-41e8-8e5d-921aa5f09e9e\") " pod="openstack/barbican-keystone-listener-6cb949947d-nv4s5" Dec 06 05:59:53 crc 
kubenswrapper[4733]: I1206 05:59:53.931937 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m59cl\" (UniqueName: \"kubernetes.io/projected/0a1b0724-0e18-475b-9f9f-c96bf13e371a-kube-api-access-m59cl\") pod \"barbican-worker-5c84494675-5wvrl\" (UID: \"0a1b0724-0e18-475b-9f9f-c96bf13e371a\") " pod="openstack/barbican-worker-5c84494675-5wvrl" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.935965 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5fkd\" (UniqueName: \"kubernetes.io/projected/2e839961-eb72-4d81-baf8-b49f103a8ca0-kube-api-access-v5fkd\") pod \"placement-8696d9b56-5s4w8\" (UID: \"2e839961-eb72-4d81-baf8-b49f103a8ca0\") " pod="openstack/placement-8696d9b56-5s4w8" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.942936 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mglbq\" (UniqueName: \"kubernetes.io/projected/c874c7fc-ab63-41e8-8e5d-921aa5f09e9e-kube-api-access-mglbq\") pod \"barbican-keystone-listener-6cb949947d-nv4s5\" (UID: \"c874c7fc-ab63-41e8-8e5d-921aa5f09e9e\") " pod="openstack/barbican-keystone-listener-6cb949947d-nv4s5" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.951986 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5c84494675-5wvrl" Dec 06 05:59:53 crc kubenswrapper[4733]: I1206 05:59:53.978878 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-6cb949947d-nv4s5" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.046579 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-8696d9b56-5s4w8"] Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.047199 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57c48ddf69-7gjfm"] Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.048993 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c48ddf69-7gjfm" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.057625 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57c48ddf69-7gjfm"] Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.098931 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1dfba234-4e13-4f22-96cf-7f945f11d36e","Type":"ContainerStarted","Data":"4a0f64d3ec2bba190a1052de1a5091d00a0730e065efe11899f16fb2b395d1e3"} Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.099069 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1dfba234-4e13-4f22-96cf-7f945f11d36e","Type":"ContainerStarted","Data":"f78c0a5db911f9b841b6c6f6186b6fc41b2b9e4bc3cd3dda8cb55d74a098abc0"} Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.107801 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6484ff4846-p58m9"] Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.109624 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6484ff4846-p58m9" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.119854 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-vfkdm" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.121184 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.121926 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-vfkdm" event={"ID":"3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c","Type":"ContainerDied","Data":"06b4a2a7e0aefdf89b0f4e967d569de47fe05d467a94c0f5e5a6dac209a6ee13"} Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.121965 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06b4a2a7e0aefdf89b0f4e967d569de47fe05d467a94c0f5e5a6dac209a6ee13" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.121988 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6484ff4846-p58m9"] Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.144232 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/48a5ff37-e968-4f0a-8281-b9c57b754ec9-ovsdbserver-nb\") pod \"dnsmasq-dns-57c48ddf69-7gjfm\" (UID: \"48a5ff37-e968-4f0a-8281-b9c57b754ec9\") " pod="openstack/dnsmasq-dns-57c48ddf69-7gjfm" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.144559 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/48a5ff37-e968-4f0a-8281-b9c57b754ec9-dns-swift-storage-0\") pod \"dnsmasq-dns-57c48ddf69-7gjfm\" (UID: \"48a5ff37-e968-4f0a-8281-b9c57b754ec9\") " pod="openstack/dnsmasq-dns-57c48ddf69-7gjfm" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.144638 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/0072aca6-dd00-419a-afa2-690fcc9712f7-config-data-custom\") pod \"barbican-api-6484ff4846-p58m9\" (UID: \"0072aca6-dd00-419a-afa2-690fcc9712f7\") " pod="openstack/barbican-api-6484ff4846-p58m9" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.144657 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6w2ph\" (UniqueName: \"kubernetes.io/projected/0072aca6-dd00-419a-afa2-690fcc9712f7-kube-api-access-6w2ph\") pod \"barbican-api-6484ff4846-p58m9\" (UID: \"0072aca6-dd00-419a-afa2-690fcc9712f7\") " pod="openstack/barbican-api-6484ff4846-p58m9" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.144735 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0072aca6-dd00-419a-afa2-690fcc9712f7-config-data\") pod \"barbican-api-6484ff4846-p58m9\" (UID: \"0072aca6-dd00-419a-afa2-690fcc9712f7\") " pod="openstack/barbican-api-6484ff4846-p58m9" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.144835 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/48a5ff37-e968-4f0a-8281-b9c57b754ec9-ovsdbserver-sb\") pod \"dnsmasq-dns-57c48ddf69-7gjfm\" (UID: \"48a5ff37-e968-4f0a-8281-b9c57b754ec9\") " pod="openstack/dnsmasq-dns-57c48ddf69-7gjfm" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.144863 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/48a5ff37-e968-4f0a-8281-b9c57b754ec9-dns-svc\") pod \"dnsmasq-dns-57c48ddf69-7gjfm\" (UID: \"48a5ff37-e968-4f0a-8281-b9c57b754ec9\") " pod="openstack/dnsmasq-dns-57c48ddf69-7gjfm" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.144886 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0072aca6-dd00-419a-afa2-690fcc9712f7-combined-ca-bundle\") pod \"barbican-api-6484ff4846-p58m9\" (UID: \"0072aca6-dd00-419a-afa2-690fcc9712f7\") " pod="openstack/barbican-api-6484ff4846-p58m9" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.144908 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0072aca6-dd00-419a-afa2-690fcc9712f7-logs\") pod \"barbican-api-6484ff4846-p58m9\" (UID: \"0072aca6-dd00-419a-afa2-690fcc9712f7\") " pod="openstack/barbican-api-6484ff4846-p58m9" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.145003 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6xbj\" (UniqueName: \"kubernetes.io/projected/48a5ff37-e968-4f0a-8281-b9c57b754ec9-kube-api-access-r6xbj\") pod \"dnsmasq-dns-57c48ddf69-7gjfm\" (UID: \"48a5ff37-e968-4f0a-8281-b9c57b754ec9\") " pod="openstack/dnsmasq-dns-57c48ddf69-7gjfm" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.145080 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48a5ff37-e968-4f0a-8281-b9c57b754ec9-config\") pod \"dnsmasq-dns-57c48ddf69-7gjfm\" (UID: \"48a5ff37-e968-4f0a-8281-b9c57b754ec9\") " pod="openstack/dnsmasq-dns-57c48ddf69-7gjfm" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.198998 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.211246 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.215381 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.219895 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.223193 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.223409 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.223757 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-d8mpb" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.223895 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.251130 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fc5cb876-e7a9-4f7f-8ab8-fda582a40261-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"fc5cb876-e7a9-4f7f-8ab8-fda582a40261\") " pod="openstack/cinder-scheduler-0" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.251214 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/48a5ff37-e968-4f0a-8281-b9c57b754ec9-dns-swift-storage-0\") pod \"dnsmasq-dns-57c48ddf69-7gjfm\" (UID: \"48a5ff37-e968-4f0a-8281-b9c57b754ec9\") " pod="openstack/dnsmasq-dns-57c48ddf69-7gjfm" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.251240 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/48a5ff37-e968-4f0a-8281-b9c57b754ec9-ovsdbserver-nb\") pod \"dnsmasq-dns-57c48ddf69-7gjfm\" (UID: 
\"48a5ff37-e968-4f0a-8281-b9c57b754ec9\") " pod="openstack/dnsmasq-dns-57c48ddf69-7gjfm" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.251283 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0072aca6-dd00-419a-afa2-690fcc9712f7-config-data-custom\") pod \"barbican-api-6484ff4846-p58m9\" (UID: \"0072aca6-dd00-419a-afa2-690fcc9712f7\") " pod="openstack/barbican-api-6484ff4846-p58m9" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.251358 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6w2ph\" (UniqueName: \"kubernetes.io/projected/0072aca6-dd00-419a-afa2-690fcc9712f7-kube-api-access-6w2ph\") pod \"barbican-api-6484ff4846-p58m9\" (UID: \"0072aca6-dd00-419a-afa2-690fcc9712f7\") " pod="openstack/barbican-api-6484ff4846-p58m9" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.251404 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4c6pk\" (UniqueName: \"kubernetes.io/projected/fc5cb876-e7a9-4f7f-8ab8-fda582a40261-kube-api-access-4c6pk\") pod \"cinder-scheduler-0\" (UID: \"fc5cb876-e7a9-4f7f-8ab8-fda582a40261\") " pod="openstack/cinder-scheduler-0" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.251447 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0072aca6-dd00-419a-afa2-690fcc9712f7-config-data\") pod \"barbican-api-6484ff4846-p58m9\" (UID: \"0072aca6-dd00-419a-afa2-690fcc9712f7\") " pod="openstack/barbican-api-6484ff4846-p58m9" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.251498 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc5cb876-e7a9-4f7f-8ab8-fda582a40261-scripts\") pod \"cinder-scheduler-0\" (UID: 
\"fc5cb876-e7a9-4f7f-8ab8-fda582a40261\") " pod="openstack/cinder-scheduler-0" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.251535 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/48a5ff37-e968-4f0a-8281-b9c57b754ec9-ovsdbserver-sb\") pod \"dnsmasq-dns-57c48ddf69-7gjfm\" (UID: \"48a5ff37-e968-4f0a-8281-b9c57b754ec9\") " pod="openstack/dnsmasq-dns-57c48ddf69-7gjfm" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.251554 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/48a5ff37-e968-4f0a-8281-b9c57b754ec9-dns-svc\") pod \"dnsmasq-dns-57c48ddf69-7gjfm\" (UID: \"48a5ff37-e968-4f0a-8281-b9c57b754ec9\") " pod="openstack/dnsmasq-dns-57c48ddf69-7gjfm" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.251580 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0072aca6-dd00-419a-afa2-690fcc9712f7-combined-ca-bundle\") pod \"barbican-api-6484ff4846-p58m9\" (UID: \"0072aca6-dd00-419a-afa2-690fcc9712f7\") " pod="openstack/barbican-api-6484ff4846-p58m9" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.251600 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc5cb876-e7a9-4f7f-8ab8-fda582a40261-config-data\") pod \"cinder-scheduler-0\" (UID: \"fc5cb876-e7a9-4f7f-8ab8-fda582a40261\") " pod="openstack/cinder-scheduler-0" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.251614 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc5cb876-e7a9-4f7f-8ab8-fda582a40261-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"fc5cb876-e7a9-4f7f-8ab8-fda582a40261\") " 
pod="openstack/cinder-scheduler-0" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.251632 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0072aca6-dd00-419a-afa2-690fcc9712f7-logs\") pod \"barbican-api-6484ff4846-p58m9\" (UID: \"0072aca6-dd00-419a-afa2-690fcc9712f7\") " pod="openstack/barbican-api-6484ff4846-p58m9" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.251679 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fc5cb876-e7a9-4f7f-8ab8-fda582a40261-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"fc5cb876-e7a9-4f7f-8ab8-fda582a40261\") " pod="openstack/cinder-scheduler-0" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.251714 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6xbj\" (UniqueName: \"kubernetes.io/projected/48a5ff37-e968-4f0a-8281-b9c57b754ec9-kube-api-access-r6xbj\") pod \"dnsmasq-dns-57c48ddf69-7gjfm\" (UID: \"48a5ff37-e968-4f0a-8281-b9c57b754ec9\") " pod="openstack/dnsmasq-dns-57c48ddf69-7gjfm" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.251770 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48a5ff37-e968-4f0a-8281-b9c57b754ec9-config\") pod \"dnsmasq-dns-57c48ddf69-7gjfm\" (UID: \"48a5ff37-e968-4f0a-8281-b9c57b754ec9\") " pod="openstack/dnsmasq-dns-57c48ddf69-7gjfm" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.252646 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48a5ff37-e968-4f0a-8281-b9c57b754ec9-config\") pod \"dnsmasq-dns-57c48ddf69-7gjfm\" (UID: \"48a5ff37-e968-4f0a-8281-b9c57b754ec9\") " pod="openstack/dnsmasq-dns-57c48ddf69-7gjfm" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.253366 
4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/48a5ff37-e968-4f0a-8281-b9c57b754ec9-ovsdbserver-sb\") pod \"dnsmasq-dns-57c48ddf69-7gjfm\" (UID: \"48a5ff37-e968-4f0a-8281-b9c57b754ec9\") " pod="openstack/dnsmasq-dns-57c48ddf69-7gjfm" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.257919 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/48a5ff37-e968-4f0a-8281-b9c57b754ec9-dns-svc\") pod \"dnsmasq-dns-57c48ddf69-7gjfm\" (UID: \"48a5ff37-e968-4f0a-8281-b9c57b754ec9\") " pod="openstack/dnsmasq-dns-57c48ddf69-7gjfm" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.260628 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0072aca6-dd00-419a-afa2-690fcc9712f7-logs\") pod \"barbican-api-6484ff4846-p58m9\" (UID: \"0072aca6-dd00-419a-afa2-690fcc9712f7\") " pod="openstack/barbican-api-6484ff4846-p58m9" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.261618 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/48a5ff37-e968-4f0a-8281-b9c57b754ec9-dns-swift-storage-0\") pod \"dnsmasq-dns-57c48ddf69-7gjfm\" (UID: \"48a5ff37-e968-4f0a-8281-b9c57b754ec9\") " pod="openstack/dnsmasq-dns-57c48ddf69-7gjfm" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.263477 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0072aca6-dd00-419a-afa2-690fcc9712f7-combined-ca-bundle\") pod \"barbican-api-6484ff4846-p58m9\" (UID: \"0072aca6-dd00-419a-afa2-690fcc9712f7\") " pod="openstack/barbican-api-6484ff4846-p58m9" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.263917 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/48a5ff37-e968-4f0a-8281-b9c57b754ec9-ovsdbserver-nb\") pod \"dnsmasq-dns-57c48ddf69-7gjfm\" (UID: \"48a5ff37-e968-4f0a-8281-b9c57b754ec9\") " pod="openstack/dnsmasq-dns-57c48ddf69-7gjfm" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.273030 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0072aca6-dd00-419a-afa2-690fcc9712f7-config-data-custom\") pod \"barbican-api-6484ff4846-p58m9\" (UID: \"0072aca6-dd00-419a-afa2-690fcc9712f7\") " pod="openstack/barbican-api-6484ff4846-p58m9" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.275831 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0072aca6-dd00-419a-afa2-690fcc9712f7-config-data\") pod \"barbican-api-6484ff4846-p58m9\" (UID: \"0072aca6-dd00-419a-afa2-690fcc9712f7\") " pod="openstack/barbican-api-6484ff4846-p58m9" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.280604 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57c48ddf69-7gjfm"] Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.297806 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6xbj\" (UniqueName: \"kubernetes.io/projected/48a5ff37-e968-4f0a-8281-b9c57b754ec9-kube-api-access-r6xbj\") pod \"dnsmasq-dns-57c48ddf69-7gjfm\" (UID: \"48a5ff37-e968-4f0a-8281-b9c57b754ec9\") " pod="openstack/dnsmasq-dns-57c48ddf69-7gjfm" Dec 06 05:59:54 crc kubenswrapper[4733]: E1206 05:59:54.299208 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-r6xbj], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-57c48ddf69-7gjfm" podUID="48a5ff37-e968-4f0a-8281-b9c57b754ec9" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.321114 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-6w2ph\" (UniqueName: \"kubernetes.io/projected/0072aca6-dd00-419a-afa2-690fcc9712f7-kube-api-access-6w2ph\") pod \"barbican-api-6484ff4846-p58m9\" (UID: \"0072aca6-dd00-419a-afa2-690fcc9712f7\") " pod="openstack/barbican-api-6484ff4846-p58m9" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.328399 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6f44678f55-nshc4"] Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.336094 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f44678f55-nshc4" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.339613 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f44678f55-nshc4"] Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.361173 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fc5cb876-e7a9-4f7f-8ab8-fda582a40261-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"fc5cb876-e7a9-4f7f-8ab8-fda582a40261\") " pod="openstack/cinder-scheduler-0" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.361329 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4c6pk\" (UniqueName: \"kubernetes.io/projected/fc5cb876-e7a9-4f7f-8ab8-fda582a40261-kube-api-access-4c6pk\") pod \"cinder-scheduler-0\" (UID: \"fc5cb876-e7a9-4f7f-8ab8-fda582a40261\") " pod="openstack/cinder-scheduler-0" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.361408 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc5cb876-e7a9-4f7f-8ab8-fda582a40261-scripts\") pod \"cinder-scheduler-0\" (UID: \"fc5cb876-e7a9-4f7f-8ab8-fda582a40261\") " pod="openstack/cinder-scheduler-0" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.361468 4733 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc5cb876-e7a9-4f7f-8ab8-fda582a40261-config-data\") pod \"cinder-scheduler-0\" (UID: \"fc5cb876-e7a9-4f7f-8ab8-fda582a40261\") " pod="openstack/cinder-scheduler-0" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.361482 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc5cb876-e7a9-4f7f-8ab8-fda582a40261-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"fc5cb876-e7a9-4f7f-8ab8-fda582a40261\") " pod="openstack/cinder-scheduler-0" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.361530 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fc5cb876-e7a9-4f7f-8ab8-fda582a40261-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"fc5cb876-e7a9-4f7f-8ab8-fda582a40261\") " pod="openstack/cinder-scheduler-0" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.361713 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fc5cb876-e7a9-4f7f-8ab8-fda582a40261-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"fc5cb876-e7a9-4f7f-8ab8-fda582a40261\") " pod="openstack/cinder-scheduler-0" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.369285 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc5cb876-e7a9-4f7f-8ab8-fda582a40261-config-data\") pod \"cinder-scheduler-0\" (UID: \"fc5cb876-e7a9-4f7f-8ab8-fda582a40261\") " pod="openstack/cinder-scheduler-0" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.370185 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc5cb876-e7a9-4f7f-8ab8-fda582a40261-scripts\") pod \"cinder-scheduler-0\" (UID: 
\"fc5cb876-e7a9-4f7f-8ab8-fda582a40261\") " pod="openstack/cinder-scheduler-0" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.372183 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fc5cb876-e7a9-4f7f-8ab8-fda582a40261-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"fc5cb876-e7a9-4f7f-8ab8-fda582a40261\") " pod="openstack/cinder-scheduler-0" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.386121 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc5cb876-e7a9-4f7f-8ab8-fda582a40261-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"fc5cb876-e7a9-4f7f-8ab8-fda582a40261\") " pod="openstack/cinder-scheduler-0" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.398734 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4c6pk\" (UniqueName: \"kubernetes.io/projected/fc5cb876-e7a9-4f7f-8ab8-fda582a40261-kube-api-access-4c6pk\") pod \"cinder-scheduler-0\" (UID: \"fc5cb876-e7a9-4f7f-8ab8-fda582a40261\") " pod="openstack/cinder-scheduler-0" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.462678 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6484ff4846-p58m9" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.463524 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75df2e76-18b2-4bb7-8069-7636be9b1e46-config\") pod \"dnsmasq-dns-6f44678f55-nshc4\" (UID: \"75df2e76-18b2-4bb7-8069-7636be9b1e46\") " pod="openstack/dnsmasq-dns-6f44678f55-nshc4" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.463560 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75df2e76-18b2-4bb7-8069-7636be9b1e46-dns-svc\") pod \"dnsmasq-dns-6f44678f55-nshc4\" (UID: \"75df2e76-18b2-4bb7-8069-7636be9b1e46\") " pod="openstack/dnsmasq-dns-6f44678f55-nshc4" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.463667 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/75df2e76-18b2-4bb7-8069-7636be9b1e46-dns-swift-storage-0\") pod \"dnsmasq-dns-6f44678f55-nshc4\" (UID: \"75df2e76-18b2-4bb7-8069-7636be9b1e46\") " pod="openstack/dnsmasq-dns-6f44678f55-nshc4" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.463742 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64qfl\" (UniqueName: \"kubernetes.io/projected/75df2e76-18b2-4bb7-8069-7636be9b1e46-kube-api-access-64qfl\") pod \"dnsmasq-dns-6f44678f55-nshc4\" (UID: \"75df2e76-18b2-4bb7-8069-7636be9b1e46\") " pod="openstack/dnsmasq-dns-6f44678f55-nshc4" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.463901 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75df2e76-18b2-4bb7-8069-7636be9b1e46-ovsdbserver-nb\") pod 
\"dnsmasq-dns-6f44678f55-nshc4\" (UID: \"75df2e76-18b2-4bb7-8069-7636be9b1e46\") " pod="openstack/dnsmasq-dns-6f44678f55-nshc4" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.463935 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75df2e76-18b2-4bb7-8069-7636be9b1e46-ovsdbserver-sb\") pod \"dnsmasq-dns-6f44678f55-nshc4\" (UID: \"75df2e76-18b2-4bb7-8069-7636be9b1e46\") " pod="openstack/dnsmasq-dns-6f44678f55-nshc4" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.473222 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.474731 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.477602 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.529427 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61d7ceb4-c6a6-407f-b208-693d62a8b76b" path="/var/lib/kubelet/pods/61d7ceb4-c6a6-407f-b208-693d62a8b76b/volumes" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.530098 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.569354 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0edecbb2-ca0e-46f0-b142-19aaf7aa25ca-scripts\") pod \"cinder-api-0\" (UID: \"0edecbb2-ca0e-46f0-b142-19aaf7aa25ca\") " pod="openstack/cinder-api-0" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.569400 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/75df2e76-18b2-4bb7-8069-7636be9b1e46-ovsdbserver-nb\") pod \"dnsmasq-dns-6f44678f55-nshc4\" (UID: \"75df2e76-18b2-4bb7-8069-7636be9b1e46\") " pod="openstack/dnsmasq-dns-6f44678f55-nshc4" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.569441 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75df2e76-18b2-4bb7-8069-7636be9b1e46-ovsdbserver-sb\") pod \"dnsmasq-dns-6f44678f55-nshc4\" (UID: \"75df2e76-18b2-4bb7-8069-7636be9b1e46\") " pod="openstack/dnsmasq-dns-6f44678f55-nshc4" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.569462 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0edecbb2-ca0e-46f0-b142-19aaf7aa25ca-config-data\") pod \"cinder-api-0\" (UID: \"0edecbb2-ca0e-46f0-b142-19aaf7aa25ca\") " pod="openstack/cinder-api-0" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.569498 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75df2e76-18b2-4bb7-8069-7636be9b1e46-config\") pod \"dnsmasq-dns-6f44678f55-nshc4\" (UID: \"75df2e76-18b2-4bb7-8069-7636be9b1e46\") " pod="openstack/dnsmasq-dns-6f44678f55-nshc4" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.569518 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75df2e76-18b2-4bb7-8069-7636be9b1e46-dns-svc\") pod \"dnsmasq-dns-6f44678f55-nshc4\" (UID: \"75df2e76-18b2-4bb7-8069-7636be9b1e46\") " pod="openstack/dnsmasq-dns-6f44678f55-nshc4" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.569560 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0edecbb2-ca0e-46f0-b142-19aaf7aa25ca-etc-machine-id\") pod 
\"cinder-api-0\" (UID: \"0edecbb2-ca0e-46f0-b142-19aaf7aa25ca\") " pod="openstack/cinder-api-0" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.569635 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0edecbb2-ca0e-46f0-b142-19aaf7aa25ca-logs\") pod \"cinder-api-0\" (UID: \"0edecbb2-ca0e-46f0-b142-19aaf7aa25ca\") " pod="openstack/cinder-api-0" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.569661 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/75df2e76-18b2-4bb7-8069-7636be9b1e46-dns-swift-storage-0\") pod \"dnsmasq-dns-6f44678f55-nshc4\" (UID: \"75df2e76-18b2-4bb7-8069-7636be9b1e46\") " pod="openstack/dnsmasq-dns-6f44678f55-nshc4" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.569675 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0edecbb2-ca0e-46f0-b142-19aaf7aa25ca-config-data-custom\") pod \"cinder-api-0\" (UID: \"0edecbb2-ca0e-46f0-b142-19aaf7aa25ca\") " pod="openstack/cinder-api-0" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.569697 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64qfl\" (UniqueName: \"kubernetes.io/projected/75df2e76-18b2-4bb7-8069-7636be9b1e46-kube-api-access-64qfl\") pod \"dnsmasq-dns-6f44678f55-nshc4\" (UID: \"75df2e76-18b2-4bb7-8069-7636be9b1e46\") " pod="openstack/dnsmasq-dns-6f44678f55-nshc4" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.569746 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0edecbb2-ca0e-46f0-b142-19aaf7aa25ca-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"0edecbb2-ca0e-46f0-b142-19aaf7aa25ca\") " 
pod="openstack/cinder-api-0" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.569804 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftv4s\" (UniqueName: \"kubernetes.io/projected/0edecbb2-ca0e-46f0-b142-19aaf7aa25ca-kube-api-access-ftv4s\") pod \"cinder-api-0\" (UID: \"0edecbb2-ca0e-46f0-b142-19aaf7aa25ca\") " pod="openstack/cinder-api-0" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.571391 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75df2e76-18b2-4bb7-8069-7636be9b1e46-ovsdbserver-nb\") pod \"dnsmasq-dns-6f44678f55-nshc4\" (UID: \"75df2e76-18b2-4bb7-8069-7636be9b1e46\") " pod="openstack/dnsmasq-dns-6f44678f55-nshc4" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.571884 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75df2e76-18b2-4bb7-8069-7636be9b1e46-ovsdbserver-sb\") pod \"dnsmasq-dns-6f44678f55-nshc4\" (UID: \"75df2e76-18b2-4bb7-8069-7636be9b1e46\") " pod="openstack/dnsmasq-dns-6f44678f55-nshc4" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.572850 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75df2e76-18b2-4bb7-8069-7636be9b1e46-config\") pod \"dnsmasq-dns-6f44678f55-nshc4\" (UID: \"75df2e76-18b2-4bb7-8069-7636be9b1e46\") " pod="openstack/dnsmasq-dns-6f44678f55-nshc4" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.573346 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75df2e76-18b2-4bb7-8069-7636be9b1e46-dns-svc\") pod \"dnsmasq-dns-6f44678f55-nshc4\" (UID: \"75df2e76-18b2-4bb7-8069-7636be9b1e46\") " pod="openstack/dnsmasq-dns-6f44678f55-nshc4" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.575039 4733 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.575189 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/75df2e76-18b2-4bb7-8069-7636be9b1e46-dns-swift-storage-0\") pod \"dnsmasq-dns-6f44678f55-nshc4\" (UID: \"75df2e76-18b2-4bb7-8069-7636be9b1e46\") " pod="openstack/dnsmasq-dns-6f44678f55-nshc4" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.594540 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64qfl\" (UniqueName: \"kubernetes.io/projected/75df2e76-18b2-4bb7-8069-7636be9b1e46-kube-api-access-64qfl\") pod \"dnsmasq-dns-6f44678f55-nshc4\" (UID: \"75df2e76-18b2-4bb7-8069-7636be9b1e46\") " pod="openstack/dnsmasq-dns-6f44678f55-nshc4" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.671444 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0edecbb2-ca0e-46f0-b142-19aaf7aa25ca-etc-machine-id\") pod \"cinder-api-0\" (UID: \"0edecbb2-ca0e-46f0-b142-19aaf7aa25ca\") " pod="openstack/cinder-api-0" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.671766 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0edecbb2-ca0e-46f0-b142-19aaf7aa25ca-logs\") pod \"cinder-api-0\" (UID: \"0edecbb2-ca0e-46f0-b142-19aaf7aa25ca\") " pod="openstack/cinder-api-0" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.671789 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0edecbb2-ca0e-46f0-b142-19aaf7aa25ca-config-data-custom\") pod \"cinder-api-0\" (UID: \"0edecbb2-ca0e-46f0-b142-19aaf7aa25ca\") " pod="openstack/cinder-api-0" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.671826 4733 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0edecbb2-ca0e-46f0-b142-19aaf7aa25ca-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"0edecbb2-ca0e-46f0-b142-19aaf7aa25ca\") " pod="openstack/cinder-api-0" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.671863 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftv4s\" (UniqueName: \"kubernetes.io/projected/0edecbb2-ca0e-46f0-b142-19aaf7aa25ca-kube-api-access-ftv4s\") pod \"cinder-api-0\" (UID: \"0edecbb2-ca0e-46f0-b142-19aaf7aa25ca\") " pod="openstack/cinder-api-0" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.671890 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0edecbb2-ca0e-46f0-b142-19aaf7aa25ca-scripts\") pod \"cinder-api-0\" (UID: \"0edecbb2-ca0e-46f0-b142-19aaf7aa25ca\") " pod="openstack/cinder-api-0" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.671918 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0edecbb2-ca0e-46f0-b142-19aaf7aa25ca-config-data\") pod \"cinder-api-0\" (UID: \"0edecbb2-ca0e-46f0-b142-19aaf7aa25ca\") " pod="openstack/cinder-api-0" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.672637 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0edecbb2-ca0e-46f0-b142-19aaf7aa25ca-etc-machine-id\") pod \"cinder-api-0\" (UID: \"0edecbb2-ca0e-46f0-b142-19aaf7aa25ca\") " pod="openstack/cinder-api-0" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.674633 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0edecbb2-ca0e-46f0-b142-19aaf7aa25ca-logs\") pod \"cinder-api-0\" (UID: \"0edecbb2-ca0e-46f0-b142-19aaf7aa25ca\") 
" pod="openstack/cinder-api-0" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.682413 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0edecbb2-ca0e-46f0-b142-19aaf7aa25ca-config-data-custom\") pod \"cinder-api-0\" (UID: \"0edecbb2-ca0e-46f0-b142-19aaf7aa25ca\") " pod="openstack/cinder-api-0" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.682766 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0edecbb2-ca0e-46f0-b142-19aaf7aa25ca-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"0edecbb2-ca0e-46f0-b142-19aaf7aa25ca\") " pod="openstack/cinder-api-0" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.682782 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0edecbb2-ca0e-46f0-b142-19aaf7aa25ca-scripts\") pod \"cinder-api-0\" (UID: \"0edecbb2-ca0e-46f0-b142-19aaf7aa25ca\") " pod="openstack/cinder-api-0" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.684406 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f44678f55-nshc4" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.692263 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0edecbb2-ca0e-46f0-b142-19aaf7aa25ca-config-data\") pod \"cinder-api-0\" (UID: \"0edecbb2-ca0e-46f0-b142-19aaf7aa25ca\") " pod="openstack/cinder-api-0" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.699801 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftv4s\" (UniqueName: \"kubernetes.io/projected/0edecbb2-ca0e-46f0-b142-19aaf7aa25ca-kube-api-access-ftv4s\") pod \"cinder-api-0\" (UID: \"0edecbb2-ca0e-46f0-b142-19aaf7aa25ca\") " pod="openstack/cinder-api-0" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.806592 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.828506 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.841372 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e839961-eb72-4d81-baf8-b49f103a8ca0-public-tls-certs\") pod \"placement-8696d9b56-5s4w8\" (UID: \"2e839961-eb72-4d81-baf8-b49f103a8ca0\") " pod="openstack/placement-8696d9b56-5s4w8" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.844647 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8696d9b56-5s4w8" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.874779 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c769b985f-kk46z" Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.916881 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-4twth" Dec 06 05:59:54 crc kubenswrapper[4733]: W1206 05:59:54.975474 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a1b0724_0e18_475b_9f9f_c96bf13e371a.slice/crio-2566734be8d9e807e52a4b87bdcd234462eb89ed96a4085b93825d9b0a15c030 WatchSource:0}: Error finding container 2566734be8d9e807e52a4b87bdcd234462eb89ed96a4085b93825d9b0a15c030: Status 404 returned error can't find the container with id 2566734be8d9e807e52a4b87bdcd234462eb89ed96a4085b93825d9b0a15c030 Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.980399 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5c84494675-5wvrl"] Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.981311 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a886728-ea9b-485c-844d-964614315b0d-scripts\") pod \"6a886728-ea9b-485c-844d-964614315b0d\" (UID: \"6a886728-ea9b-485c-844d-964614315b0d\") " Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.981357 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af780532-6391-4d7f-93d4-fb966d5e0434-dns-svc\") pod \"af780532-6391-4d7f-93d4-fb966d5e0434\" (UID: \"af780532-6391-4d7f-93d4-fb966d5e0434\") " Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.981460 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2hxq\" (UniqueName: \"kubernetes.io/projected/af780532-6391-4d7f-93d4-fb966d5e0434-kube-api-access-x2hxq\") pod \"af780532-6391-4d7f-93d4-fb966d5e0434\" (UID: \"af780532-6391-4d7f-93d4-fb966d5e0434\") " Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.981482 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/6a886728-ea9b-485c-844d-964614315b0d-combined-ca-bundle\") pod \"6a886728-ea9b-485c-844d-964614315b0d\" (UID: \"6a886728-ea9b-485c-844d-964614315b0d\") " Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.981520 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvsq8\" (UniqueName: \"kubernetes.io/projected/6a886728-ea9b-485c-844d-964614315b0d-kube-api-access-kvsq8\") pod \"6a886728-ea9b-485c-844d-964614315b0d\" (UID: \"6a886728-ea9b-485c-844d-964614315b0d\") " Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.981642 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/af780532-6391-4d7f-93d4-fb966d5e0434-dns-swift-storage-0\") pod \"af780532-6391-4d7f-93d4-fb966d5e0434\" (UID: \"af780532-6391-4d7f-93d4-fb966d5e0434\") " Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.981693 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6a886728-ea9b-485c-844d-964614315b0d-credential-keys\") pod \"6a886728-ea9b-485c-844d-964614315b0d\" (UID: \"6a886728-ea9b-485c-844d-964614315b0d\") " Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.981722 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/af780532-6391-4d7f-93d4-fb966d5e0434-ovsdbserver-nb\") pod \"af780532-6391-4d7f-93d4-fb966d5e0434\" (UID: \"af780532-6391-4d7f-93d4-fb966d5e0434\") " Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.981744 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a886728-ea9b-485c-844d-964614315b0d-config-data\") pod \"6a886728-ea9b-485c-844d-964614315b0d\" (UID: \"6a886728-ea9b-485c-844d-964614315b0d\") " Dec 06 05:59:54 crc 
kubenswrapper[4733]: I1206 05:59:54.981789 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6a886728-ea9b-485c-844d-964614315b0d-fernet-keys\") pod \"6a886728-ea9b-485c-844d-964614315b0d\" (UID: \"6a886728-ea9b-485c-844d-964614315b0d\") " Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.981805 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/af780532-6391-4d7f-93d4-fb966d5e0434-ovsdbserver-sb\") pod \"af780532-6391-4d7f-93d4-fb966d5e0434\" (UID: \"af780532-6391-4d7f-93d4-fb966d5e0434\") " Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.981889 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af780532-6391-4d7f-93d4-fb966d5e0434-config\") pod \"af780532-6391-4d7f-93d4-fb966d5e0434\" (UID: \"af780532-6391-4d7f-93d4-fb966d5e0434\") " Dec 06 05:59:54 crc kubenswrapper[4733]: I1206 05:59:54.998484 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af780532-6391-4d7f-93d4-fb966d5e0434-kube-api-access-x2hxq" (OuterVolumeSpecName: "kube-api-access-x2hxq") pod "af780532-6391-4d7f-93d4-fb966d5e0434" (UID: "af780532-6391-4d7f-93d4-fb966d5e0434"). InnerVolumeSpecName "kube-api-access-x2hxq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:59:55 crc kubenswrapper[4733]: I1206 05:59:55.006449 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a886728-ea9b-485c-844d-964614315b0d-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "6a886728-ea9b-485c-844d-964614315b0d" (UID: "6a886728-ea9b-485c-844d-964614315b0d"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:59:55 crc kubenswrapper[4733]: I1206 05:59:55.008801 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a886728-ea9b-485c-844d-964614315b0d-kube-api-access-kvsq8" (OuterVolumeSpecName: "kube-api-access-kvsq8") pod "6a886728-ea9b-485c-844d-964614315b0d" (UID: "6a886728-ea9b-485c-844d-964614315b0d"). InnerVolumeSpecName "kube-api-access-kvsq8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:59:55 crc kubenswrapper[4733]: I1206 05:59:55.009075 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a886728-ea9b-485c-844d-964614315b0d-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "6a886728-ea9b-485c-844d-964614315b0d" (UID: "6a886728-ea9b-485c-844d-964614315b0d"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:59:55 crc kubenswrapper[4733]: I1206 05:59:55.025038 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a886728-ea9b-485c-844d-964614315b0d-scripts" (OuterVolumeSpecName: "scripts") pod "6a886728-ea9b-485c-844d-964614315b0d" (UID: "6a886728-ea9b-485c-844d-964614315b0d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:59:55 crc kubenswrapper[4733]: I1206 05:59:55.066112 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af780532-6391-4d7f-93d4-fb966d5e0434-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "af780532-6391-4d7f-93d4-fb966d5e0434" (UID: "af780532-6391-4d7f-93d4-fb966d5e0434"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:59:55 crc kubenswrapper[4733]: I1206 05:59:55.066403 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af780532-6391-4d7f-93d4-fb966d5e0434-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "af780532-6391-4d7f-93d4-fb966d5e0434" (UID: "af780532-6391-4d7f-93d4-fb966d5e0434"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:59:55 crc kubenswrapper[4733]: I1206 05:59:55.085288 4733 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a886728-ea9b-485c-844d-964614315b0d-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:55 crc kubenswrapper[4733]: I1206 05:59:55.085437 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2hxq\" (UniqueName: \"kubernetes.io/projected/af780532-6391-4d7f-93d4-fb966d5e0434-kube-api-access-x2hxq\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:55 crc kubenswrapper[4733]: I1206 05:59:55.085496 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvsq8\" (UniqueName: \"kubernetes.io/projected/6a886728-ea9b-485c-844d-964614315b0d-kube-api-access-kvsq8\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:55 crc kubenswrapper[4733]: I1206 05:59:55.085507 4733 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/af780532-6391-4d7f-93d4-fb966d5e0434-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:55 crc kubenswrapper[4733]: I1206 05:59:55.085517 4733 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6a886728-ea9b-485c-844d-964614315b0d-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:55 crc kubenswrapper[4733]: I1206 05:59:55.085526 4733 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" 
(UniqueName: \"kubernetes.io/secret/6a886728-ea9b-485c-844d-964614315b0d-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:55 crc kubenswrapper[4733]: I1206 05:59:55.085535 4733 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/af780532-6391-4d7f-93d4-fb966d5e0434-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:55 crc kubenswrapper[4733]: I1206 05:59:55.100465 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a886728-ea9b-485c-844d-964614315b0d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6a886728-ea9b-485c-844d-964614315b0d" (UID: "6a886728-ea9b-485c-844d-964614315b0d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:59:55 crc kubenswrapper[4733]: I1206 05:59:55.125183 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a886728-ea9b-485c-844d-964614315b0d-config-data" (OuterVolumeSpecName: "config-data") pod "6a886728-ea9b-485c-844d-964614315b0d" (UID: "6a886728-ea9b-485c-844d-964614315b0d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 05:59:55 crc kubenswrapper[4733]: I1206 05:59:55.128724 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af780532-6391-4d7f-93d4-fb966d5e0434-config" (OuterVolumeSpecName: "config") pod "af780532-6391-4d7f-93d4-fb966d5e0434" (UID: "af780532-6391-4d7f-93d4-fb966d5e0434"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:59:55 crc kubenswrapper[4733]: I1206 05:59:55.129444 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af780532-6391-4d7f-93d4-fb966d5e0434-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "af780532-6391-4d7f-93d4-fb966d5e0434" (UID: "af780532-6391-4d7f-93d4-fb966d5e0434"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:59:55 crc kubenswrapper[4733]: I1206 05:59:55.146451 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af780532-6391-4d7f-93d4-fb966d5e0434-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "af780532-6391-4d7f-93d4-fb966d5e0434" (UID: "af780532-6391-4d7f-93d4-fb966d5e0434"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:59:55 crc kubenswrapper[4733]: I1206 05:59:55.159125 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4twth" event={"ID":"6a886728-ea9b-485c-844d-964614315b0d","Type":"ContainerDied","Data":"db219d663b591d70eb8e58d2b83b123b68aae8961a80c2a2a4c37d858b022227"} Dec 06 05:59:55 crc kubenswrapper[4733]: I1206 05:59:55.159160 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db219d663b591d70eb8e58d2b83b123b68aae8961a80c2a2a4c37d858b022227" Dec 06 05:59:55 crc kubenswrapper[4733]: I1206 05:59:55.159225 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-4twth" Dec 06 05:59:55 crc kubenswrapper[4733]: I1206 05:59:55.160371 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6cb949947d-nv4s5"] Dec 06 05:59:55 crc kubenswrapper[4733]: W1206 05:59:55.195974 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc874c7fc_ab63_41e8_8e5d_921aa5f09e9e.slice/crio-f90f14d0143370600942233559a281aa0f5ab93efb3d458b24a338c63909017c WatchSource:0}: Error finding container f90f14d0143370600942233559a281aa0f5ab93efb3d458b24a338c63909017c: Status 404 returned error can't find the container with id f90f14d0143370600942233559a281aa0f5ab93efb3d458b24a338c63909017c Dec 06 05:59:55 crc kubenswrapper[4733]: I1206 05:59:55.198570 4733 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/af780532-6391-4d7f-93d4-fb966d5e0434-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:55 crc kubenswrapper[4733]: I1206 05:59:55.198716 4733 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a886728-ea9b-485c-844d-964614315b0d-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:55 crc kubenswrapper[4733]: I1206 05:59:55.198728 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af780532-6391-4d7f-93d4-fb966d5e0434-config\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:55 crc kubenswrapper[4733]: I1206 05:59:55.198737 4733 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af780532-6391-4d7f-93d4-fb966d5e0434-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:55 crc kubenswrapper[4733]: I1206 05:59:55.198753 4733 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6a886728-ea9b-485c-844d-964614315b0d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:55 crc kubenswrapper[4733]: I1206 05:59:55.201814 4733 generic.go:334] "Generic (PLEG): container finished" podID="af780532-6391-4d7f-93d4-fb966d5e0434" containerID="58f4d631151449a9097eac46cc919892a61cf8d2df5c179d8d7d15516896c2cb" exitCode=0 Dec 06 05:59:55 crc kubenswrapper[4733]: I1206 05:59:55.202108 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c769b985f-kk46z" event={"ID":"af780532-6391-4d7f-93d4-fb966d5e0434","Type":"ContainerDied","Data":"58f4d631151449a9097eac46cc919892a61cf8d2df5c179d8d7d15516896c2cb"} Dec 06 05:59:55 crc kubenswrapper[4733]: I1206 05:59:55.202172 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c769b985f-kk46z" event={"ID":"af780532-6391-4d7f-93d4-fb966d5e0434","Type":"ContainerDied","Data":"f38ff97edf3c3dde287bf672753b7071afffbe6d49f94a7b916062b39e9eea71"} Dec 06 05:59:55 crc kubenswrapper[4733]: I1206 05:59:55.202196 4733 scope.go:117] "RemoveContainer" containerID="58f4d631151449a9097eac46cc919892a61cf8d2df5c179d8d7d15516896c2cb" Dec 06 05:59:55 crc kubenswrapper[4733]: I1206 05:59:55.206335 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c769b985f-kk46z" Dec 06 05:59:55 crc kubenswrapper[4733]: I1206 05:59:55.228206 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1dfba234-4e13-4f22-96cf-7f945f11d36e","Type":"ContainerStarted","Data":"3f7a463e72992757246426f5aefa2b7fc372f23d7026624a258d101eb90648e2"} Dec 06 05:59:55 crc kubenswrapper[4733]: I1206 05:59:55.234831 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a0063321-2625-4e9d-a536-38104f7d5879","Type":"ContainerStarted","Data":"2b0ce20663c6a658b0355e15034e57f7723b14fd4e5aa6eee46ca3382bd868c2"} Dec 06 05:59:55 crc kubenswrapper[4733]: I1206 05:59:55.254539 4733 scope.go:117] "RemoveContainer" containerID="5bec6fc6e2d2d2a6862b47fcefbe04567629145567dfd5a8f3621cf8ee008b02" Dec 06 05:59:55 crc kubenswrapper[4733]: W1206 05:59:55.262262 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc5cb876_e7a9_4f7f_8ab8_fda582a40261.slice/crio-bd657ac8e0d14ddfbc80c97d25e7451da5467280cb581efa4b9ff5bc96df273f WatchSource:0}: Error finding container bd657ac8e0d14ddfbc80c97d25e7451da5467280cb581efa4b9ff5bc96df273f: Status 404 returned error can't find the container with id bd657ac8e0d14ddfbc80c97d25e7451da5467280cb581efa4b9ff5bc96df273f Dec 06 05:59:55 crc kubenswrapper[4733]: I1206 05:59:55.262556 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57c48ddf69-7gjfm" Dec 06 05:59:55 crc kubenswrapper[4733]: I1206 05:59:55.262770 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5c84494675-5wvrl" event={"ID":"0a1b0724-0e18-475b-9f9f-c96bf13e371a","Type":"ContainerStarted","Data":"2566734be8d9e807e52a4b87bdcd234462eb89ed96a4085b93825d9b0a15c030"} Dec 06 05:59:55 crc kubenswrapper[4733]: W1206 05:59:55.277110 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0072aca6_dd00_419a_afa2_690fcc9712f7.slice/crio-8f7c3b434e7af52283e61b64ba5fa131c0bb12d38aac464c9f756956df4f77d2 WatchSource:0}: Error finding container 8f7c3b434e7af52283e61b64ba5fa131c0bb12d38aac464c9f756956df4f77d2: Status 404 returned error can't find the container with id 8f7c3b434e7af52283e61b64ba5fa131c0bb12d38aac464c9f756956df4f77d2 Dec 06 05:59:55 crc kubenswrapper[4733]: I1206 05:59:55.289774 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6484ff4846-p58m9"] Dec 06 05:59:55 crc kubenswrapper[4733]: I1206 05:59:55.294380 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57c48ddf69-7gjfm" Dec 06 05:59:55 crc kubenswrapper[4733]: I1206 05:59:55.303961 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 06 05:59:55 crc kubenswrapper[4733]: I1206 05:59:55.314511 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f44678f55-nshc4"] Dec 06 05:59:55 crc kubenswrapper[4733]: I1206 05:59:55.325473 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.325455585 podStartE2EDuration="5.325455585s" podCreationTimestamp="2025-12-06 05:59:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:59:55.277819928 +0000 UTC m=+979.143031040" watchObservedRunningTime="2025-12-06 05:59:55.325455585 +0000 UTC m=+979.190666696" Dec 06 05:59:55 crc kubenswrapper[4733]: I1206 05:59:55.334692 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c769b985f-kk46z"] Dec 06 05:59:55 crc kubenswrapper[4733]: I1206 05:59:55.340325 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c769b985f-kk46z"] Dec 06 05:59:55 crc kubenswrapper[4733]: I1206 05:59:55.368642 4733 scope.go:117] "RemoveContainer" containerID="58f4d631151449a9097eac46cc919892a61cf8d2df5c179d8d7d15516896c2cb" Dec 06 05:59:55 crc kubenswrapper[4733]: E1206 05:59:55.370808 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58f4d631151449a9097eac46cc919892a61cf8d2df5c179d8d7d15516896c2cb\": container with ID starting with 58f4d631151449a9097eac46cc919892a61cf8d2df5c179d8d7d15516896c2cb not found: ID does not exist" containerID="58f4d631151449a9097eac46cc919892a61cf8d2df5c179d8d7d15516896c2cb" Dec 06 05:59:55 crc kubenswrapper[4733]: I1206 05:59:55.370942 4733 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58f4d631151449a9097eac46cc919892a61cf8d2df5c179d8d7d15516896c2cb"} err="failed to get container status \"58f4d631151449a9097eac46cc919892a61cf8d2df5c179d8d7d15516896c2cb\": rpc error: code = NotFound desc = could not find container \"58f4d631151449a9097eac46cc919892a61cf8d2df5c179d8d7d15516896c2cb\": container with ID starting with 58f4d631151449a9097eac46cc919892a61cf8d2df5c179d8d7d15516896c2cb not found: ID does not exist" Dec 06 05:59:55 crc kubenswrapper[4733]: I1206 05:59:55.371025 4733 scope.go:117] "RemoveContainer" containerID="5bec6fc6e2d2d2a6862b47fcefbe04567629145567dfd5a8f3621cf8ee008b02" Dec 06 05:59:55 crc kubenswrapper[4733]: E1206 05:59:55.372347 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bec6fc6e2d2d2a6862b47fcefbe04567629145567dfd5a8f3621cf8ee008b02\": container with ID starting with 5bec6fc6e2d2d2a6862b47fcefbe04567629145567dfd5a8f3621cf8ee008b02 not found: ID does not exist" containerID="5bec6fc6e2d2d2a6862b47fcefbe04567629145567dfd5a8f3621cf8ee008b02" Dec 06 05:59:55 crc kubenswrapper[4733]: I1206 05:59:55.372510 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bec6fc6e2d2d2a6862b47fcefbe04567629145567dfd5a8f3621cf8ee008b02"} err="failed to get container status \"5bec6fc6e2d2d2a6862b47fcefbe04567629145567dfd5a8f3621cf8ee008b02\": rpc error: code = NotFound desc = could not find container \"5bec6fc6e2d2d2a6862b47fcefbe04567629145567dfd5a8f3621cf8ee008b02\": container with ID starting with 5bec6fc6e2d2d2a6862b47fcefbe04567629145567dfd5a8f3621cf8ee008b02 not found: ID does not exist" Dec 06 05:59:55 crc kubenswrapper[4733]: I1206 05:59:55.409036 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/48a5ff37-e968-4f0a-8281-b9c57b754ec9-dns-svc\") pod \"48a5ff37-e968-4f0a-8281-b9c57b754ec9\" (UID: \"48a5ff37-e968-4f0a-8281-b9c57b754ec9\") " Dec 06 05:59:55 crc kubenswrapper[4733]: I1206 05:59:55.409145 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/48a5ff37-e968-4f0a-8281-b9c57b754ec9-ovsdbserver-nb\") pod \"48a5ff37-e968-4f0a-8281-b9c57b754ec9\" (UID: \"48a5ff37-e968-4f0a-8281-b9c57b754ec9\") " Dec 06 05:59:55 crc kubenswrapper[4733]: I1206 05:59:55.409323 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6xbj\" (UniqueName: \"kubernetes.io/projected/48a5ff37-e968-4f0a-8281-b9c57b754ec9-kube-api-access-r6xbj\") pod \"48a5ff37-e968-4f0a-8281-b9c57b754ec9\" (UID: \"48a5ff37-e968-4f0a-8281-b9c57b754ec9\") " Dec 06 05:59:55 crc kubenswrapper[4733]: I1206 05:59:55.409443 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/48a5ff37-e968-4f0a-8281-b9c57b754ec9-ovsdbserver-sb\") pod \"48a5ff37-e968-4f0a-8281-b9c57b754ec9\" (UID: \"48a5ff37-e968-4f0a-8281-b9c57b754ec9\") " Dec 06 05:59:55 crc kubenswrapper[4733]: I1206 05:59:55.409464 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48a5ff37-e968-4f0a-8281-b9c57b754ec9-config\") pod \"48a5ff37-e968-4f0a-8281-b9c57b754ec9\" (UID: \"48a5ff37-e968-4f0a-8281-b9c57b754ec9\") " Dec 06 05:59:55 crc kubenswrapper[4733]: I1206 05:59:55.409495 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/48a5ff37-e968-4f0a-8281-b9c57b754ec9-dns-swift-storage-0\") pod \"48a5ff37-e968-4f0a-8281-b9c57b754ec9\" (UID: \"48a5ff37-e968-4f0a-8281-b9c57b754ec9\") " Dec 06 05:59:55 crc kubenswrapper[4733]: I1206 
05:59:55.411123 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48a5ff37-e968-4f0a-8281-b9c57b754ec9-config" (OuterVolumeSpecName: "config") pod "48a5ff37-e968-4f0a-8281-b9c57b754ec9" (UID: "48a5ff37-e968-4f0a-8281-b9c57b754ec9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:59:55 crc kubenswrapper[4733]: I1206 05:59:55.411482 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48a5ff37-e968-4f0a-8281-b9c57b754ec9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "48a5ff37-e968-4f0a-8281-b9c57b754ec9" (UID: "48a5ff37-e968-4f0a-8281-b9c57b754ec9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:59:55 crc kubenswrapper[4733]: I1206 05:59:55.411791 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48a5ff37-e968-4f0a-8281-b9c57b754ec9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "48a5ff37-e968-4f0a-8281-b9c57b754ec9" (UID: "48a5ff37-e968-4f0a-8281-b9c57b754ec9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:59:55 crc kubenswrapper[4733]: I1206 05:59:55.411926 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48a5ff37-e968-4f0a-8281-b9c57b754ec9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "48a5ff37-e968-4f0a-8281-b9c57b754ec9" (UID: "48a5ff37-e968-4f0a-8281-b9c57b754ec9"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:59:55 crc kubenswrapper[4733]: I1206 05:59:55.412387 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48a5ff37-e968-4f0a-8281-b9c57b754ec9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "48a5ff37-e968-4f0a-8281-b9c57b754ec9" (UID: "48a5ff37-e968-4f0a-8281-b9c57b754ec9"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 05:59:55 crc kubenswrapper[4733]: I1206 05:59:55.414823 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48a5ff37-e968-4f0a-8281-b9c57b754ec9-kube-api-access-r6xbj" (OuterVolumeSpecName: "kube-api-access-r6xbj") pod "48a5ff37-e968-4f0a-8281-b9c57b754ec9" (UID: "48a5ff37-e968-4f0a-8281-b9c57b754ec9"). InnerVolumeSpecName "kube-api-access-r6xbj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 05:59:55 crc kubenswrapper[4733]: I1206 05:59:55.512362 4733 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/48a5ff37-e968-4f0a-8281-b9c57b754ec9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:55 crc kubenswrapper[4733]: I1206 05:59:55.512392 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48a5ff37-e968-4f0a-8281-b9c57b754ec9-config\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:55 crc kubenswrapper[4733]: I1206 05:59:55.512404 4733 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/48a5ff37-e968-4f0a-8281-b9c57b754ec9-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:55 crc kubenswrapper[4733]: I1206 05:59:55.512413 4733 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/48a5ff37-e968-4f0a-8281-b9c57b754ec9-dns-svc\") on 
node \"crc\" DevicePath \"\"" Dec 06 05:59:55 crc kubenswrapper[4733]: I1206 05:59:55.512434 4733 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/48a5ff37-e968-4f0a-8281-b9c57b754ec9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:55 crc kubenswrapper[4733]: I1206 05:59:55.512443 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6xbj\" (UniqueName: \"kubernetes.io/projected/48a5ff37-e968-4f0a-8281-b9c57b754ec9-kube-api-access-r6xbj\") on node \"crc\" DevicePath \"\"" Dec 06 05:59:55 crc kubenswrapper[4733]: I1206 05:59:55.571406 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-8696d9b56-5s4w8"] Dec 06 05:59:55 crc kubenswrapper[4733]: I1206 05:59:55.608459 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 06 05:59:55 crc kubenswrapper[4733]: W1206 05:59:55.662436 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e839961_eb72_4d81_baf8_b49f103a8ca0.slice/crio-17a2414d03156d7a2ed7cfc590051f14de6092cdab20babd5e4791251c4ea776 WatchSource:0}: Error finding container 17a2414d03156d7a2ed7cfc590051f14de6092cdab20babd5e4791251c4ea776: Status 404 returned error can't find the container with id 17a2414d03156d7a2ed7cfc590051f14de6092cdab20babd5e4791251c4ea776 Dec 06 05:59:55 crc kubenswrapper[4733]: W1206 05:59:55.664798 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0edecbb2_ca0e_46f0_b142_19aaf7aa25ca.slice/crio-93443507e189a2cd4cda7583e89efd0925b125f79663d977dddd14d3837f8ab0 WatchSource:0}: Error finding container 93443507e189a2cd4cda7583e89efd0925b125f79663d977dddd14d3837f8ab0: Status 404 returned error can't find the container with id 93443507e189a2cd4cda7583e89efd0925b125f79663d977dddd14d3837f8ab0 Dec 06 05:59:56 crc 
kubenswrapper[4733]: I1206 05:59:56.037703 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-fff9b86f5-qw8vr"] Dec 06 05:59:56 crc kubenswrapper[4733]: E1206 05:59:56.038395 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af780532-6391-4d7f-93d4-fb966d5e0434" containerName="init" Dec 06 05:59:56 crc kubenswrapper[4733]: I1206 05:59:56.038411 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="af780532-6391-4d7f-93d4-fb966d5e0434" containerName="init" Dec 06 05:59:56 crc kubenswrapper[4733]: E1206 05:59:56.038441 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a886728-ea9b-485c-844d-964614315b0d" containerName="keystone-bootstrap" Dec 06 05:59:56 crc kubenswrapper[4733]: I1206 05:59:56.038449 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a886728-ea9b-485c-844d-964614315b0d" containerName="keystone-bootstrap" Dec 06 05:59:56 crc kubenswrapper[4733]: E1206 05:59:56.038462 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af780532-6391-4d7f-93d4-fb966d5e0434" containerName="dnsmasq-dns" Dec 06 05:59:56 crc kubenswrapper[4733]: I1206 05:59:56.038471 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="af780532-6391-4d7f-93d4-fb966d5e0434" containerName="dnsmasq-dns" Dec 06 05:59:56 crc kubenswrapper[4733]: I1206 05:59:56.041365 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a886728-ea9b-485c-844d-964614315b0d" containerName="keystone-bootstrap" Dec 06 05:59:56 crc kubenswrapper[4733]: I1206 05:59:56.041407 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="af780532-6391-4d7f-93d4-fb966d5e0434" containerName="dnsmasq-dns" Dec 06 05:59:56 crc kubenswrapper[4733]: I1206 05:59:56.042038 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-fff9b86f5-qw8vr" Dec 06 05:59:56 crc kubenswrapper[4733]: I1206 05:59:56.046413 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Dec 06 05:59:56 crc kubenswrapper[4733]: I1206 05:59:56.046655 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 06 05:59:56 crc kubenswrapper[4733]: I1206 05:59:56.047106 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Dec 06 05:59:56 crc kubenswrapper[4733]: I1206 05:59:56.047168 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-d5nb6" Dec 06 05:59:56 crc kubenswrapper[4733]: I1206 05:59:56.047342 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 06 05:59:56 crc kubenswrapper[4733]: I1206 05:59:56.050466 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 06 05:59:56 crc kubenswrapper[4733]: I1206 05:59:56.052645 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-fff9b86f5-qw8vr"] Dec 06 05:59:56 crc kubenswrapper[4733]: I1206 05:59:56.142528 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4dfea320-4713-41d2-8d4a-ca371c346e9a-credential-keys\") pod \"keystone-fff9b86f5-qw8vr\" (UID: \"4dfea320-4713-41d2-8d4a-ca371c346e9a\") " pod="openstack/keystone-fff9b86f5-qw8vr" Dec 06 05:59:56 crc kubenswrapper[4733]: I1206 05:59:56.142604 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4dfea320-4713-41d2-8d4a-ca371c346e9a-public-tls-certs\") pod \"keystone-fff9b86f5-qw8vr\" (UID: \"4dfea320-4713-41d2-8d4a-ca371c346e9a\") " 
pod="openstack/keystone-fff9b86f5-qw8vr" Dec 06 05:59:56 crc kubenswrapper[4733]: I1206 05:59:56.142625 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4dfea320-4713-41d2-8d4a-ca371c346e9a-internal-tls-certs\") pod \"keystone-fff9b86f5-qw8vr\" (UID: \"4dfea320-4713-41d2-8d4a-ca371c346e9a\") " pod="openstack/keystone-fff9b86f5-qw8vr" Dec 06 05:59:56 crc kubenswrapper[4733]: I1206 05:59:56.142659 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4dfea320-4713-41d2-8d4a-ca371c346e9a-scripts\") pod \"keystone-fff9b86f5-qw8vr\" (UID: \"4dfea320-4713-41d2-8d4a-ca371c346e9a\") " pod="openstack/keystone-fff9b86f5-qw8vr" Dec 06 05:59:56 crc kubenswrapper[4733]: I1206 05:59:56.142691 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dfea320-4713-41d2-8d4a-ca371c346e9a-combined-ca-bundle\") pod \"keystone-fff9b86f5-qw8vr\" (UID: \"4dfea320-4713-41d2-8d4a-ca371c346e9a\") " pod="openstack/keystone-fff9b86f5-qw8vr" Dec 06 05:59:56 crc kubenswrapper[4733]: I1206 05:59:56.142720 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfkmf\" (UniqueName: \"kubernetes.io/projected/4dfea320-4713-41d2-8d4a-ca371c346e9a-kube-api-access-pfkmf\") pod \"keystone-fff9b86f5-qw8vr\" (UID: \"4dfea320-4713-41d2-8d4a-ca371c346e9a\") " pod="openstack/keystone-fff9b86f5-qw8vr" Dec 06 05:59:56 crc kubenswrapper[4733]: I1206 05:59:56.142734 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4dfea320-4713-41d2-8d4a-ca371c346e9a-fernet-keys\") pod \"keystone-fff9b86f5-qw8vr\" (UID: \"4dfea320-4713-41d2-8d4a-ca371c346e9a\") " 
pod="openstack/keystone-fff9b86f5-qw8vr" Dec 06 05:59:56 crc kubenswrapper[4733]: I1206 05:59:56.142794 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dfea320-4713-41d2-8d4a-ca371c346e9a-config-data\") pod \"keystone-fff9b86f5-qw8vr\" (UID: \"4dfea320-4713-41d2-8d4a-ca371c346e9a\") " pod="openstack/keystone-fff9b86f5-qw8vr" Dec 06 05:59:56 crc kubenswrapper[4733]: I1206 05:59:56.244603 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4dfea320-4713-41d2-8d4a-ca371c346e9a-credential-keys\") pod \"keystone-fff9b86f5-qw8vr\" (UID: \"4dfea320-4713-41d2-8d4a-ca371c346e9a\") " pod="openstack/keystone-fff9b86f5-qw8vr" Dec 06 05:59:56 crc kubenswrapper[4733]: I1206 05:59:56.244972 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4dfea320-4713-41d2-8d4a-ca371c346e9a-internal-tls-certs\") pod \"keystone-fff9b86f5-qw8vr\" (UID: \"4dfea320-4713-41d2-8d4a-ca371c346e9a\") " pod="openstack/keystone-fff9b86f5-qw8vr" Dec 06 05:59:56 crc kubenswrapper[4733]: I1206 05:59:56.244994 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4dfea320-4713-41d2-8d4a-ca371c346e9a-public-tls-certs\") pod \"keystone-fff9b86f5-qw8vr\" (UID: \"4dfea320-4713-41d2-8d4a-ca371c346e9a\") " pod="openstack/keystone-fff9b86f5-qw8vr" Dec 06 05:59:56 crc kubenswrapper[4733]: I1206 05:59:56.245029 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4dfea320-4713-41d2-8d4a-ca371c346e9a-scripts\") pod \"keystone-fff9b86f5-qw8vr\" (UID: \"4dfea320-4713-41d2-8d4a-ca371c346e9a\") " pod="openstack/keystone-fff9b86f5-qw8vr" Dec 06 05:59:56 crc kubenswrapper[4733]: I1206 
05:59:56.245062 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dfea320-4713-41d2-8d4a-ca371c346e9a-combined-ca-bundle\") pod \"keystone-fff9b86f5-qw8vr\" (UID: \"4dfea320-4713-41d2-8d4a-ca371c346e9a\") " pod="openstack/keystone-fff9b86f5-qw8vr" Dec 06 05:59:56 crc kubenswrapper[4733]: I1206 05:59:56.245101 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfkmf\" (UniqueName: \"kubernetes.io/projected/4dfea320-4713-41d2-8d4a-ca371c346e9a-kube-api-access-pfkmf\") pod \"keystone-fff9b86f5-qw8vr\" (UID: \"4dfea320-4713-41d2-8d4a-ca371c346e9a\") " pod="openstack/keystone-fff9b86f5-qw8vr" Dec 06 05:59:56 crc kubenswrapper[4733]: I1206 05:59:56.245117 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4dfea320-4713-41d2-8d4a-ca371c346e9a-fernet-keys\") pod \"keystone-fff9b86f5-qw8vr\" (UID: \"4dfea320-4713-41d2-8d4a-ca371c346e9a\") " pod="openstack/keystone-fff9b86f5-qw8vr" Dec 06 05:59:56 crc kubenswrapper[4733]: I1206 05:59:56.245159 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dfea320-4713-41d2-8d4a-ca371c346e9a-config-data\") pod \"keystone-fff9b86f5-qw8vr\" (UID: \"4dfea320-4713-41d2-8d4a-ca371c346e9a\") " pod="openstack/keystone-fff9b86f5-qw8vr" Dec 06 05:59:56 crc kubenswrapper[4733]: I1206 05:59:56.250227 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4dfea320-4713-41d2-8d4a-ca371c346e9a-fernet-keys\") pod \"keystone-fff9b86f5-qw8vr\" (UID: \"4dfea320-4713-41d2-8d4a-ca371c346e9a\") " pod="openstack/keystone-fff9b86f5-qw8vr" Dec 06 05:59:56 crc kubenswrapper[4733]: I1206 05:59:56.250253 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/4dfea320-4713-41d2-8d4a-ca371c346e9a-config-data\") pod \"keystone-fff9b86f5-qw8vr\" (UID: \"4dfea320-4713-41d2-8d4a-ca371c346e9a\") " pod="openstack/keystone-fff9b86f5-qw8vr" Dec 06 05:59:56 crc kubenswrapper[4733]: I1206 05:59:56.250621 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4dfea320-4713-41d2-8d4a-ca371c346e9a-scripts\") pod \"keystone-fff9b86f5-qw8vr\" (UID: \"4dfea320-4713-41d2-8d4a-ca371c346e9a\") " pod="openstack/keystone-fff9b86f5-qw8vr" Dec 06 05:59:56 crc kubenswrapper[4733]: I1206 05:59:56.251316 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dfea320-4713-41d2-8d4a-ca371c346e9a-combined-ca-bundle\") pod \"keystone-fff9b86f5-qw8vr\" (UID: \"4dfea320-4713-41d2-8d4a-ca371c346e9a\") " pod="openstack/keystone-fff9b86f5-qw8vr" Dec 06 05:59:56 crc kubenswrapper[4733]: I1206 05:59:56.251909 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4dfea320-4713-41d2-8d4a-ca371c346e9a-internal-tls-certs\") pod \"keystone-fff9b86f5-qw8vr\" (UID: \"4dfea320-4713-41d2-8d4a-ca371c346e9a\") " pod="openstack/keystone-fff9b86f5-qw8vr" Dec 06 05:59:56 crc kubenswrapper[4733]: I1206 05:59:56.252467 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4dfea320-4713-41d2-8d4a-ca371c346e9a-credential-keys\") pod \"keystone-fff9b86f5-qw8vr\" (UID: \"4dfea320-4713-41d2-8d4a-ca371c346e9a\") " pod="openstack/keystone-fff9b86f5-qw8vr" Dec 06 05:59:56 crc kubenswrapper[4733]: I1206 05:59:56.254010 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4dfea320-4713-41d2-8d4a-ca371c346e9a-public-tls-certs\") pod \"keystone-fff9b86f5-qw8vr\" (UID: 
\"4dfea320-4713-41d2-8d4a-ca371c346e9a\") " pod="openstack/keystone-fff9b86f5-qw8vr" Dec 06 05:59:56 crc kubenswrapper[4733]: I1206 05:59:56.259834 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfkmf\" (UniqueName: \"kubernetes.io/projected/4dfea320-4713-41d2-8d4a-ca371c346e9a-kube-api-access-pfkmf\") pod \"keystone-fff9b86f5-qw8vr\" (UID: \"4dfea320-4713-41d2-8d4a-ca371c346e9a\") " pod="openstack/keystone-fff9b86f5-qw8vr" Dec 06 05:59:56 crc kubenswrapper[4733]: I1206 05:59:56.283046 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a0063321-2625-4e9d-a536-38104f7d5879","Type":"ContainerStarted","Data":"584852523d5d205e570e24ef9ab94df6ea3bc964b7750689522895caac2f9eb1"} Dec 06 05:59:56 crc kubenswrapper[4733]: I1206 05:59:56.283096 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a0063321-2625-4e9d-a536-38104f7d5879","Type":"ContainerStarted","Data":"495e4e31d32d088c287120beeba2fca7f8c1caa0e3ca134c8216be79a4714852"} Dec 06 05:59:56 crc kubenswrapper[4733]: I1206 05:59:56.285397 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"fc5cb876-e7a9-4f7f-8ab8-fda582a40261","Type":"ContainerStarted","Data":"bd657ac8e0d14ddfbc80c97d25e7451da5467280cb581efa4b9ff5bc96df273f"} Dec 06 05:59:56 crc kubenswrapper[4733]: I1206 05:59:56.292916 4733 generic.go:334] "Generic (PLEG): container finished" podID="75df2e76-18b2-4bb7-8069-7636be9b1e46" containerID="f7c70437ae525c406ed3aa166edd47a17c592a9ea7810ce18bb85c0aefdb8697" exitCode=0 Dec 06 05:59:56 crc kubenswrapper[4733]: I1206 05:59:56.293053 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f44678f55-nshc4" event={"ID":"75df2e76-18b2-4bb7-8069-7636be9b1e46","Type":"ContainerDied","Data":"f7c70437ae525c406ed3aa166edd47a17c592a9ea7810ce18bb85c0aefdb8697"} Dec 06 05:59:56 crc 
kubenswrapper[4733]: I1206 05:59:56.293101 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f44678f55-nshc4" event={"ID":"75df2e76-18b2-4bb7-8069-7636be9b1e46","Type":"ContainerStarted","Data":"840cec72a42f5c9037cb98096adba8d7f53e23bdcb9d7c51bb67ae9433aea7de"} Dec 06 05:59:56 crc kubenswrapper[4733]: I1206 05:59:56.297853 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0edecbb2-ca0e-46f0-b142-19aaf7aa25ca","Type":"ContainerStarted","Data":"93443507e189a2cd4cda7583e89efd0925b125f79663d977dddd14d3837f8ab0"} Dec 06 05:59:56 crc kubenswrapper[4733]: I1206 05:59:56.300350 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8696d9b56-5s4w8" event={"ID":"2e839961-eb72-4d81-baf8-b49f103a8ca0","Type":"ContainerStarted","Data":"17a2414d03156d7a2ed7cfc590051f14de6092cdab20babd5e4791251c4ea776"} Dec 06 05:59:56 crc kubenswrapper[4733]: I1206 05:59:56.304024 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6484ff4846-p58m9" event={"ID":"0072aca6-dd00-419a-afa2-690fcc9712f7","Type":"ContainerStarted","Data":"03182fa81abaa155f96533f983619d5a70025b6016e941c60c2e51362d697ef5"} Dec 06 05:59:56 crc kubenswrapper[4733]: I1206 05:59:56.304059 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6484ff4846-p58m9" event={"ID":"0072aca6-dd00-419a-afa2-690fcc9712f7","Type":"ContainerStarted","Data":"3a88c137d7742462b8db5a5e1b34f51bbef6f313ea2179f039d50a9d1d731e67"} Dec 06 05:59:56 crc kubenswrapper[4733]: I1206 05:59:56.304074 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6484ff4846-p58m9" event={"ID":"0072aca6-dd00-419a-afa2-690fcc9712f7","Type":"ContainerStarted","Data":"8f7c3b434e7af52283e61b64ba5fa131c0bb12d38aac464c9f756956df4f77d2"} Dec 06 05:59:56 crc kubenswrapper[4733]: I1206 05:59:56.304684 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/barbican-api-6484ff4846-p58m9" Dec 06 05:59:56 crc kubenswrapper[4733]: I1206 05:59:56.304715 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6484ff4846-p58m9" Dec 06 05:59:56 crc kubenswrapper[4733]: I1206 05:59:56.307746 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6cb949947d-nv4s5" event={"ID":"c874c7fc-ab63-41e8-8e5d-921aa5f09e9e","Type":"ContainerStarted","Data":"f90f14d0143370600942233559a281aa0f5ab93efb3d458b24a338c63909017c"} Dec 06 05:59:56 crc kubenswrapper[4733]: I1206 05:59:56.311690 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c48ddf69-7gjfm" Dec 06 05:59:56 crc kubenswrapper[4733]: I1206 05:59:56.322038 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.322020257 podStartE2EDuration="4.322020257s" podCreationTimestamp="2025-12-06 05:59:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:59:56.303734332 +0000 UTC m=+980.168945443" watchObservedRunningTime="2025-12-06 05:59:56.322020257 +0000 UTC m=+980.187231368" Dec 06 05:59:56 crc kubenswrapper[4733]: I1206 05:59:56.361036 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6484ff4846-p58m9" podStartSLOduration=3.361019234 podStartE2EDuration="3.361019234s" podCreationTimestamp="2025-12-06 05:59:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 05:59:56.34221345 +0000 UTC m=+980.207424560" watchObservedRunningTime="2025-12-06 05:59:56.361019234 +0000 UTC m=+980.226230345" Dec 06 05:59:56 crc kubenswrapper[4733]: I1206 05:59:56.371028 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-fff9b86f5-qw8vr" Dec 06 05:59:56 crc kubenswrapper[4733]: I1206 05:59:56.376616 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57c48ddf69-7gjfm"] Dec 06 05:59:56 crc kubenswrapper[4733]: I1206 05:59:56.381141 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57c48ddf69-7gjfm"] Dec 06 05:59:56 crc kubenswrapper[4733]: I1206 05:59:56.509199 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48a5ff37-e968-4f0a-8281-b9c57b754ec9" path="/var/lib/kubelet/pods/48a5ff37-e968-4f0a-8281-b9c57b754ec9/volumes" Dec 06 05:59:56 crc kubenswrapper[4733]: I1206 05:59:56.510493 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af780532-6391-4d7f-93d4-fb966d5e0434" path="/var/lib/kubelet/pods/af780532-6391-4d7f-93d4-fb966d5e0434/volumes" Dec 06 05:59:56 crc kubenswrapper[4733]: I1206 05:59:56.636865 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 06 05:59:58 crc kubenswrapper[4733]: E1206 05:59:58.113932 4733 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod706be213_5f03_414a_bdeb_98af90de90f4.slice/crio-423a81d606ec6e867b74e7b758975d267201cff97ab41331b44d8e712972e2c0\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc8e93b6_7230_41f1_98f5_18b252d0d724.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod706be213_5f03_414a_bdeb_98af90de90f4.slice\": RecentStats: unable to find data in memory cache]" Dec 06 05:59:58 crc kubenswrapper[4733]: I1206 05:59:58.347975 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"0edecbb2-ca0e-46f0-b142-19aaf7aa25ca","Type":"ContainerStarted","Data":"485eebaf972e7a786035e5f091a396f32682836448db6f3f9ab51c3a700230dd"} Dec 06 05:59:59 crc kubenswrapper[4733]: I1206 05:59:59.779487 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7bf5d6f884-5w5rz"] Dec 06 05:59:59 crc kubenswrapper[4733]: I1206 05:59:59.781612 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7bf5d6f884-5w5rz" Dec 06 05:59:59 crc kubenswrapper[4733]: I1206 05:59:59.792488 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Dec 06 05:59:59 crc kubenswrapper[4733]: I1206 05:59:59.792607 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Dec 06 05:59:59 crc kubenswrapper[4733]: I1206 05:59:59.805844 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7bf5d6f884-5w5rz"] Dec 06 05:59:59 crc kubenswrapper[4733]: I1206 05:59:59.821066 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/126b09fd-ddf0-4e25-bfab-28f73ca04e50-logs\") pod \"barbican-api-7bf5d6f884-5w5rz\" (UID: \"126b09fd-ddf0-4e25-bfab-28f73ca04e50\") " pod="openstack/barbican-api-7bf5d6f884-5w5rz" Dec 06 05:59:59 crc kubenswrapper[4733]: I1206 05:59:59.821117 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/126b09fd-ddf0-4e25-bfab-28f73ca04e50-combined-ca-bundle\") pod \"barbican-api-7bf5d6f884-5w5rz\" (UID: \"126b09fd-ddf0-4e25-bfab-28f73ca04e50\") " pod="openstack/barbican-api-7bf5d6f884-5w5rz" Dec 06 05:59:59 crc kubenswrapper[4733]: I1206 05:59:59.821159 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/126b09fd-ddf0-4e25-bfab-28f73ca04e50-public-tls-certs\") pod \"barbican-api-7bf5d6f884-5w5rz\" (UID: \"126b09fd-ddf0-4e25-bfab-28f73ca04e50\") " pod="openstack/barbican-api-7bf5d6f884-5w5rz" Dec 06 05:59:59 crc kubenswrapper[4733]: I1206 05:59:59.821177 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/126b09fd-ddf0-4e25-bfab-28f73ca04e50-config-data\") pod \"barbican-api-7bf5d6f884-5w5rz\" (UID: \"126b09fd-ddf0-4e25-bfab-28f73ca04e50\") " pod="openstack/barbican-api-7bf5d6f884-5w5rz" Dec 06 05:59:59 crc kubenswrapper[4733]: I1206 05:59:59.821240 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nf2gt\" (UniqueName: \"kubernetes.io/projected/126b09fd-ddf0-4e25-bfab-28f73ca04e50-kube-api-access-nf2gt\") pod \"barbican-api-7bf5d6f884-5w5rz\" (UID: \"126b09fd-ddf0-4e25-bfab-28f73ca04e50\") " pod="openstack/barbican-api-7bf5d6f884-5w5rz" Dec 06 05:59:59 crc kubenswrapper[4733]: I1206 05:59:59.821287 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/126b09fd-ddf0-4e25-bfab-28f73ca04e50-config-data-custom\") pod \"barbican-api-7bf5d6f884-5w5rz\" (UID: \"126b09fd-ddf0-4e25-bfab-28f73ca04e50\") " pod="openstack/barbican-api-7bf5d6f884-5w5rz" Dec 06 05:59:59 crc kubenswrapper[4733]: I1206 05:59:59.821329 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/126b09fd-ddf0-4e25-bfab-28f73ca04e50-internal-tls-certs\") pod \"barbican-api-7bf5d6f884-5w5rz\" (UID: \"126b09fd-ddf0-4e25-bfab-28f73ca04e50\") " pod="openstack/barbican-api-7bf5d6f884-5w5rz" Dec 06 05:59:59 crc kubenswrapper[4733]: I1206 05:59:59.922867 4733 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-nf2gt\" (UniqueName: \"kubernetes.io/projected/126b09fd-ddf0-4e25-bfab-28f73ca04e50-kube-api-access-nf2gt\") pod \"barbican-api-7bf5d6f884-5w5rz\" (UID: \"126b09fd-ddf0-4e25-bfab-28f73ca04e50\") " pod="openstack/barbican-api-7bf5d6f884-5w5rz" Dec 06 05:59:59 crc kubenswrapper[4733]: I1206 05:59:59.922935 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/126b09fd-ddf0-4e25-bfab-28f73ca04e50-config-data-custom\") pod \"barbican-api-7bf5d6f884-5w5rz\" (UID: \"126b09fd-ddf0-4e25-bfab-28f73ca04e50\") " pod="openstack/barbican-api-7bf5d6f884-5w5rz" Dec 06 05:59:59 crc kubenswrapper[4733]: I1206 05:59:59.922955 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/126b09fd-ddf0-4e25-bfab-28f73ca04e50-internal-tls-certs\") pod \"barbican-api-7bf5d6f884-5w5rz\" (UID: \"126b09fd-ddf0-4e25-bfab-28f73ca04e50\") " pod="openstack/barbican-api-7bf5d6f884-5w5rz" Dec 06 05:59:59 crc kubenswrapper[4733]: I1206 05:59:59.923023 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/126b09fd-ddf0-4e25-bfab-28f73ca04e50-logs\") pod \"barbican-api-7bf5d6f884-5w5rz\" (UID: \"126b09fd-ddf0-4e25-bfab-28f73ca04e50\") " pod="openstack/barbican-api-7bf5d6f884-5w5rz" Dec 06 05:59:59 crc kubenswrapper[4733]: I1206 05:59:59.923056 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/126b09fd-ddf0-4e25-bfab-28f73ca04e50-combined-ca-bundle\") pod \"barbican-api-7bf5d6f884-5w5rz\" (UID: \"126b09fd-ddf0-4e25-bfab-28f73ca04e50\") " pod="openstack/barbican-api-7bf5d6f884-5w5rz" Dec 06 05:59:59 crc kubenswrapper[4733]: I1206 05:59:59.923097 4733 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/126b09fd-ddf0-4e25-bfab-28f73ca04e50-public-tls-certs\") pod \"barbican-api-7bf5d6f884-5w5rz\" (UID: \"126b09fd-ddf0-4e25-bfab-28f73ca04e50\") " pod="openstack/barbican-api-7bf5d6f884-5w5rz" Dec 06 05:59:59 crc kubenswrapper[4733]: I1206 05:59:59.923119 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/126b09fd-ddf0-4e25-bfab-28f73ca04e50-config-data\") pod \"barbican-api-7bf5d6f884-5w5rz\" (UID: \"126b09fd-ddf0-4e25-bfab-28f73ca04e50\") " pod="openstack/barbican-api-7bf5d6f884-5w5rz" Dec 06 05:59:59 crc kubenswrapper[4733]: I1206 05:59:59.925055 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/126b09fd-ddf0-4e25-bfab-28f73ca04e50-logs\") pod \"barbican-api-7bf5d6f884-5w5rz\" (UID: \"126b09fd-ddf0-4e25-bfab-28f73ca04e50\") " pod="openstack/barbican-api-7bf5d6f884-5w5rz" Dec 06 05:59:59 crc kubenswrapper[4733]: I1206 05:59:59.932025 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/126b09fd-ddf0-4e25-bfab-28f73ca04e50-config-data-custom\") pod \"barbican-api-7bf5d6f884-5w5rz\" (UID: \"126b09fd-ddf0-4e25-bfab-28f73ca04e50\") " pod="openstack/barbican-api-7bf5d6f884-5w5rz" Dec 06 05:59:59 crc kubenswrapper[4733]: I1206 05:59:59.932536 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/126b09fd-ddf0-4e25-bfab-28f73ca04e50-internal-tls-certs\") pod \"barbican-api-7bf5d6f884-5w5rz\" (UID: \"126b09fd-ddf0-4e25-bfab-28f73ca04e50\") " pod="openstack/barbican-api-7bf5d6f884-5w5rz" Dec 06 05:59:59 crc kubenswrapper[4733]: I1206 05:59:59.935038 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/126b09fd-ddf0-4e25-bfab-28f73ca04e50-public-tls-certs\") pod \"barbican-api-7bf5d6f884-5w5rz\" (UID: \"126b09fd-ddf0-4e25-bfab-28f73ca04e50\") " pod="openstack/barbican-api-7bf5d6f884-5w5rz" Dec 06 05:59:59 crc kubenswrapper[4733]: I1206 05:59:59.935639 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/126b09fd-ddf0-4e25-bfab-28f73ca04e50-combined-ca-bundle\") pod \"barbican-api-7bf5d6f884-5w5rz\" (UID: \"126b09fd-ddf0-4e25-bfab-28f73ca04e50\") " pod="openstack/barbican-api-7bf5d6f884-5w5rz" Dec 06 05:59:59 crc kubenswrapper[4733]: I1206 05:59:59.942364 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/126b09fd-ddf0-4e25-bfab-28f73ca04e50-config-data\") pod \"barbican-api-7bf5d6f884-5w5rz\" (UID: \"126b09fd-ddf0-4e25-bfab-28f73ca04e50\") " pod="openstack/barbican-api-7bf5d6f884-5w5rz" Dec 06 05:59:59 crc kubenswrapper[4733]: I1206 05:59:59.945199 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nf2gt\" (UniqueName: \"kubernetes.io/projected/126b09fd-ddf0-4e25-bfab-28f73ca04e50-kube-api-access-nf2gt\") pod \"barbican-api-7bf5d6f884-5w5rz\" (UID: \"126b09fd-ddf0-4e25-bfab-28f73ca04e50\") " pod="openstack/barbican-api-7bf5d6f884-5w5rz" Dec 06 06:00:00 crc kubenswrapper[4733]: I1206 06:00:00.114908 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7bf5d6f884-5w5rz" Dec 06 06:00:00 crc kubenswrapper[4733]: I1206 06:00:00.166893 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416680-cqp6q"] Dec 06 06:00:00 crc kubenswrapper[4733]: I1206 06:00:00.168885 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416680-cqp6q" Dec 06 06:00:00 crc kubenswrapper[4733]: I1206 06:00:00.172174 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 06 06:00:00 crc kubenswrapper[4733]: I1206 06:00:00.174346 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 06 06:00:00 crc kubenswrapper[4733]: I1206 06:00:00.188115 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416680-cqp6q"] Dec 06 06:00:00 crc kubenswrapper[4733]: I1206 06:00:00.331734 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8f6l\" (UniqueName: \"kubernetes.io/projected/3e9b14ad-76ee-43dc-b948-28abf700d584-kube-api-access-g8f6l\") pod \"collect-profiles-29416680-cqp6q\" (UID: \"3e9b14ad-76ee-43dc-b948-28abf700d584\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416680-cqp6q" Dec 06 06:00:00 crc kubenswrapper[4733]: I1206 06:00:00.331780 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3e9b14ad-76ee-43dc-b948-28abf700d584-config-volume\") pod \"collect-profiles-29416680-cqp6q\" (UID: \"3e9b14ad-76ee-43dc-b948-28abf700d584\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416680-cqp6q" Dec 06 06:00:00 crc kubenswrapper[4733]: I1206 06:00:00.331861 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3e9b14ad-76ee-43dc-b948-28abf700d584-secret-volume\") pod \"collect-profiles-29416680-cqp6q\" (UID: \"3e9b14ad-76ee-43dc-b948-28abf700d584\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29416680-cqp6q" Dec 06 06:00:00 crc kubenswrapper[4733]: I1206 06:00:00.377507 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8696d9b56-5s4w8" event={"ID":"2e839961-eb72-4d81-baf8-b49f103a8ca0","Type":"ContainerStarted","Data":"cb3713358625b836cf67463b86fbd86f54b218104557f8e576264e8b2785c525"} Dec 06 06:00:00 crc kubenswrapper[4733]: I1206 06:00:00.425477 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-fff9b86f5-qw8vr"] Dec 06 06:00:00 crc kubenswrapper[4733]: I1206 06:00:00.433747 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8f6l\" (UniqueName: \"kubernetes.io/projected/3e9b14ad-76ee-43dc-b948-28abf700d584-kube-api-access-g8f6l\") pod \"collect-profiles-29416680-cqp6q\" (UID: \"3e9b14ad-76ee-43dc-b948-28abf700d584\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416680-cqp6q" Dec 06 06:00:00 crc kubenswrapper[4733]: I1206 06:00:00.433814 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3e9b14ad-76ee-43dc-b948-28abf700d584-config-volume\") pod \"collect-profiles-29416680-cqp6q\" (UID: \"3e9b14ad-76ee-43dc-b948-28abf700d584\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416680-cqp6q" Dec 06 06:00:00 crc kubenswrapper[4733]: I1206 06:00:00.433981 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3e9b14ad-76ee-43dc-b948-28abf700d584-secret-volume\") pod \"collect-profiles-29416680-cqp6q\" (UID: \"3e9b14ad-76ee-43dc-b948-28abf700d584\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416680-cqp6q" Dec 06 06:00:00 crc kubenswrapper[4733]: I1206 06:00:00.434868 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/3e9b14ad-76ee-43dc-b948-28abf700d584-config-volume\") pod \"collect-profiles-29416680-cqp6q\" (UID: \"3e9b14ad-76ee-43dc-b948-28abf700d584\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416680-cqp6q" Dec 06 06:00:00 crc kubenswrapper[4733]: I1206 06:00:00.438527 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3e9b14ad-76ee-43dc-b948-28abf700d584-secret-volume\") pod \"collect-profiles-29416680-cqp6q\" (UID: \"3e9b14ad-76ee-43dc-b948-28abf700d584\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416680-cqp6q" Dec 06 06:00:00 crc kubenswrapper[4733]: I1206 06:00:00.447239 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8f6l\" (UniqueName: \"kubernetes.io/projected/3e9b14ad-76ee-43dc-b948-28abf700d584-kube-api-access-g8f6l\") pod \"collect-profiles-29416680-cqp6q\" (UID: \"3e9b14ad-76ee-43dc-b948-28abf700d584\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416680-cqp6q" Dec 06 06:00:00 crc kubenswrapper[4733]: W1206 06:00:00.465770 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4dfea320_4713_41d2_8d4a_ca371c346e9a.slice/crio-d81380daf1818eec9a25b14dcc9d1c47d1dbdb6156c0d072d1b6d3ca062a4b25 WatchSource:0}: Error finding container d81380daf1818eec9a25b14dcc9d1c47d1dbdb6156c0d072d1b6d3ca062a4b25: Status 404 returned error can't find the container with id d81380daf1818eec9a25b14dcc9d1c47d1dbdb6156c0d072d1b6d3ca062a4b25 Dec 06 06:00:00 crc kubenswrapper[4733]: I1206 06:00:00.493529 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416680-cqp6q" Dec 06 06:00:00 crc kubenswrapper[4733]: I1206 06:00:00.899198 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7bf5d6f884-5w5rz"] Dec 06 06:00:00 crc kubenswrapper[4733]: W1206 06:00:00.923756 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod126b09fd_ddf0_4e25_bfab_28f73ca04e50.slice/crio-3fc07605abfc6b8acc274e181d4f5048a3afec974b48bb83eddee97e4f725f04 WatchSource:0}: Error finding container 3fc07605abfc6b8acc274e181d4f5048a3afec974b48bb83eddee97e4f725f04: Status 404 returned error can't find the container with id 3fc07605abfc6b8acc274e181d4f5048a3afec974b48bb83eddee97e4f725f04 Dec 06 06:00:00 crc kubenswrapper[4733]: I1206 06:00:00.987817 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416680-cqp6q"] Dec 06 06:00:01 crc kubenswrapper[4733]: W1206 06:00:01.001321 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e9b14ad_76ee_43dc_b948_28abf700d584.slice/crio-d3f0c0ff813e4834e84a566c106e63061cbf21de42bebe5154915a90a6a4e261 WatchSource:0}: Error finding container d3f0c0ff813e4834e84a566c106e63061cbf21de42bebe5154915a90a6a4e261: Status 404 returned error can't find the container with id d3f0c0ff813e4834e84a566c106e63061cbf21de42bebe5154915a90a6a4e261 Dec 06 06:00:01 crc kubenswrapper[4733]: I1206 06:00:01.241037 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 06 06:00:01 crc kubenswrapper[4733]: I1206 06:00:01.241079 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 06 06:00:01 crc kubenswrapper[4733]: I1206 06:00:01.301391 4733 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 06 06:00:01 crc kubenswrapper[4733]: I1206 06:00:01.328841 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 06 06:00:01 crc kubenswrapper[4733]: I1206 06:00:01.407258 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5c84494675-5wvrl" event={"ID":"0a1b0724-0e18-475b-9f9f-c96bf13e371a","Type":"ContainerStarted","Data":"de777c933a3b1d316712b9dcf256f6de4016a8a10c8a54aa15b5cb8a0b1a8958"} Dec 06 06:00:01 crc kubenswrapper[4733]: I1206 06:00:01.407562 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5c84494675-5wvrl" event={"ID":"0a1b0724-0e18-475b-9f9f-c96bf13e371a","Type":"ContainerStarted","Data":"b34c6a1334b32f56ed01160b297326c4653a20730b04ea008304b7810e890d2e"} Dec 06 06:00:01 crc kubenswrapper[4733]: I1206 06:00:01.416639 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8696d9b56-5s4w8" event={"ID":"2e839961-eb72-4d81-baf8-b49f103a8ca0","Type":"ContainerStarted","Data":"a44a3b087a5daaf284a7556b4a4ec3f2172ca62e23a63397ab3528b5e5620ee8"} Dec 06 06:00:01 crc kubenswrapper[4733]: I1206 06:00:01.416706 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-8696d9b56-5s4w8" Dec 06 06:00:01 crc kubenswrapper[4733]: I1206 06:00:01.416870 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-8696d9b56-5s4w8" Dec 06 06:00:01 crc kubenswrapper[4733]: I1206 06:00:01.426212 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7bf5d6f884-5w5rz" event={"ID":"126b09fd-ddf0-4e25-bfab-28f73ca04e50","Type":"ContainerStarted","Data":"9f0d44f6207bf01bdaaa4c0d8017338bf4e7e05d4006b62353d83cdd6b5c76b1"} Dec 06 06:00:01 crc kubenswrapper[4733]: I1206 06:00:01.426241 4733 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/barbican-api-7bf5d6f884-5w5rz" event={"ID":"126b09fd-ddf0-4e25-bfab-28f73ca04e50","Type":"ContainerStarted","Data":"3fc07605abfc6b8acc274e181d4f5048a3afec974b48bb83eddee97e4f725f04"} Dec 06 06:00:01 crc kubenswrapper[4733]: I1206 06:00:01.437500 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416680-cqp6q" event={"ID":"3e9b14ad-76ee-43dc-b948-28abf700d584","Type":"ContainerStarted","Data":"50537d06f9cce557c4894c5d5829e780c4e32fde7833bde64bfb594171246825"} Dec 06 06:00:01 crc kubenswrapper[4733]: I1206 06:00:01.437539 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416680-cqp6q" event={"ID":"3e9b14ad-76ee-43dc-b948-28abf700d584","Type":"ContainerStarted","Data":"d3f0c0ff813e4834e84a566c106e63061cbf21de42bebe5154915a90a6a4e261"} Dec 06 06:00:01 crc kubenswrapper[4733]: I1206 06:00:01.441516 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-5c84494675-5wvrl" podStartSLOduration=3.099522702 podStartE2EDuration="8.441496727s" podCreationTimestamp="2025-12-06 05:59:53 +0000 UTC" firstStartedPulling="2025-12-06 05:59:54.987573735 +0000 UTC m=+978.852784846" lastFinishedPulling="2025-12-06 06:00:00.329547761 +0000 UTC m=+984.194758871" observedRunningTime="2025-12-06 06:00:01.433930689 +0000 UTC m=+985.299141791" watchObservedRunningTime="2025-12-06 06:00:01.441496727 +0000 UTC m=+985.306707837" Dec 06 06:00:01 crc kubenswrapper[4733]: I1206 06:00:01.442869 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"fc5cb876-e7a9-4f7f-8ab8-fda582a40261","Type":"ContainerStarted","Data":"da4140a0f4f2471d1286539796109993ea468d8334a17e2ab6693e5a0663f084"} Dec 06 06:00:01 crc kubenswrapper[4733]: I1206 06:00:01.447269 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f44678f55-nshc4" 
event={"ID":"75df2e76-18b2-4bb7-8069-7636be9b1e46","Type":"ContainerStarted","Data":"ba86db776b4568a6ae8f6860617f9499056be008b5ecc533c40269b52897a465"} Dec 06 06:00:01 crc kubenswrapper[4733]: I1206 06:00:01.447811 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6f44678f55-nshc4" Dec 06 06:00:01 crc kubenswrapper[4733]: I1206 06:00:01.457159 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0edecbb2-ca0e-46f0-b142-19aaf7aa25ca","Type":"ContainerStarted","Data":"cb4d89c9b33af0d50630dbd68d815ce9b13cf34e9840b8505bf25dfca5556dbc"} Dec 06 06:00:01 crc kubenswrapper[4733]: I1206 06:00:01.457292 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="0edecbb2-ca0e-46f0-b142-19aaf7aa25ca" containerName="cinder-api-log" containerID="cri-o://485eebaf972e7a786035e5f091a396f32682836448db6f3f9ab51c3a700230dd" gracePeriod=30 Dec 06 06:00:01 crc kubenswrapper[4733]: I1206 06:00:01.457549 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 06 06:00:01 crc kubenswrapper[4733]: I1206 06:00:01.457592 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="0edecbb2-ca0e-46f0-b142-19aaf7aa25ca" containerName="cinder-api" containerID="cri-o://cb4d89c9b33af0d50630dbd68d815ce9b13cf34e9840b8505bf25dfca5556dbc" gracePeriod=30 Dec 06 06:00:01 crc kubenswrapper[4733]: I1206 06:00:01.464057 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-fff9b86f5-qw8vr" event={"ID":"4dfea320-4713-41d2-8d4a-ca371c346e9a","Type":"ContainerStarted","Data":"0ed73666a175ead0361533520108fba74bf7baba65ad8c0be17d7215018ec830"} Dec 06 06:00:01 crc kubenswrapper[4733]: I1206 06:00:01.464173 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-fff9b86f5-qw8vr" 
event={"ID":"4dfea320-4713-41d2-8d4a-ca371c346e9a","Type":"ContainerStarted","Data":"d81380daf1818eec9a25b14dcc9d1c47d1dbdb6156c0d072d1b6d3ca062a4b25"} Dec 06 06:00:01 crc kubenswrapper[4733]: I1206 06:00:01.465089 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-fff9b86f5-qw8vr" Dec 06 06:00:01 crc kubenswrapper[4733]: I1206 06:00:01.475134 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6cb949947d-nv4s5" event={"ID":"c874c7fc-ab63-41e8-8e5d-921aa5f09e9e","Type":"ContainerStarted","Data":"7dfe71ad9aa25884f9f8fb874ed5f73cf78ebac835e7ba8984214839ee9099c2"} Dec 06 06:00:01 crc kubenswrapper[4733]: I1206 06:00:01.475178 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6cb949947d-nv4s5" event={"ID":"c874c7fc-ab63-41e8-8e5d-921aa5f09e9e","Type":"ContainerStarted","Data":"7642f5923a0d0b674ee3ed59c8c1983a3de6ace4c570b8f61db217c411cb1428"} Dec 06 06:00:01 crc kubenswrapper[4733]: I1206 06:00:01.478922 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-8696d9b56-5s4w8" podStartSLOduration=8.47890963 podStartE2EDuration="8.47890963s" podCreationTimestamp="2025-12-06 05:59:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:00:01.457692092 +0000 UTC m=+985.322903203" watchObservedRunningTime="2025-12-06 06:00:01.47890963 +0000 UTC m=+985.344120742" Dec 06 06:00:01 crc kubenswrapper[4733]: I1206 06:00:01.483688 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=7.483677456 podStartE2EDuration="7.483677456s" podCreationTimestamp="2025-12-06 05:59:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:00:01.47341101 +0000 UTC 
m=+985.338622121" watchObservedRunningTime="2025-12-06 06:00:01.483677456 +0000 UTC m=+985.348888567" Dec 06 06:00:01 crc kubenswrapper[4733]: I1206 06:00:01.489395 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"361a8a1d-f083-427f-a625-eca6a714b768","Type":"ContainerStarted","Data":"7a58b0c5c0003849fab31ab3e5339914ecc1e0e1f591a8811c2597551284b4ee"} Dec 06 06:00:01 crc kubenswrapper[4733]: I1206 06:00:01.491065 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 06 06:00:01 crc kubenswrapper[4733]: I1206 06:00:01.491091 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 06 06:00:01 crc kubenswrapper[4733]: I1206 06:00:01.519498 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6f44678f55-nshc4" podStartSLOduration=7.519484248 podStartE2EDuration="7.519484248s" podCreationTimestamp="2025-12-06 05:59:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:00:01.500620236 +0000 UTC m=+985.365831347" watchObservedRunningTime="2025-12-06 06:00:01.519484248 +0000 UTC m=+985.384695359" Dec 06 06:00:01 crc kubenswrapper[4733]: I1206 06:00:01.520106 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29416680-cqp6q" podStartSLOduration=1.520100808 podStartE2EDuration="1.520100808s" podCreationTimestamp="2025-12-06 06:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:00:01.515314418 +0000 UTC m=+985.380525529" watchObservedRunningTime="2025-12-06 06:00:01.520100808 +0000 UTC m=+985.385311920" Dec 06 06:00:01 crc kubenswrapper[4733]: I1206 06:00:01.604354 4733 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-6cb949947d-nv4s5" podStartSLOduration=3.329178107 podStartE2EDuration="8.604327632s" podCreationTimestamp="2025-12-06 05:59:53 +0000 UTC" firstStartedPulling="2025-12-06 05:59:55.209178142 +0000 UTC m=+979.074389253" lastFinishedPulling="2025-12-06 06:00:00.484327666 +0000 UTC m=+984.349538778" observedRunningTime="2025-12-06 06:00:01.54677633 +0000 UTC m=+985.411987442" watchObservedRunningTime="2025-12-06 06:00:01.604327632 +0000 UTC m=+985.469538743" Dec 06 06:00:01 crc kubenswrapper[4733]: I1206 06:00:01.607879 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-fff9b86f5-qw8vr" podStartSLOduration=5.607853423 podStartE2EDuration="5.607853423s" podCreationTimestamp="2025-12-06 05:59:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:00:01.578052503 +0000 UTC m=+985.443263604" watchObservedRunningTime="2025-12-06 06:00:01.607853423 +0000 UTC m=+985.473064534" Dec 06 06:00:02 crc kubenswrapper[4733]: I1206 06:00:02.026434 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-6484ff4846-p58m9" podUID="0072aca6-dd00-419a-afa2-690fcc9712f7" containerName="barbican-api" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 06:00:02 crc kubenswrapper[4733]: I1206 06:00:02.499016 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"fc5cb876-e7a9-4f7f-8ab8-fda582a40261","Type":"ContainerStarted","Data":"a16d648f280a5cca172151a546e67413119ffec788c640a4fd7d98d0a1ecd301"} Dec 06 06:00:02 crc kubenswrapper[4733]: I1206 06:00:02.502197 4733 generic.go:334] "Generic (PLEG): container finished" podID="0edecbb2-ca0e-46f0-b142-19aaf7aa25ca" containerID="485eebaf972e7a786035e5f091a396f32682836448db6f3f9ab51c3a700230dd" 
exitCode=143 Dec 06 06:00:02 crc kubenswrapper[4733]: I1206 06:00:02.502249 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0edecbb2-ca0e-46f0-b142-19aaf7aa25ca","Type":"ContainerDied","Data":"485eebaf972e7a786035e5f091a396f32682836448db6f3f9ab51c3a700230dd"} Dec 06 06:00:02 crc kubenswrapper[4733]: I1206 06:00:02.507154 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7bf5d6f884-5w5rz" event={"ID":"126b09fd-ddf0-4e25-bfab-28f73ca04e50","Type":"ContainerStarted","Data":"004a43110bd313ceec94ed816d650f8932cfaab2b26b0ac3810b20bc61ccf906"} Dec 06 06:00:02 crc kubenswrapper[4733]: I1206 06:00:02.507295 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7bf5d6f884-5w5rz" Dec 06 06:00:02 crc kubenswrapper[4733]: I1206 06:00:02.511202 4733 generic.go:334] "Generic (PLEG): container finished" podID="3e9b14ad-76ee-43dc-b948-28abf700d584" containerID="50537d06f9cce557c4894c5d5829e780c4e32fde7833bde64bfb594171246825" exitCode=0 Dec 06 06:00:02 crc kubenswrapper[4733]: I1206 06:00:02.511317 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416680-cqp6q" event={"ID":"3e9b14ad-76ee-43dc-b948-28abf700d584","Type":"ContainerDied","Data":"50537d06f9cce557c4894c5d5829e780c4e32fde7833bde64bfb594171246825"} Dec 06 06:00:02 crc kubenswrapper[4733]: I1206 06:00:02.515263 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.295597844 podStartE2EDuration="8.515251812s" podCreationTimestamp="2025-12-06 05:59:54 +0000 UTC" firstStartedPulling="2025-12-06 05:59:55.277196707 +0000 UTC m=+979.142407818" lastFinishedPulling="2025-12-06 06:00:00.496850675 +0000 UTC m=+984.362061786" observedRunningTime="2025-12-06 06:00:02.512793509 +0000 UTC m=+986.378004610" watchObservedRunningTime="2025-12-06 06:00:02.515251812 +0000 UTC 
m=+986.380462923" Dec 06 06:00:02 crc kubenswrapper[4733]: I1206 06:00:02.544402 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7bf5d6f884-5w5rz" podStartSLOduration=3.544393161 podStartE2EDuration="3.544393161s" podCreationTimestamp="2025-12-06 05:59:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:00:02.541181371 +0000 UTC m=+986.406392482" watchObservedRunningTime="2025-12-06 06:00:02.544393161 +0000 UTC m=+986.409604272" Dec 06 06:00:03 crc kubenswrapper[4733]: I1206 06:00:03.128433 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 06 06:00:03 crc kubenswrapper[4733]: I1206 06:00:03.129223 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 06 06:00:03 crc kubenswrapper[4733]: I1206 06:00:03.289594 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 06 06:00:03 crc kubenswrapper[4733]: I1206 06:00:03.289642 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 06 06:00:03 crc kubenswrapper[4733]: I1206 06:00:03.322728 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 06 06:00:03 crc kubenswrapper[4733]: I1206 06:00:03.334042 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 06 06:00:03 crc kubenswrapper[4733]: I1206 06:00:03.523182 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 06 06:00:03 crc kubenswrapper[4733]: I1206 06:00:03.523224 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/glance-default-internal-api-0" Dec 06 06:00:03 crc kubenswrapper[4733]: I1206 06:00:03.523234 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7bf5d6f884-5w5rz" Dec 06 06:00:03 crc kubenswrapper[4733]: I1206 06:00:03.877680 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416680-cqp6q" Dec 06 06:00:03 crc kubenswrapper[4733]: I1206 06:00:03.934428 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8f6l\" (UniqueName: \"kubernetes.io/projected/3e9b14ad-76ee-43dc-b948-28abf700d584-kube-api-access-g8f6l\") pod \"3e9b14ad-76ee-43dc-b948-28abf700d584\" (UID: \"3e9b14ad-76ee-43dc-b948-28abf700d584\") " Dec 06 06:00:03 crc kubenswrapper[4733]: I1206 06:00:03.934572 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3e9b14ad-76ee-43dc-b948-28abf700d584-config-volume\") pod \"3e9b14ad-76ee-43dc-b948-28abf700d584\" (UID: \"3e9b14ad-76ee-43dc-b948-28abf700d584\") " Dec 06 06:00:03 crc kubenswrapper[4733]: I1206 06:00:03.934664 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3e9b14ad-76ee-43dc-b948-28abf700d584-secret-volume\") pod \"3e9b14ad-76ee-43dc-b948-28abf700d584\" (UID: \"3e9b14ad-76ee-43dc-b948-28abf700d584\") " Dec 06 06:00:03 crc kubenswrapper[4733]: I1206 06:00:03.938329 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e9b14ad-76ee-43dc-b948-28abf700d584-config-volume" (OuterVolumeSpecName: "config-volume") pod "3e9b14ad-76ee-43dc-b948-28abf700d584" (UID: "3e9b14ad-76ee-43dc-b948-28abf700d584"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:00:03 crc kubenswrapper[4733]: I1206 06:00:03.943694 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e9b14ad-76ee-43dc-b948-28abf700d584-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3e9b14ad-76ee-43dc-b948-28abf700d584" (UID: "3e9b14ad-76ee-43dc-b948-28abf700d584"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:00:03 crc kubenswrapper[4733]: I1206 06:00:03.948631 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e9b14ad-76ee-43dc-b948-28abf700d584-kube-api-access-g8f6l" (OuterVolumeSpecName: "kube-api-access-g8f6l") pod "3e9b14ad-76ee-43dc-b948-28abf700d584" (UID: "3e9b14ad-76ee-43dc-b948-28abf700d584"). InnerVolumeSpecName "kube-api-access-g8f6l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:00:04 crc kubenswrapper[4733]: I1206 06:00:04.037324 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8f6l\" (UniqueName: \"kubernetes.io/projected/3e9b14ad-76ee-43dc-b948-28abf700d584-kube-api-access-g8f6l\") on node \"crc\" DevicePath \"\"" Dec 06 06:00:04 crc kubenswrapper[4733]: I1206 06:00:04.037353 4733 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3e9b14ad-76ee-43dc-b948-28abf700d584-config-volume\") on node \"crc\" DevicePath \"\"" Dec 06 06:00:04 crc kubenswrapper[4733]: I1206 06:00:04.037362 4733 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3e9b14ad-76ee-43dc-b948-28abf700d584-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 06 06:00:04 crc kubenswrapper[4733]: I1206 06:00:04.553632 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416680-cqp6q" 
event={"ID":"3e9b14ad-76ee-43dc-b948-28abf700d584","Type":"ContainerDied","Data":"d3f0c0ff813e4834e84a566c106e63061cbf21de42bebe5154915a90a6a4e261"} Dec 06 06:00:04 crc kubenswrapper[4733]: I1206 06:00:04.555044 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3f0c0ff813e4834e84a566c106e63061cbf21de42bebe5154915a90a6a4e261" Dec 06 06:00:04 crc kubenswrapper[4733]: I1206 06:00:04.555209 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416680-cqp6q" Dec 06 06:00:04 crc kubenswrapper[4733]: I1206 06:00:04.575789 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 06 06:00:05 crc kubenswrapper[4733]: I1206 06:00:05.153871 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 06 06:00:05 crc kubenswrapper[4733]: I1206 06:00:05.170735 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 06 06:00:05 crc kubenswrapper[4733]: I1206 06:00:05.623165 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6484ff4846-p58m9" Dec 06 06:00:05 crc kubenswrapper[4733]: I1206 06:00:05.784795 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6484ff4846-p58m9" Dec 06 06:00:08 crc kubenswrapper[4733]: E1206 06:00:08.338072 4733 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod706be213_5f03_414a_bdeb_98af90de90f4.slice/crio-423a81d606ec6e867b74e7b758975d267201cff97ab41331b44d8e712972e2c0\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc8e93b6_7230_41f1_98f5_18b252d0d724.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod706be213_5f03_414a_bdeb_98af90de90f4.slice\": RecentStats: unable to find data in memory cache]" Dec 06 06:00:09 crc kubenswrapper[4733]: I1206 06:00:09.686495 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6f44678f55-nshc4" Dec 06 06:00:09 crc kubenswrapper[4733]: I1206 06:00:09.739820 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-779f467bc5-7chvb"] Dec 06 06:00:09 crc kubenswrapper[4733]: I1206 06:00:09.740059 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-779f467bc5-7chvb" podUID="87d7b344-b58b-459a-89b3-7f09319c5a73" containerName="dnsmasq-dns" containerID="cri-o://06b6e1c69b05a774cbc225126d6dd6c5233dc87cedf303d08b824602e2807067" gracePeriod=10 Dec 06 06:00:09 crc kubenswrapper[4733]: I1206 06:00:09.838659 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 06 06:00:09 crc kubenswrapper[4733]: I1206 06:00:09.895159 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 06 06:00:10 crc kubenswrapper[4733]: I1206 06:00:10.188721 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-779f467bc5-7chvb" Dec 06 06:00:10 crc kubenswrapper[4733]: I1206 06:00:10.274981 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/87d7b344-b58b-459a-89b3-7f09319c5a73-dns-swift-storage-0\") pod \"87d7b344-b58b-459a-89b3-7f09319c5a73\" (UID: \"87d7b344-b58b-459a-89b3-7f09319c5a73\") " Dec 06 06:00:10 crc kubenswrapper[4733]: I1206 06:00:10.275099 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87d7b344-b58b-459a-89b3-7f09319c5a73-ovsdbserver-nb\") pod \"87d7b344-b58b-459a-89b3-7f09319c5a73\" (UID: \"87d7b344-b58b-459a-89b3-7f09319c5a73\") " Dec 06 06:00:10 crc kubenswrapper[4733]: I1206 06:00:10.275156 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8g2x\" (UniqueName: \"kubernetes.io/projected/87d7b344-b58b-459a-89b3-7f09319c5a73-kube-api-access-r8g2x\") pod \"87d7b344-b58b-459a-89b3-7f09319c5a73\" (UID: \"87d7b344-b58b-459a-89b3-7f09319c5a73\") " Dec 06 06:00:10 crc kubenswrapper[4733]: I1206 06:00:10.275250 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87d7b344-b58b-459a-89b3-7f09319c5a73-ovsdbserver-sb\") pod \"87d7b344-b58b-459a-89b3-7f09319c5a73\" (UID: \"87d7b344-b58b-459a-89b3-7f09319c5a73\") " Dec 06 06:00:10 crc kubenswrapper[4733]: I1206 06:00:10.275326 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87d7b344-b58b-459a-89b3-7f09319c5a73-config\") pod \"87d7b344-b58b-459a-89b3-7f09319c5a73\" (UID: \"87d7b344-b58b-459a-89b3-7f09319c5a73\") " Dec 06 06:00:10 crc kubenswrapper[4733]: I1206 06:00:10.275400 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87d7b344-b58b-459a-89b3-7f09319c5a73-dns-svc\") pod \"87d7b344-b58b-459a-89b3-7f09319c5a73\" (UID: \"87d7b344-b58b-459a-89b3-7f09319c5a73\") " Dec 06 06:00:10 crc kubenswrapper[4733]: I1206 06:00:10.293103 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87d7b344-b58b-459a-89b3-7f09319c5a73-kube-api-access-r8g2x" (OuterVolumeSpecName: "kube-api-access-r8g2x") pod "87d7b344-b58b-459a-89b3-7f09319c5a73" (UID: "87d7b344-b58b-459a-89b3-7f09319c5a73"). InnerVolumeSpecName "kube-api-access-r8g2x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:00:10 crc kubenswrapper[4733]: I1206 06:00:10.319332 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87d7b344-b58b-459a-89b3-7f09319c5a73-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "87d7b344-b58b-459a-89b3-7f09319c5a73" (UID: "87d7b344-b58b-459a-89b3-7f09319c5a73"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:00:10 crc kubenswrapper[4733]: I1206 06:00:10.324107 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87d7b344-b58b-459a-89b3-7f09319c5a73-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "87d7b344-b58b-459a-89b3-7f09319c5a73" (UID: "87d7b344-b58b-459a-89b3-7f09319c5a73"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:00:10 crc kubenswrapper[4733]: I1206 06:00:10.328752 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87d7b344-b58b-459a-89b3-7f09319c5a73-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "87d7b344-b58b-459a-89b3-7f09319c5a73" (UID: "87d7b344-b58b-459a-89b3-7f09319c5a73"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:00:10 crc kubenswrapper[4733]: I1206 06:00:10.329095 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87d7b344-b58b-459a-89b3-7f09319c5a73-config" (OuterVolumeSpecName: "config") pod "87d7b344-b58b-459a-89b3-7f09319c5a73" (UID: "87d7b344-b58b-459a-89b3-7f09319c5a73"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:00:10 crc kubenswrapper[4733]: I1206 06:00:10.333682 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87d7b344-b58b-459a-89b3-7f09319c5a73-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "87d7b344-b58b-459a-89b3-7f09319c5a73" (UID: "87d7b344-b58b-459a-89b3-7f09319c5a73"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:00:10 crc kubenswrapper[4733]: I1206 06:00:10.374017 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-544f978b4d-7s676" Dec 06 06:00:10 crc kubenswrapper[4733]: I1206 06:00:10.383808 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87d7b344-b58b-459a-89b3-7f09319c5a73-config\") on node \"crc\" DevicePath \"\"" Dec 06 06:00:10 crc kubenswrapper[4733]: I1206 06:00:10.383837 4733 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87d7b344-b58b-459a-89b3-7f09319c5a73-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 06:00:10 crc kubenswrapper[4733]: I1206 06:00:10.383846 4733 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/87d7b344-b58b-459a-89b3-7f09319c5a73-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 06 06:00:10 crc kubenswrapper[4733]: I1206 06:00:10.383855 4733 reconciler_common.go:293] "Volume detached for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87d7b344-b58b-459a-89b3-7f09319c5a73-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 06 06:00:10 crc kubenswrapper[4733]: I1206 06:00:10.383862 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8g2x\" (UniqueName: \"kubernetes.io/projected/87d7b344-b58b-459a-89b3-7f09319c5a73-kube-api-access-r8g2x\") on node \"crc\" DevicePath \"\"" Dec 06 06:00:10 crc kubenswrapper[4733]: I1206 06:00:10.383870 4733 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87d7b344-b58b-459a-89b3-7f09319c5a73-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 06 06:00:10 crc kubenswrapper[4733]: I1206 06:00:10.613580 4733 generic.go:334] "Generic (PLEG): container finished" podID="87d7b344-b58b-459a-89b3-7f09319c5a73" containerID="06b6e1c69b05a774cbc225126d6dd6c5233dc87cedf303d08b824602e2807067" exitCode=0 Dec 06 06:00:10 crc kubenswrapper[4733]: I1206 06:00:10.613624 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-779f467bc5-7chvb" event={"ID":"87d7b344-b58b-459a-89b3-7f09319c5a73","Type":"ContainerDied","Data":"06b6e1c69b05a774cbc225126d6dd6c5233dc87cedf303d08b824602e2807067"} Dec 06 06:00:10 crc kubenswrapper[4733]: I1206 06:00:10.613679 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-779f467bc5-7chvb" Dec 06 06:00:10 crc kubenswrapper[4733]: I1206 06:00:10.614068 4733 scope.go:117] "RemoveContainer" containerID="06b6e1c69b05a774cbc225126d6dd6c5233dc87cedf303d08b824602e2807067" Dec 06 06:00:10 crc kubenswrapper[4733]: I1206 06:00:10.614046 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-779f467bc5-7chvb" event={"ID":"87d7b344-b58b-459a-89b3-7f09319c5a73","Type":"ContainerDied","Data":"a549474bacfbcd0abcfdef93e22a1b04ec51d95f421db8add5ebf223e1c6de44"} Dec 06 06:00:10 crc kubenswrapper[4733]: I1206 06:00:10.614386 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="fc5cb876-e7a9-4f7f-8ab8-fda582a40261" containerName="cinder-scheduler" containerID="cri-o://da4140a0f4f2471d1286539796109993ea468d8334a17e2ab6693e5a0663f084" gracePeriod=30 Dec 06 06:00:10 crc kubenswrapper[4733]: I1206 06:00:10.614442 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="fc5cb876-e7a9-4f7f-8ab8-fda582a40261" containerName="probe" containerID="cri-o://a16d648f280a5cca172151a546e67413119ffec788c640a4fd7d98d0a1ecd301" gracePeriod=30 Dec 06 06:00:10 crc kubenswrapper[4733]: I1206 06:00:10.644468 4733 scope.go:117] "RemoveContainer" containerID="e87b0ac08db170bd3e2b5c1656644187d0c0de7faab6f4116961babd726a799f" Dec 06 06:00:10 crc kubenswrapper[4733]: I1206 06:00:10.645903 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-779f467bc5-7chvb"] Dec 06 06:00:10 crc kubenswrapper[4733]: I1206 06:00:10.656769 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-779f467bc5-7chvb"] Dec 06 06:00:10 crc kubenswrapper[4733]: I1206 06:00:10.666840 4733 scope.go:117] "RemoveContainer" containerID="06b6e1c69b05a774cbc225126d6dd6c5233dc87cedf303d08b824602e2807067" Dec 06 06:00:10 crc kubenswrapper[4733]: E1206 
06:00:10.667165 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06b6e1c69b05a774cbc225126d6dd6c5233dc87cedf303d08b824602e2807067\": container with ID starting with 06b6e1c69b05a774cbc225126d6dd6c5233dc87cedf303d08b824602e2807067 not found: ID does not exist" containerID="06b6e1c69b05a774cbc225126d6dd6c5233dc87cedf303d08b824602e2807067" Dec 06 06:00:10 crc kubenswrapper[4733]: I1206 06:00:10.667261 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06b6e1c69b05a774cbc225126d6dd6c5233dc87cedf303d08b824602e2807067"} err="failed to get container status \"06b6e1c69b05a774cbc225126d6dd6c5233dc87cedf303d08b824602e2807067\": rpc error: code = NotFound desc = could not find container \"06b6e1c69b05a774cbc225126d6dd6c5233dc87cedf303d08b824602e2807067\": container with ID starting with 06b6e1c69b05a774cbc225126d6dd6c5233dc87cedf303d08b824602e2807067 not found: ID does not exist" Dec 06 06:00:10 crc kubenswrapper[4733]: I1206 06:00:10.667357 4733 scope.go:117] "RemoveContainer" containerID="e87b0ac08db170bd3e2b5c1656644187d0c0de7faab6f4116961babd726a799f" Dec 06 06:00:10 crc kubenswrapper[4733]: E1206 06:00:10.667678 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e87b0ac08db170bd3e2b5c1656644187d0c0de7faab6f4116961babd726a799f\": container with ID starting with e87b0ac08db170bd3e2b5c1656644187d0c0de7faab6f4116961babd726a799f not found: ID does not exist" containerID="e87b0ac08db170bd3e2b5c1656644187d0c0de7faab6f4116961babd726a799f" Dec 06 06:00:10 crc kubenswrapper[4733]: I1206 06:00:10.667754 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e87b0ac08db170bd3e2b5c1656644187d0c0de7faab6f4116961babd726a799f"} err="failed to get container status \"e87b0ac08db170bd3e2b5c1656644187d0c0de7faab6f4116961babd726a799f\": rpc 
error: code = NotFound desc = could not find container \"e87b0ac08db170bd3e2b5c1656644187d0c0de7faab6f4116961babd726a799f\": container with ID starting with e87b0ac08db170bd3e2b5c1656644187d0c0de7faab6f4116961babd726a799f not found: ID does not exist" Dec 06 06:00:11 crc kubenswrapper[4733]: I1206 06:00:11.324856 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7bf5d6f884-5w5rz" Dec 06 06:00:11 crc kubenswrapper[4733]: I1206 06:00:11.334373 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7bf5d6f884-5w5rz" Dec 06 06:00:11 crc kubenswrapper[4733]: I1206 06:00:11.394403 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6484ff4846-p58m9"] Dec 06 06:00:11 crc kubenswrapper[4733]: I1206 06:00:11.394633 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6484ff4846-p58m9" podUID="0072aca6-dd00-419a-afa2-690fcc9712f7" containerName="barbican-api-log" containerID="cri-o://3a88c137d7742462b8db5a5e1b34f51bbef6f313ea2179f039d50a9d1d731e67" gracePeriod=30 Dec 06 06:00:11 crc kubenswrapper[4733]: I1206 06:00:11.395007 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6484ff4846-p58m9" podUID="0072aca6-dd00-419a-afa2-690fcc9712f7" containerName="barbican-api" containerID="cri-o://03182fa81abaa155f96533f983619d5a70025b6016e941c60c2e51362d697ef5" gracePeriod=30 Dec 06 06:00:11 crc kubenswrapper[4733]: I1206 06:00:11.622769 4733 generic.go:334] "Generic (PLEG): container finished" podID="fc5cb876-e7a9-4f7f-8ab8-fda582a40261" containerID="a16d648f280a5cca172151a546e67413119ffec788c640a4fd7d98d0a1ecd301" exitCode=0 Dec 06 06:00:11 crc kubenswrapper[4733]: I1206 06:00:11.622834 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"fc5cb876-e7a9-4f7f-8ab8-fda582a40261","Type":"ContainerDied","Data":"a16d648f280a5cca172151a546e67413119ffec788c640a4fd7d98d0a1ecd301"} Dec 06 06:00:11 crc kubenswrapper[4733]: I1206 06:00:11.625007 4733 generic.go:334] "Generic (PLEG): container finished" podID="0072aca6-dd00-419a-afa2-690fcc9712f7" containerID="3a88c137d7742462b8db5a5e1b34f51bbef6f313ea2179f039d50a9d1d731e67" exitCode=143 Dec 06 06:00:11 crc kubenswrapper[4733]: I1206 06:00:11.625080 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6484ff4846-p58m9" event={"ID":"0072aca6-dd00-419a-afa2-690fcc9712f7","Type":"ContainerDied","Data":"3a88c137d7742462b8db5a5e1b34f51bbef6f313ea2179f039d50a9d1d731e67"} Dec 06 06:00:11 crc kubenswrapper[4733]: I1206 06:00:11.664107 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 06 06:00:12 crc kubenswrapper[4733]: I1206 06:00:12.256838 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 06 06:00:12 crc kubenswrapper[4733]: I1206 06:00:12.329125 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4c6pk\" (UniqueName: \"kubernetes.io/projected/fc5cb876-e7a9-4f7f-8ab8-fda582a40261-kube-api-access-4c6pk\") pod \"fc5cb876-e7a9-4f7f-8ab8-fda582a40261\" (UID: \"fc5cb876-e7a9-4f7f-8ab8-fda582a40261\") " Dec 06 06:00:12 crc kubenswrapper[4733]: I1206 06:00:12.330040 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fc5cb876-e7a9-4f7f-8ab8-fda582a40261-etc-machine-id\") pod \"fc5cb876-e7a9-4f7f-8ab8-fda582a40261\" (UID: \"fc5cb876-e7a9-4f7f-8ab8-fda582a40261\") " Dec 06 06:00:12 crc kubenswrapper[4733]: I1206 06:00:12.330160 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fc5cb876-e7a9-4f7f-8ab8-fda582a40261-config-data-custom\") pod \"fc5cb876-e7a9-4f7f-8ab8-fda582a40261\" (UID: \"fc5cb876-e7a9-4f7f-8ab8-fda582a40261\") " Dec 06 06:00:12 crc kubenswrapper[4733]: I1206 06:00:12.330275 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc5cb876-e7a9-4f7f-8ab8-fda582a40261-config-data\") pod \"fc5cb876-e7a9-4f7f-8ab8-fda582a40261\" (UID: \"fc5cb876-e7a9-4f7f-8ab8-fda582a40261\") " Dec 06 06:00:12 crc kubenswrapper[4733]: I1206 06:00:12.330408 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc5cb876-e7a9-4f7f-8ab8-fda582a40261-scripts\") pod \"fc5cb876-e7a9-4f7f-8ab8-fda582a40261\" (UID: \"fc5cb876-e7a9-4f7f-8ab8-fda582a40261\") " Dec 06 06:00:12 crc kubenswrapper[4733]: I1206 06:00:12.330103 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/fc5cb876-e7a9-4f7f-8ab8-fda582a40261-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "fc5cb876-e7a9-4f7f-8ab8-fda582a40261" (UID: "fc5cb876-e7a9-4f7f-8ab8-fda582a40261"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 06:00:12 crc kubenswrapper[4733]: I1206 06:00:12.330680 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc5cb876-e7a9-4f7f-8ab8-fda582a40261-combined-ca-bundle\") pod \"fc5cb876-e7a9-4f7f-8ab8-fda582a40261\" (UID: \"fc5cb876-e7a9-4f7f-8ab8-fda582a40261\") " Dec 06 06:00:12 crc kubenswrapper[4733]: I1206 06:00:12.331188 4733 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fc5cb876-e7a9-4f7f-8ab8-fda582a40261-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 06 06:00:12 crc kubenswrapper[4733]: I1206 06:00:12.338473 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc5cb876-e7a9-4f7f-8ab8-fda582a40261-scripts" (OuterVolumeSpecName: "scripts") pod "fc5cb876-e7a9-4f7f-8ab8-fda582a40261" (UID: "fc5cb876-e7a9-4f7f-8ab8-fda582a40261"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:00:12 crc kubenswrapper[4733]: I1206 06:00:12.339362 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc5cb876-e7a9-4f7f-8ab8-fda582a40261-kube-api-access-4c6pk" (OuterVolumeSpecName: "kube-api-access-4c6pk") pod "fc5cb876-e7a9-4f7f-8ab8-fda582a40261" (UID: "fc5cb876-e7a9-4f7f-8ab8-fda582a40261"). InnerVolumeSpecName "kube-api-access-4c6pk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:00:12 crc kubenswrapper[4733]: I1206 06:00:12.340757 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc5cb876-e7a9-4f7f-8ab8-fda582a40261-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "fc5cb876-e7a9-4f7f-8ab8-fda582a40261" (UID: "fc5cb876-e7a9-4f7f-8ab8-fda582a40261"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:00:12 crc kubenswrapper[4733]: I1206 06:00:12.396264 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc5cb876-e7a9-4f7f-8ab8-fda582a40261-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fc5cb876-e7a9-4f7f-8ab8-fda582a40261" (UID: "fc5cb876-e7a9-4f7f-8ab8-fda582a40261"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:00:12 crc kubenswrapper[4733]: I1206 06:00:12.432918 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4c6pk\" (UniqueName: \"kubernetes.io/projected/fc5cb876-e7a9-4f7f-8ab8-fda582a40261-kube-api-access-4c6pk\") on node \"crc\" DevicePath \"\"" Dec 06 06:00:12 crc kubenswrapper[4733]: I1206 06:00:12.433031 4733 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fc5cb876-e7a9-4f7f-8ab8-fda582a40261-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 06 06:00:12 crc kubenswrapper[4733]: I1206 06:00:12.433129 4733 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc5cb876-e7a9-4f7f-8ab8-fda582a40261-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 06:00:12 crc kubenswrapper[4733]: I1206 06:00:12.433192 4733 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc5cb876-e7a9-4f7f-8ab8-fda582a40261-combined-ca-bundle\") 
on node \"crc\" DevicePath \"\"" Dec 06 06:00:12 crc kubenswrapper[4733]: I1206 06:00:12.435381 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc5cb876-e7a9-4f7f-8ab8-fda582a40261-config-data" (OuterVolumeSpecName: "config-data") pod "fc5cb876-e7a9-4f7f-8ab8-fda582a40261" (UID: "fc5cb876-e7a9-4f7f-8ab8-fda582a40261"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:00:12 crc kubenswrapper[4733]: I1206 06:00:12.494195 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87d7b344-b58b-459a-89b3-7f09319c5a73" path="/var/lib/kubelet/pods/87d7b344-b58b-459a-89b3-7f09319c5a73/volumes" Dec 06 06:00:12 crc kubenswrapper[4733]: I1206 06:00:12.535609 4733 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc5cb876-e7a9-4f7f-8ab8-fda582a40261-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 06:00:12 crc kubenswrapper[4733]: I1206 06:00:12.639323 4733 generic.go:334] "Generic (PLEG): container finished" podID="fc5cb876-e7a9-4f7f-8ab8-fda582a40261" containerID="da4140a0f4f2471d1286539796109993ea468d8334a17e2ab6693e5a0663f084" exitCode=0 Dec 06 06:00:12 crc kubenswrapper[4733]: I1206 06:00:12.639358 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"fc5cb876-e7a9-4f7f-8ab8-fda582a40261","Type":"ContainerDied","Data":"da4140a0f4f2471d1286539796109993ea468d8334a17e2ab6693e5a0663f084"} Dec 06 06:00:12 crc kubenswrapper[4733]: I1206 06:00:12.639414 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"fc5cb876-e7a9-4f7f-8ab8-fda582a40261","Type":"ContainerDied","Data":"bd657ac8e0d14ddfbc80c97d25e7451da5467280cb581efa4b9ff5bc96df273f"} Dec 06 06:00:12 crc kubenswrapper[4733]: I1206 06:00:12.639454 4733 scope.go:117] "RemoveContainer" 
containerID="a16d648f280a5cca172151a546e67413119ffec788c640a4fd7d98d0a1ecd301" Dec 06 06:00:12 crc kubenswrapper[4733]: I1206 06:00:12.639454 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 06 06:00:12 crc kubenswrapper[4733]: I1206 06:00:12.660372 4733 scope.go:117] "RemoveContainer" containerID="da4140a0f4f2471d1286539796109993ea468d8334a17e2ab6693e5a0663f084" Dec 06 06:00:12 crc kubenswrapper[4733]: I1206 06:00:12.664537 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 06 06:00:12 crc kubenswrapper[4733]: I1206 06:00:12.672078 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 06 06:00:12 crc kubenswrapper[4733]: I1206 06:00:12.683592 4733 scope.go:117] "RemoveContainer" containerID="a16d648f280a5cca172151a546e67413119ffec788c640a4fd7d98d0a1ecd301" Dec 06 06:00:12 crc kubenswrapper[4733]: E1206 06:00:12.683996 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a16d648f280a5cca172151a546e67413119ffec788c640a4fd7d98d0a1ecd301\": container with ID starting with a16d648f280a5cca172151a546e67413119ffec788c640a4fd7d98d0a1ecd301 not found: ID does not exist" containerID="a16d648f280a5cca172151a546e67413119ffec788c640a4fd7d98d0a1ecd301" Dec 06 06:00:12 crc kubenswrapper[4733]: I1206 06:00:12.684028 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a16d648f280a5cca172151a546e67413119ffec788c640a4fd7d98d0a1ecd301"} err="failed to get container status \"a16d648f280a5cca172151a546e67413119ffec788c640a4fd7d98d0a1ecd301\": rpc error: code = NotFound desc = could not find container \"a16d648f280a5cca172151a546e67413119ffec788c640a4fd7d98d0a1ecd301\": container with ID starting with a16d648f280a5cca172151a546e67413119ffec788c640a4fd7d98d0a1ecd301 not found: ID does not exist" Dec 06 06:00:12 
crc kubenswrapper[4733]: I1206 06:00:12.684051 4733 scope.go:117] "RemoveContainer" containerID="da4140a0f4f2471d1286539796109993ea468d8334a17e2ab6693e5a0663f084" Dec 06 06:00:12 crc kubenswrapper[4733]: E1206 06:00:12.684436 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da4140a0f4f2471d1286539796109993ea468d8334a17e2ab6693e5a0663f084\": container with ID starting with da4140a0f4f2471d1286539796109993ea468d8334a17e2ab6693e5a0663f084 not found: ID does not exist" containerID="da4140a0f4f2471d1286539796109993ea468d8334a17e2ab6693e5a0663f084" Dec 06 06:00:12 crc kubenswrapper[4733]: I1206 06:00:12.684475 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da4140a0f4f2471d1286539796109993ea468d8334a17e2ab6693e5a0663f084"} err="failed to get container status \"da4140a0f4f2471d1286539796109993ea468d8334a17e2ab6693e5a0663f084\": rpc error: code = NotFound desc = could not find container \"da4140a0f4f2471d1286539796109993ea468d8334a17e2ab6693e5a0663f084\": container with ID starting with da4140a0f4f2471d1286539796109993ea468d8334a17e2ab6693e5a0663f084 not found: ID does not exist" Dec 06 06:00:12 crc kubenswrapper[4733]: I1206 06:00:12.691552 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 06 06:00:12 crc kubenswrapper[4733]: E1206 06:00:12.691940 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc5cb876-e7a9-4f7f-8ab8-fda582a40261" containerName="cinder-scheduler" Dec 06 06:00:12 crc kubenswrapper[4733]: I1206 06:00:12.691959 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc5cb876-e7a9-4f7f-8ab8-fda582a40261" containerName="cinder-scheduler" Dec 06 06:00:12 crc kubenswrapper[4733]: E1206 06:00:12.691976 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87d7b344-b58b-459a-89b3-7f09319c5a73" containerName="dnsmasq-dns" Dec 06 06:00:12 crc 
kubenswrapper[4733]: I1206 06:00:12.691983 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="87d7b344-b58b-459a-89b3-7f09319c5a73" containerName="dnsmasq-dns" Dec 06 06:00:12 crc kubenswrapper[4733]: E1206 06:00:12.692008 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e9b14ad-76ee-43dc-b948-28abf700d584" containerName="collect-profiles" Dec 06 06:00:12 crc kubenswrapper[4733]: I1206 06:00:12.692014 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e9b14ad-76ee-43dc-b948-28abf700d584" containerName="collect-profiles" Dec 06 06:00:12 crc kubenswrapper[4733]: E1206 06:00:12.692032 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc5cb876-e7a9-4f7f-8ab8-fda582a40261" containerName="probe" Dec 06 06:00:12 crc kubenswrapper[4733]: I1206 06:00:12.692040 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc5cb876-e7a9-4f7f-8ab8-fda582a40261" containerName="probe" Dec 06 06:00:12 crc kubenswrapper[4733]: E1206 06:00:12.692050 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87d7b344-b58b-459a-89b3-7f09319c5a73" containerName="init" Dec 06 06:00:12 crc kubenswrapper[4733]: I1206 06:00:12.692056 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="87d7b344-b58b-459a-89b3-7f09319c5a73" containerName="init" Dec 06 06:00:12 crc kubenswrapper[4733]: I1206 06:00:12.692223 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="87d7b344-b58b-459a-89b3-7f09319c5a73" containerName="dnsmasq-dns" Dec 06 06:00:12 crc kubenswrapper[4733]: I1206 06:00:12.692250 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e9b14ad-76ee-43dc-b948-28abf700d584" containerName="collect-profiles" Dec 06 06:00:12 crc kubenswrapper[4733]: I1206 06:00:12.692270 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc5cb876-e7a9-4f7f-8ab8-fda582a40261" containerName="probe" Dec 06 06:00:12 crc kubenswrapper[4733]: I1206 06:00:12.692277 4733 
memory_manager.go:354] "RemoveStaleState removing state" podUID="fc5cb876-e7a9-4f7f-8ab8-fda582a40261" containerName="cinder-scheduler" Dec 06 06:00:12 crc kubenswrapper[4733]: I1206 06:00:12.693268 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 06 06:00:12 crc kubenswrapper[4733]: I1206 06:00:12.694956 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 06 06:00:12 crc kubenswrapper[4733]: I1206 06:00:12.708920 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 06 06:00:12 crc kubenswrapper[4733]: I1206 06:00:12.846791 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a870eae1-25fa-4c68-824e-e14fcd1e98ec-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a870eae1-25fa-4c68-824e-e14fcd1e98ec\") " pod="openstack/cinder-scheduler-0" Dec 06 06:00:12 crc kubenswrapper[4733]: I1206 06:00:12.846974 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a870eae1-25fa-4c68-824e-e14fcd1e98ec-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a870eae1-25fa-4c68-824e-e14fcd1e98ec\") " pod="openstack/cinder-scheduler-0" Dec 06 06:00:12 crc kubenswrapper[4733]: I1206 06:00:12.846998 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a870eae1-25fa-4c68-824e-e14fcd1e98ec-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a870eae1-25fa-4c68-824e-e14fcd1e98ec\") " pod="openstack/cinder-scheduler-0" Dec 06 06:00:12 crc kubenswrapper[4733]: I1206 06:00:12.847103 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jz49\" 
(UniqueName: \"kubernetes.io/projected/a870eae1-25fa-4c68-824e-e14fcd1e98ec-kube-api-access-8jz49\") pod \"cinder-scheduler-0\" (UID: \"a870eae1-25fa-4c68-824e-e14fcd1e98ec\") " pod="openstack/cinder-scheduler-0" Dec 06 06:00:12 crc kubenswrapper[4733]: I1206 06:00:12.847129 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a870eae1-25fa-4c68-824e-e14fcd1e98ec-scripts\") pod \"cinder-scheduler-0\" (UID: \"a870eae1-25fa-4c68-824e-e14fcd1e98ec\") " pod="openstack/cinder-scheduler-0" Dec 06 06:00:12 crc kubenswrapper[4733]: I1206 06:00:12.847195 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a870eae1-25fa-4c68-824e-e14fcd1e98ec-config-data\") pod \"cinder-scheduler-0\" (UID: \"a870eae1-25fa-4c68-824e-e14fcd1e98ec\") " pod="openstack/cinder-scheduler-0" Dec 06 06:00:12 crc kubenswrapper[4733]: I1206 06:00:12.948506 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a870eae1-25fa-4c68-824e-e14fcd1e98ec-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a870eae1-25fa-4c68-824e-e14fcd1e98ec\") " pod="openstack/cinder-scheduler-0" Dec 06 06:00:12 crc kubenswrapper[4733]: I1206 06:00:12.948549 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a870eae1-25fa-4c68-824e-e14fcd1e98ec-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a870eae1-25fa-4c68-824e-e14fcd1e98ec\") " pod="openstack/cinder-scheduler-0" Dec 06 06:00:12 crc kubenswrapper[4733]: I1206 06:00:12.948627 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jz49\" (UniqueName: \"kubernetes.io/projected/a870eae1-25fa-4c68-824e-e14fcd1e98ec-kube-api-access-8jz49\") pod 
\"cinder-scheduler-0\" (UID: \"a870eae1-25fa-4c68-824e-e14fcd1e98ec\") " pod="openstack/cinder-scheduler-0" Dec 06 06:00:12 crc kubenswrapper[4733]: I1206 06:00:12.949116 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a870eae1-25fa-4c68-824e-e14fcd1e98ec-scripts\") pod \"cinder-scheduler-0\" (UID: \"a870eae1-25fa-4c68-824e-e14fcd1e98ec\") " pod="openstack/cinder-scheduler-0" Dec 06 06:00:12 crc kubenswrapper[4733]: I1206 06:00:12.949170 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a870eae1-25fa-4c68-824e-e14fcd1e98ec-config-data\") pod \"cinder-scheduler-0\" (UID: \"a870eae1-25fa-4c68-824e-e14fcd1e98ec\") " pod="openstack/cinder-scheduler-0" Dec 06 06:00:12 crc kubenswrapper[4733]: I1206 06:00:12.949201 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a870eae1-25fa-4c68-824e-e14fcd1e98ec-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a870eae1-25fa-4c68-824e-e14fcd1e98ec\") " pod="openstack/cinder-scheduler-0" Dec 06 06:00:12 crc kubenswrapper[4733]: I1206 06:00:12.949292 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a870eae1-25fa-4c68-824e-e14fcd1e98ec-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a870eae1-25fa-4c68-824e-e14fcd1e98ec\") " pod="openstack/cinder-scheduler-0" Dec 06 06:00:12 crc kubenswrapper[4733]: I1206 06:00:12.953414 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a870eae1-25fa-4c68-824e-e14fcd1e98ec-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a870eae1-25fa-4c68-824e-e14fcd1e98ec\") " pod="openstack/cinder-scheduler-0" Dec 06 06:00:12 crc kubenswrapper[4733]: I1206 06:00:12.953746 4733 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a870eae1-25fa-4c68-824e-e14fcd1e98ec-scripts\") pod \"cinder-scheduler-0\" (UID: \"a870eae1-25fa-4c68-824e-e14fcd1e98ec\") " pod="openstack/cinder-scheduler-0" Dec 06 06:00:12 crc kubenswrapper[4733]: I1206 06:00:12.954656 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a870eae1-25fa-4c68-824e-e14fcd1e98ec-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a870eae1-25fa-4c68-824e-e14fcd1e98ec\") " pod="openstack/cinder-scheduler-0" Dec 06 06:00:12 crc kubenswrapper[4733]: I1206 06:00:12.955262 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a870eae1-25fa-4c68-824e-e14fcd1e98ec-config-data\") pod \"cinder-scheduler-0\" (UID: \"a870eae1-25fa-4c68-824e-e14fcd1e98ec\") " pod="openstack/cinder-scheduler-0" Dec 06 06:00:12 crc kubenswrapper[4733]: I1206 06:00:12.963368 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jz49\" (UniqueName: \"kubernetes.io/projected/a870eae1-25fa-4c68-824e-e14fcd1e98ec-kube-api-access-8jz49\") pod \"cinder-scheduler-0\" (UID: \"a870eae1-25fa-4c68-824e-e14fcd1e98ec\") " pod="openstack/cinder-scheduler-0" Dec 06 06:00:12 crc kubenswrapper[4733]: I1206 06:00:12.989738 4733 patch_prober.go:28] interesting pod/machine-config-daemon-g7qjx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 06:00:12 crc kubenswrapper[4733]: I1206 06:00:12.989796 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 06:00:13 crc kubenswrapper[4733]: I1206 06:00:13.010132 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 06 06:00:13 crc kubenswrapper[4733]: I1206 06:00:13.454044 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 06 06:00:13 crc kubenswrapper[4733]: I1206 06:00:13.653643 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a870eae1-25fa-4c68-824e-e14fcd1e98ec","Type":"ContainerStarted","Data":"ae45c7d78a2cba9fe3ea56b849e85fcd270e35c6d0bb30f970f99ef458481cb7"} Dec 06 06:00:13 crc kubenswrapper[4733]: I1206 06:00:13.742864 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7dbbf764c5-qntcx" Dec 06 06:00:13 crc kubenswrapper[4733]: I1206 06:00:13.807607 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-544f978b4d-7s676"] Dec 06 06:00:13 crc kubenswrapper[4733]: I1206 06:00:13.807875 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-544f978b4d-7s676" podUID="2552f10d-828b-4996-a292-32499f7d24cf" containerName="neutron-api" containerID="cri-o://15b1b8fdcbe45ed49e6b8a618a7c981bb508b621da5f820c159554b8feea9d7f" gracePeriod=30 Dec 06 06:00:13 crc kubenswrapper[4733]: I1206 06:00:13.808013 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-544f978b4d-7s676" podUID="2552f10d-828b-4996-a292-32499f7d24cf" containerName="neutron-httpd" containerID="cri-o://aac218172634004d843b3bfdba9e75f9bcab81037e8b068bd921180157eb91c3" gracePeriod=30 Dec 06 06:00:14 crc kubenswrapper[4733]: I1206 06:00:14.499031 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc5cb876-e7a9-4f7f-8ab8-fda582a40261" 
path="/var/lib/kubelet/pods/fc5cb876-e7a9-4f7f-8ab8-fda582a40261/volumes" Dec 06 06:00:14 crc kubenswrapper[4733]: I1206 06:00:14.552653 4733 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6484ff4846-p58m9" podUID="0072aca6-dd00-419a-afa2-690fcc9712f7" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.153:9311/healthcheck\": read tcp 10.217.0.2:56642->10.217.0.153:9311: read: connection reset by peer" Dec 06 06:00:14 crc kubenswrapper[4733]: I1206 06:00:14.552682 4733 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6484ff4846-p58m9" podUID="0072aca6-dd00-419a-afa2-690fcc9712f7" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.153:9311/healthcheck\": read tcp 10.217.0.2:56656->10.217.0.153:9311: read: connection reset by peer" Dec 06 06:00:14 crc kubenswrapper[4733]: I1206 06:00:14.666738 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a870eae1-25fa-4c68-824e-e14fcd1e98ec","Type":"ContainerStarted","Data":"cd71880752119d22c6c1fabb726531cb9e2e0a4d85c1a3795814ff7fcb0e715b"} Dec 06 06:00:14 crc kubenswrapper[4733]: I1206 06:00:14.667010 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a870eae1-25fa-4c68-824e-e14fcd1e98ec","Type":"ContainerStarted","Data":"1ad1ad678413c385efa7ad073b370145149d268155494383c84ada9cee738aad"} Dec 06 06:00:14 crc kubenswrapper[4733]: I1206 06:00:14.670092 4733 generic.go:334] "Generic (PLEG): container finished" podID="0072aca6-dd00-419a-afa2-690fcc9712f7" containerID="03182fa81abaa155f96533f983619d5a70025b6016e941c60c2e51362d697ef5" exitCode=0 Dec 06 06:00:14 crc kubenswrapper[4733]: I1206 06:00:14.670145 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6484ff4846-p58m9" 
event={"ID":"0072aca6-dd00-419a-afa2-690fcc9712f7","Type":"ContainerDied","Data":"03182fa81abaa155f96533f983619d5a70025b6016e941c60c2e51362d697ef5"} Dec 06 06:00:14 crc kubenswrapper[4733]: I1206 06:00:14.671998 4733 generic.go:334] "Generic (PLEG): container finished" podID="2552f10d-828b-4996-a292-32499f7d24cf" containerID="aac218172634004d843b3bfdba9e75f9bcab81037e8b068bd921180157eb91c3" exitCode=0 Dec 06 06:00:14 crc kubenswrapper[4733]: I1206 06:00:14.672031 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-544f978b4d-7s676" event={"ID":"2552f10d-828b-4996-a292-32499f7d24cf","Type":"ContainerDied","Data":"aac218172634004d843b3bfdba9e75f9bcab81037e8b068bd921180157eb91c3"} Dec 06 06:00:14 crc kubenswrapper[4733]: I1206 06:00:14.698030 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=2.697995626 podStartE2EDuration="2.697995626s" podCreationTimestamp="2025-12-06 06:00:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:00:14.690614295 +0000 UTC m=+998.555825406" watchObservedRunningTime="2025-12-06 06:00:14.697995626 +0000 UTC m=+998.563206737" Dec 06 06:00:14 crc kubenswrapper[4733]: I1206 06:00:14.891605 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6484ff4846-p58m9" Dec 06 06:00:14 crc kubenswrapper[4733]: I1206 06:00:14.997671 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0072aca6-dd00-419a-afa2-690fcc9712f7-logs\") pod \"0072aca6-dd00-419a-afa2-690fcc9712f7\" (UID: \"0072aca6-dd00-419a-afa2-690fcc9712f7\") " Dec 06 06:00:14 crc kubenswrapper[4733]: I1206 06:00:14.997725 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6w2ph\" (UniqueName: \"kubernetes.io/projected/0072aca6-dd00-419a-afa2-690fcc9712f7-kube-api-access-6w2ph\") pod \"0072aca6-dd00-419a-afa2-690fcc9712f7\" (UID: \"0072aca6-dd00-419a-afa2-690fcc9712f7\") " Dec 06 06:00:14 crc kubenswrapper[4733]: I1206 06:00:14.997846 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0072aca6-dd00-419a-afa2-690fcc9712f7-config-data\") pod \"0072aca6-dd00-419a-afa2-690fcc9712f7\" (UID: \"0072aca6-dd00-419a-afa2-690fcc9712f7\") " Dec 06 06:00:14 crc kubenswrapper[4733]: I1206 06:00:14.997929 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0072aca6-dd00-419a-afa2-690fcc9712f7-combined-ca-bundle\") pod \"0072aca6-dd00-419a-afa2-690fcc9712f7\" (UID: \"0072aca6-dd00-419a-afa2-690fcc9712f7\") " Dec 06 06:00:14 crc kubenswrapper[4733]: I1206 06:00:14.997977 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0072aca6-dd00-419a-afa2-690fcc9712f7-config-data-custom\") pod \"0072aca6-dd00-419a-afa2-690fcc9712f7\" (UID: \"0072aca6-dd00-419a-afa2-690fcc9712f7\") " Dec 06 06:00:15 crc kubenswrapper[4733]: I1206 06:00:15.000022 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/0072aca6-dd00-419a-afa2-690fcc9712f7-logs" (OuterVolumeSpecName: "logs") pod "0072aca6-dd00-419a-afa2-690fcc9712f7" (UID: "0072aca6-dd00-419a-afa2-690fcc9712f7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:00:15 crc kubenswrapper[4733]: I1206 06:00:15.006453 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0072aca6-dd00-419a-afa2-690fcc9712f7-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0072aca6-dd00-419a-afa2-690fcc9712f7" (UID: "0072aca6-dd00-419a-afa2-690fcc9712f7"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:00:15 crc kubenswrapper[4733]: I1206 06:00:15.012027 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0072aca6-dd00-419a-afa2-690fcc9712f7-kube-api-access-6w2ph" (OuterVolumeSpecName: "kube-api-access-6w2ph") pod "0072aca6-dd00-419a-afa2-690fcc9712f7" (UID: "0072aca6-dd00-419a-afa2-690fcc9712f7"). InnerVolumeSpecName "kube-api-access-6w2ph". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:00:15 crc kubenswrapper[4733]: I1206 06:00:15.021815 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0072aca6-dd00-419a-afa2-690fcc9712f7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0072aca6-dd00-419a-afa2-690fcc9712f7" (UID: "0072aca6-dd00-419a-afa2-690fcc9712f7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:00:15 crc kubenswrapper[4733]: I1206 06:00:15.046379 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0072aca6-dd00-419a-afa2-690fcc9712f7-config-data" (OuterVolumeSpecName: "config-data") pod "0072aca6-dd00-419a-afa2-690fcc9712f7" (UID: "0072aca6-dd00-419a-afa2-690fcc9712f7"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:00:15 crc kubenswrapper[4733]: I1206 06:00:15.101445 4733 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0072aca6-dd00-419a-afa2-690fcc9712f7-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 06:00:15 crc kubenswrapper[4733]: I1206 06:00:15.101939 4733 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0072aca6-dd00-419a-afa2-690fcc9712f7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 06:00:15 crc kubenswrapper[4733]: I1206 06:00:15.101958 4733 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0072aca6-dd00-419a-afa2-690fcc9712f7-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 06 06:00:15 crc kubenswrapper[4733]: I1206 06:00:15.101988 4733 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0072aca6-dd00-419a-afa2-690fcc9712f7-logs\") on node \"crc\" DevicePath \"\"" Dec 06 06:00:15 crc kubenswrapper[4733]: I1206 06:00:15.102077 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6w2ph\" (UniqueName: \"kubernetes.io/projected/0072aca6-dd00-419a-afa2-690fcc9712f7-kube-api-access-6w2ph\") on node \"crc\" DevicePath \"\"" Dec 06 06:00:15 crc kubenswrapper[4733]: I1206 06:00:15.702689 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6484ff4846-p58m9" event={"ID":"0072aca6-dd00-419a-afa2-690fcc9712f7","Type":"ContainerDied","Data":"8f7c3b434e7af52283e61b64ba5fa131c0bb12d38aac464c9f756956df4f77d2"} Dec 06 06:00:15 crc kubenswrapper[4733]: I1206 06:00:15.702759 4733 scope.go:117] "RemoveContainer" containerID="03182fa81abaa155f96533f983619d5a70025b6016e941c60c2e51362d697ef5" Dec 06 06:00:15 crc kubenswrapper[4733]: I1206 06:00:15.702706 4733 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6484ff4846-p58m9" Dec 06 06:00:15 crc kubenswrapper[4733]: I1206 06:00:15.733199 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6484ff4846-p58m9"] Dec 06 06:00:15 crc kubenswrapper[4733]: I1206 06:00:15.742729 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6484ff4846-p58m9"] Dec 06 06:00:15 crc kubenswrapper[4733]: I1206 06:00:15.747768 4733 scope.go:117] "RemoveContainer" containerID="3a88c137d7742462b8db5a5e1b34f51bbef6f313ea2179f039d50a9d1d731e67" Dec 06 06:00:16 crc kubenswrapper[4733]: I1206 06:00:16.499627 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0072aca6-dd00-419a-afa2-690fcc9712f7" path="/var/lib/kubelet/pods/0072aca6-dd00-419a-afa2-690fcc9712f7/volumes" Dec 06 06:00:17 crc kubenswrapper[4733]: I1206 06:00:17.732241 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"361a8a1d-f083-427f-a625-eca6a714b768","Type":"ContainerStarted","Data":"20fd576d8af049aedd30a2225fabfcf6f308e5bea49e71501714e142bb54db6d"} Dec 06 06:00:17 crc kubenswrapper[4733]: I1206 06:00:17.732730 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 06 06:00:17 crc kubenswrapper[4733]: I1206 06:00:17.732509 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="361a8a1d-f083-427f-a625-eca6a714b768" containerName="sg-core" containerID="cri-o://7a58b0c5c0003849fab31ab3e5339914ecc1e0e1f591a8811c2597551284b4ee" gracePeriod=30 Dec 06 06:00:17 crc kubenswrapper[4733]: I1206 06:00:17.732435 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="361a8a1d-f083-427f-a625-eca6a714b768" containerName="ceilometer-central-agent" 
containerID="cri-o://f47832e7bb750734a6e152882f9c1bb2fccb93da65bca8c763810c078cebc189" gracePeriod=30 Dec 06 06:00:17 crc kubenswrapper[4733]: I1206 06:00:17.732567 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="361a8a1d-f083-427f-a625-eca6a714b768" containerName="proxy-httpd" containerID="cri-o://20fd576d8af049aedd30a2225fabfcf6f308e5bea49e71501714e142bb54db6d" gracePeriod=30 Dec 06 06:00:17 crc kubenswrapper[4733]: I1206 06:00:17.732576 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="361a8a1d-f083-427f-a625-eca6a714b768" containerName="ceilometer-notification-agent" containerID="cri-o://6effb9b38a055bb68c240785469d2c8a97add4ebf1489376116d3b5682ca1b19" gracePeriod=30 Dec 06 06:00:17 crc kubenswrapper[4733]: I1206 06:00:17.758794 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.387115117 podStartE2EDuration="54.758771788s" podCreationTimestamp="2025-12-06 05:59:23 +0000 UTC" firstStartedPulling="2025-12-06 05:59:24.874364605 +0000 UTC m=+948.739575706" lastFinishedPulling="2025-12-06 06:00:17.246021265 +0000 UTC m=+1001.111232377" observedRunningTime="2025-12-06 06:00:17.754460301 +0000 UTC m=+1001.619671412" watchObservedRunningTime="2025-12-06 06:00:17.758771788 +0000 UTC m=+1001.623982900" Dec 06 06:00:18 crc kubenswrapper[4733]: I1206 06:00:18.011252 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 06 06:00:18 crc kubenswrapper[4733]: E1206 06:00:18.565830 4733 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod706be213_5f03_414a_bdeb_98af90de90f4.slice/crio-423a81d606ec6e867b74e7b758975d267201cff97ab41331b44d8e712972e2c0\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod706be213_5f03_414a_bdeb_98af90de90f4.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc8e93b6_7230_41f1_98f5_18b252d0d724.slice\": RecentStats: unable to find data in memory cache]" Dec 06 06:00:18 crc kubenswrapper[4733]: I1206 06:00:18.742186 4733 generic.go:334] "Generic (PLEG): container finished" podID="361a8a1d-f083-427f-a625-eca6a714b768" containerID="20fd576d8af049aedd30a2225fabfcf6f308e5bea49e71501714e142bb54db6d" exitCode=0 Dec 06 06:00:18 crc kubenswrapper[4733]: I1206 06:00:18.742465 4733 generic.go:334] "Generic (PLEG): container finished" podID="361a8a1d-f083-427f-a625-eca6a714b768" containerID="7a58b0c5c0003849fab31ab3e5339914ecc1e0e1f591a8811c2597551284b4ee" exitCode=2 Dec 06 06:00:18 crc kubenswrapper[4733]: I1206 06:00:18.742476 4733 generic.go:334] "Generic (PLEG): container finished" podID="361a8a1d-f083-427f-a625-eca6a714b768" containerID="f47832e7bb750734a6e152882f9c1bb2fccb93da65bca8c763810c078cebc189" exitCode=0 Dec 06 06:00:18 crc kubenswrapper[4733]: I1206 06:00:18.742248 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"361a8a1d-f083-427f-a625-eca6a714b768","Type":"ContainerDied","Data":"20fd576d8af049aedd30a2225fabfcf6f308e5bea49e71501714e142bb54db6d"} Dec 06 06:00:18 crc kubenswrapper[4733]: I1206 06:00:18.742520 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"361a8a1d-f083-427f-a625-eca6a714b768","Type":"ContainerDied","Data":"7a58b0c5c0003849fab31ab3e5339914ecc1e0e1f591a8811c2597551284b4ee"} Dec 06 06:00:18 crc kubenswrapper[4733]: I1206 06:00:18.742537 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"361a8a1d-f083-427f-a625-eca6a714b768","Type":"ContainerDied","Data":"f47832e7bb750734a6e152882f9c1bb2fccb93da65bca8c763810c078cebc189"} Dec 06 
06:00:19 crc kubenswrapper[4733]: I1206 06:00:19.495373 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-544f978b4d-7s676" Dec 06 06:00:19 crc kubenswrapper[4733]: I1206 06:00:19.598714 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2552f10d-828b-4996-a292-32499f7d24cf-config\") pod \"2552f10d-828b-4996-a292-32499f7d24cf\" (UID: \"2552f10d-828b-4996-a292-32499f7d24cf\") " Dec 06 06:00:19 crc kubenswrapper[4733]: I1206 06:00:19.598859 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2552f10d-828b-4996-a292-32499f7d24cf-combined-ca-bundle\") pod \"2552f10d-828b-4996-a292-32499f7d24cf\" (UID: \"2552f10d-828b-4996-a292-32499f7d24cf\") " Dec 06 06:00:19 crc kubenswrapper[4733]: I1206 06:00:19.598960 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2552f10d-828b-4996-a292-32499f7d24cf-httpd-config\") pod \"2552f10d-828b-4996-a292-32499f7d24cf\" (UID: \"2552f10d-828b-4996-a292-32499f7d24cf\") " Dec 06 06:00:19 crc kubenswrapper[4733]: I1206 06:00:19.599078 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l265j\" (UniqueName: \"kubernetes.io/projected/2552f10d-828b-4996-a292-32499f7d24cf-kube-api-access-l265j\") pod \"2552f10d-828b-4996-a292-32499f7d24cf\" (UID: \"2552f10d-828b-4996-a292-32499f7d24cf\") " Dec 06 06:00:19 crc kubenswrapper[4733]: I1206 06:00:19.599208 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2552f10d-828b-4996-a292-32499f7d24cf-ovndb-tls-certs\") pod \"2552f10d-828b-4996-a292-32499f7d24cf\" (UID: \"2552f10d-828b-4996-a292-32499f7d24cf\") " Dec 06 06:00:19 crc kubenswrapper[4733]: I1206 
06:00:19.604290 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2552f10d-828b-4996-a292-32499f7d24cf-kube-api-access-l265j" (OuterVolumeSpecName: "kube-api-access-l265j") pod "2552f10d-828b-4996-a292-32499f7d24cf" (UID: "2552f10d-828b-4996-a292-32499f7d24cf"). InnerVolumeSpecName "kube-api-access-l265j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:00:19 crc kubenswrapper[4733]: I1206 06:00:19.605808 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2552f10d-828b-4996-a292-32499f7d24cf-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "2552f10d-828b-4996-a292-32499f7d24cf" (UID: "2552f10d-828b-4996-a292-32499f7d24cf"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:00:19 crc kubenswrapper[4733]: I1206 06:00:19.641507 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2552f10d-828b-4996-a292-32499f7d24cf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2552f10d-828b-4996-a292-32499f7d24cf" (UID: "2552f10d-828b-4996-a292-32499f7d24cf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:00:19 crc kubenswrapper[4733]: I1206 06:00:19.643989 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2552f10d-828b-4996-a292-32499f7d24cf-config" (OuterVolumeSpecName: "config") pod "2552f10d-828b-4996-a292-32499f7d24cf" (UID: "2552f10d-828b-4996-a292-32499f7d24cf"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:00:19 crc kubenswrapper[4733]: I1206 06:00:19.658878 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2552f10d-828b-4996-a292-32499f7d24cf-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "2552f10d-828b-4996-a292-32499f7d24cf" (UID: "2552f10d-828b-4996-a292-32499f7d24cf"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:00:19 crc kubenswrapper[4733]: I1206 06:00:19.701201 4733 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2552f10d-828b-4996-a292-32499f7d24cf-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 06:00:19 crc kubenswrapper[4733]: I1206 06:00:19.701320 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/2552f10d-828b-4996-a292-32499f7d24cf-config\") on node \"crc\" DevicePath \"\"" Dec 06 06:00:19 crc kubenswrapper[4733]: I1206 06:00:19.701386 4733 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2552f10d-828b-4996-a292-32499f7d24cf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 06:00:19 crc kubenswrapper[4733]: I1206 06:00:19.701464 4733 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2552f10d-828b-4996-a292-32499f7d24cf-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 06 06:00:19 crc kubenswrapper[4733]: I1206 06:00:19.701523 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l265j\" (UniqueName: \"kubernetes.io/projected/2552f10d-828b-4996-a292-32499f7d24cf-kube-api-access-l265j\") on node \"crc\" DevicePath \"\"" Dec 06 06:00:19 crc kubenswrapper[4733]: I1206 06:00:19.754723 4733 generic.go:334] "Generic (PLEG): container finished" podID="2552f10d-828b-4996-a292-32499f7d24cf" 
containerID="15b1b8fdcbe45ed49e6b8a618a7c981bb508b621da5f820c159554b8feea9d7f" exitCode=0 Dec 06 06:00:19 crc kubenswrapper[4733]: I1206 06:00:19.754775 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-544f978b4d-7s676" event={"ID":"2552f10d-828b-4996-a292-32499f7d24cf","Type":"ContainerDied","Data":"15b1b8fdcbe45ed49e6b8a618a7c981bb508b621da5f820c159554b8feea9d7f"} Dec 06 06:00:19 crc kubenswrapper[4733]: I1206 06:00:19.754807 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-544f978b4d-7s676" event={"ID":"2552f10d-828b-4996-a292-32499f7d24cf","Type":"ContainerDied","Data":"a55cada98f98ee4f3783ee6ea5cecf4af2ecd70dadf320423655a4be9f340b54"} Dec 06 06:00:19 crc kubenswrapper[4733]: I1206 06:00:19.754830 4733 scope.go:117] "RemoveContainer" containerID="aac218172634004d843b3bfdba9e75f9bcab81037e8b068bd921180157eb91c3" Dec 06 06:00:19 crc kubenswrapper[4733]: I1206 06:00:19.754841 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-544f978b4d-7s676" Dec 06 06:00:19 crc kubenswrapper[4733]: I1206 06:00:19.789675 4733 scope.go:117] "RemoveContainer" containerID="15b1b8fdcbe45ed49e6b8a618a7c981bb508b621da5f820c159554b8feea9d7f" Dec 06 06:00:19 crc kubenswrapper[4733]: I1206 06:00:19.791159 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-544f978b4d-7s676"] Dec 06 06:00:19 crc kubenswrapper[4733]: I1206 06:00:19.797113 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-544f978b4d-7s676"] Dec 06 06:00:19 crc kubenswrapper[4733]: I1206 06:00:19.807713 4733 scope.go:117] "RemoveContainer" containerID="aac218172634004d843b3bfdba9e75f9bcab81037e8b068bd921180157eb91c3" Dec 06 06:00:19 crc kubenswrapper[4733]: E1206 06:00:19.808082 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aac218172634004d843b3bfdba9e75f9bcab81037e8b068bd921180157eb91c3\": container with ID starting with aac218172634004d843b3bfdba9e75f9bcab81037e8b068bd921180157eb91c3 not found: ID does not exist" containerID="aac218172634004d843b3bfdba9e75f9bcab81037e8b068bd921180157eb91c3" Dec 06 06:00:19 crc kubenswrapper[4733]: I1206 06:00:19.808114 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aac218172634004d843b3bfdba9e75f9bcab81037e8b068bd921180157eb91c3"} err="failed to get container status \"aac218172634004d843b3bfdba9e75f9bcab81037e8b068bd921180157eb91c3\": rpc error: code = NotFound desc = could not find container \"aac218172634004d843b3bfdba9e75f9bcab81037e8b068bd921180157eb91c3\": container with ID starting with aac218172634004d843b3bfdba9e75f9bcab81037e8b068bd921180157eb91c3 not found: ID does not exist" Dec 06 06:00:19 crc kubenswrapper[4733]: I1206 06:00:19.808140 4733 scope.go:117] "RemoveContainer" containerID="15b1b8fdcbe45ed49e6b8a618a7c981bb508b621da5f820c159554b8feea9d7f" Dec 06 06:00:19 
crc kubenswrapper[4733]: E1206 06:00:19.808510 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15b1b8fdcbe45ed49e6b8a618a7c981bb508b621da5f820c159554b8feea9d7f\": container with ID starting with 15b1b8fdcbe45ed49e6b8a618a7c981bb508b621da5f820c159554b8feea9d7f not found: ID does not exist" containerID="15b1b8fdcbe45ed49e6b8a618a7c981bb508b621da5f820c159554b8feea9d7f" Dec 06 06:00:19 crc kubenswrapper[4733]: I1206 06:00:19.808534 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15b1b8fdcbe45ed49e6b8a618a7c981bb508b621da5f820c159554b8feea9d7f"} err="failed to get container status \"15b1b8fdcbe45ed49e6b8a618a7c981bb508b621da5f820c159554b8feea9d7f\": rpc error: code = NotFound desc = could not find container \"15b1b8fdcbe45ed49e6b8a618a7c981bb508b621da5f820c159554b8feea9d7f\": container with ID starting with 15b1b8fdcbe45ed49e6b8a618a7c981bb508b621da5f820c159554b8feea9d7f not found: ID does not exist" Dec 06 06:00:20 crc kubenswrapper[4733]: I1206 06:00:20.494031 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2552f10d-828b-4996-a292-32499f7d24cf" path="/var/lib/kubelet/pods/2552f10d-828b-4996-a292-32499f7d24cf/volumes" Dec 06 06:00:21 crc kubenswrapper[4733]: I1206 06:00:21.536945 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 06:00:21 crc kubenswrapper[4733]: I1206 06:00:21.640989 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/361a8a1d-f083-427f-a625-eca6a714b768-log-httpd\") pod \"361a8a1d-f083-427f-a625-eca6a714b768\" (UID: \"361a8a1d-f083-427f-a625-eca6a714b768\") " Dec 06 06:00:21 crc kubenswrapper[4733]: I1206 06:00:21.641143 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkpsd\" (UniqueName: \"kubernetes.io/projected/361a8a1d-f083-427f-a625-eca6a714b768-kube-api-access-lkpsd\") pod \"361a8a1d-f083-427f-a625-eca6a714b768\" (UID: \"361a8a1d-f083-427f-a625-eca6a714b768\") " Dec 06 06:00:21 crc kubenswrapper[4733]: I1206 06:00:21.641373 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/361a8a1d-f083-427f-a625-eca6a714b768-combined-ca-bundle\") pod \"361a8a1d-f083-427f-a625-eca6a714b768\" (UID: \"361a8a1d-f083-427f-a625-eca6a714b768\") " Dec 06 06:00:21 crc kubenswrapper[4733]: I1206 06:00:21.641454 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/361a8a1d-f083-427f-a625-eca6a714b768-config-data\") pod \"361a8a1d-f083-427f-a625-eca6a714b768\" (UID: \"361a8a1d-f083-427f-a625-eca6a714b768\") " Dec 06 06:00:21 crc kubenswrapper[4733]: I1206 06:00:21.641513 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/361a8a1d-f083-427f-a625-eca6a714b768-sg-core-conf-yaml\") pod \"361a8a1d-f083-427f-a625-eca6a714b768\" (UID: \"361a8a1d-f083-427f-a625-eca6a714b768\") " Dec 06 06:00:21 crc kubenswrapper[4733]: I1206 06:00:21.641560 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/361a8a1d-f083-427f-a625-eca6a714b768-run-httpd\") pod \"361a8a1d-f083-427f-a625-eca6a714b768\" (UID: \"361a8a1d-f083-427f-a625-eca6a714b768\") " Dec 06 06:00:21 crc kubenswrapper[4733]: I1206 06:00:21.641642 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/361a8a1d-f083-427f-a625-eca6a714b768-scripts\") pod \"361a8a1d-f083-427f-a625-eca6a714b768\" (UID: \"361a8a1d-f083-427f-a625-eca6a714b768\") " Dec 06 06:00:21 crc kubenswrapper[4733]: I1206 06:00:21.645482 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/361a8a1d-f083-427f-a625-eca6a714b768-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "361a8a1d-f083-427f-a625-eca6a714b768" (UID: "361a8a1d-f083-427f-a625-eca6a714b768"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:00:21 crc kubenswrapper[4733]: I1206 06:00:21.645900 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/361a8a1d-f083-427f-a625-eca6a714b768-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "361a8a1d-f083-427f-a625-eca6a714b768" (UID: "361a8a1d-f083-427f-a625-eca6a714b768"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:00:21 crc kubenswrapper[4733]: I1206 06:00:21.650034 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/361a8a1d-f083-427f-a625-eca6a714b768-scripts" (OuterVolumeSpecName: "scripts") pod "361a8a1d-f083-427f-a625-eca6a714b768" (UID: "361a8a1d-f083-427f-a625-eca6a714b768"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:00:21 crc kubenswrapper[4733]: I1206 06:00:21.653088 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/361a8a1d-f083-427f-a625-eca6a714b768-kube-api-access-lkpsd" (OuterVolumeSpecName: "kube-api-access-lkpsd") pod "361a8a1d-f083-427f-a625-eca6a714b768" (UID: "361a8a1d-f083-427f-a625-eca6a714b768"). InnerVolumeSpecName "kube-api-access-lkpsd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:00:21 crc kubenswrapper[4733]: I1206 06:00:21.668514 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/361a8a1d-f083-427f-a625-eca6a714b768-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "361a8a1d-f083-427f-a625-eca6a714b768" (UID: "361a8a1d-f083-427f-a625-eca6a714b768"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:00:21 crc kubenswrapper[4733]: I1206 06:00:21.700053 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/361a8a1d-f083-427f-a625-eca6a714b768-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "361a8a1d-f083-427f-a625-eca6a714b768" (UID: "361a8a1d-f083-427f-a625-eca6a714b768"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:00:21 crc kubenswrapper[4733]: I1206 06:00:21.720359 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/361a8a1d-f083-427f-a625-eca6a714b768-config-data" (OuterVolumeSpecName: "config-data") pod "361a8a1d-f083-427f-a625-eca6a714b768" (UID: "361a8a1d-f083-427f-a625-eca6a714b768"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:00:21 crc kubenswrapper[4733]: I1206 06:00:21.744498 4733 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/361a8a1d-f083-427f-a625-eca6a714b768-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 06:00:21 crc kubenswrapper[4733]: I1206 06:00:21.744527 4733 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/361a8a1d-f083-427f-a625-eca6a714b768-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 06:00:21 crc kubenswrapper[4733]: I1206 06:00:21.744541 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lkpsd\" (UniqueName: \"kubernetes.io/projected/361a8a1d-f083-427f-a625-eca6a714b768-kube-api-access-lkpsd\") on node \"crc\" DevicePath \"\"" Dec 06 06:00:21 crc kubenswrapper[4733]: I1206 06:00:21.744552 4733 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/361a8a1d-f083-427f-a625-eca6a714b768-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 06:00:21 crc kubenswrapper[4733]: I1206 06:00:21.744562 4733 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/361a8a1d-f083-427f-a625-eca6a714b768-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 06:00:21 crc kubenswrapper[4733]: I1206 06:00:21.744571 4733 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/361a8a1d-f083-427f-a625-eca6a714b768-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 06 06:00:21 crc kubenswrapper[4733]: I1206 06:00:21.744580 4733 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/361a8a1d-f083-427f-a625-eca6a714b768-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 06:00:21 crc kubenswrapper[4733]: I1206 06:00:21.775214 4733 generic.go:334] "Generic 
(PLEG): container finished" podID="361a8a1d-f083-427f-a625-eca6a714b768" containerID="6effb9b38a055bb68c240785469d2c8a97add4ebf1489376116d3b5682ca1b19" exitCode=0 Dec 06 06:00:21 crc kubenswrapper[4733]: I1206 06:00:21.775250 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"361a8a1d-f083-427f-a625-eca6a714b768","Type":"ContainerDied","Data":"6effb9b38a055bb68c240785469d2c8a97add4ebf1489376116d3b5682ca1b19"} Dec 06 06:00:21 crc kubenswrapper[4733]: I1206 06:00:21.775283 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 06:00:21 crc kubenswrapper[4733]: I1206 06:00:21.775316 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"361a8a1d-f083-427f-a625-eca6a714b768","Type":"ContainerDied","Data":"81918e52ed8a85c2b58bb0ee03c9fc00767dc78c2bc96b2f6d03a1cd0af60545"} Dec 06 06:00:21 crc kubenswrapper[4733]: I1206 06:00:21.775342 4733 scope.go:117] "RemoveContainer" containerID="20fd576d8af049aedd30a2225fabfcf6f308e5bea49e71501714e142bb54db6d" Dec 06 06:00:21 crc kubenswrapper[4733]: I1206 06:00:21.801653 4733 scope.go:117] "RemoveContainer" containerID="7a58b0c5c0003849fab31ab3e5339914ecc1e0e1f591a8811c2597551284b4ee" Dec 06 06:00:21 crc kubenswrapper[4733]: I1206 06:00:21.817766 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 06:00:21 crc kubenswrapper[4733]: I1206 06:00:21.822050 4733 scope.go:117] "RemoveContainer" containerID="6effb9b38a055bb68c240785469d2c8a97add4ebf1489376116d3b5682ca1b19" Dec 06 06:00:21 crc kubenswrapper[4733]: I1206 06:00:21.823137 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 06 06:00:21 crc kubenswrapper[4733]: I1206 06:00:21.832413 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 06 06:00:21 crc kubenswrapper[4733]: E1206 06:00:21.832795 4733 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="0072aca6-dd00-419a-afa2-690fcc9712f7" containerName="barbican-api-log" Dec 06 06:00:21 crc kubenswrapper[4733]: I1206 06:00:21.832813 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="0072aca6-dd00-419a-afa2-690fcc9712f7" containerName="barbican-api-log" Dec 06 06:00:21 crc kubenswrapper[4733]: E1206 06:00:21.832827 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="361a8a1d-f083-427f-a625-eca6a714b768" containerName="ceilometer-central-agent" Dec 06 06:00:21 crc kubenswrapper[4733]: I1206 06:00:21.832833 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="361a8a1d-f083-427f-a625-eca6a714b768" containerName="ceilometer-central-agent" Dec 06 06:00:21 crc kubenswrapper[4733]: E1206 06:00:21.832842 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2552f10d-828b-4996-a292-32499f7d24cf" containerName="neutron-api" Dec 06 06:00:21 crc kubenswrapper[4733]: I1206 06:00:21.832847 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="2552f10d-828b-4996-a292-32499f7d24cf" containerName="neutron-api" Dec 06 06:00:21 crc kubenswrapper[4733]: E1206 06:00:21.832861 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="361a8a1d-f083-427f-a625-eca6a714b768" containerName="proxy-httpd" Dec 06 06:00:21 crc kubenswrapper[4733]: I1206 06:00:21.832867 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="361a8a1d-f083-427f-a625-eca6a714b768" containerName="proxy-httpd" Dec 06 06:00:21 crc kubenswrapper[4733]: E1206 06:00:21.832877 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2552f10d-828b-4996-a292-32499f7d24cf" containerName="neutron-httpd" Dec 06 06:00:21 crc kubenswrapper[4733]: I1206 06:00:21.832890 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="2552f10d-828b-4996-a292-32499f7d24cf" containerName="neutron-httpd" Dec 06 06:00:21 crc kubenswrapper[4733]: E1206 06:00:21.832912 4733 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="361a8a1d-f083-427f-a625-eca6a714b768" containerName="sg-core" Dec 06 06:00:21 crc kubenswrapper[4733]: I1206 06:00:21.832919 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="361a8a1d-f083-427f-a625-eca6a714b768" containerName="sg-core" Dec 06 06:00:21 crc kubenswrapper[4733]: E1206 06:00:21.832937 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="361a8a1d-f083-427f-a625-eca6a714b768" containerName="ceilometer-notification-agent" Dec 06 06:00:21 crc kubenswrapper[4733]: I1206 06:00:21.832943 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="361a8a1d-f083-427f-a625-eca6a714b768" containerName="ceilometer-notification-agent" Dec 06 06:00:21 crc kubenswrapper[4733]: E1206 06:00:21.832961 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0072aca6-dd00-419a-afa2-690fcc9712f7" containerName="barbican-api" Dec 06 06:00:21 crc kubenswrapper[4733]: I1206 06:00:21.832966 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="0072aca6-dd00-419a-afa2-690fcc9712f7" containerName="barbican-api" Dec 06 06:00:21 crc kubenswrapper[4733]: I1206 06:00:21.833192 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="0072aca6-dd00-419a-afa2-690fcc9712f7" containerName="barbican-api-log" Dec 06 06:00:21 crc kubenswrapper[4733]: I1206 06:00:21.833209 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="361a8a1d-f083-427f-a625-eca6a714b768" containerName="sg-core" Dec 06 06:00:21 crc kubenswrapper[4733]: I1206 06:00:21.833222 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="361a8a1d-f083-427f-a625-eca6a714b768" containerName="ceilometer-notification-agent" Dec 06 06:00:21 crc kubenswrapper[4733]: I1206 06:00:21.833233 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="361a8a1d-f083-427f-a625-eca6a714b768" containerName="ceilometer-central-agent" Dec 06 06:00:21 crc kubenswrapper[4733]: I1206 06:00:21.833240 
4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="2552f10d-828b-4996-a292-32499f7d24cf" containerName="neutron-httpd" Dec 06 06:00:21 crc kubenswrapper[4733]: I1206 06:00:21.833250 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="2552f10d-828b-4996-a292-32499f7d24cf" containerName="neutron-api" Dec 06 06:00:21 crc kubenswrapper[4733]: I1206 06:00:21.833256 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="0072aca6-dd00-419a-afa2-690fcc9712f7" containerName="barbican-api" Dec 06 06:00:21 crc kubenswrapper[4733]: I1206 06:00:21.833267 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="361a8a1d-f083-427f-a625-eca6a714b768" containerName="proxy-httpd" Dec 06 06:00:21 crc kubenswrapper[4733]: I1206 06:00:21.835782 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 06:00:21 crc kubenswrapper[4733]: I1206 06:00:21.840657 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 06:00:21 crc kubenswrapper[4733]: I1206 06:00:21.840675 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 06 06:00:21 crc kubenswrapper[4733]: I1206 06:00:21.840723 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 06 06:00:21 crc kubenswrapper[4733]: I1206 06:00:21.846475 4733 scope.go:117] "RemoveContainer" containerID="f47832e7bb750734a6e152882f9c1bb2fccb93da65bca8c763810c078cebc189" Dec 06 06:00:21 crc kubenswrapper[4733]: I1206 06:00:21.878896 4733 scope.go:117] "RemoveContainer" containerID="20fd576d8af049aedd30a2225fabfcf6f308e5bea49e71501714e142bb54db6d" Dec 06 06:00:21 crc kubenswrapper[4733]: E1206 06:00:21.879380 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20fd576d8af049aedd30a2225fabfcf6f308e5bea49e71501714e142bb54db6d\": 
container with ID starting with 20fd576d8af049aedd30a2225fabfcf6f308e5bea49e71501714e142bb54db6d not found: ID does not exist" containerID="20fd576d8af049aedd30a2225fabfcf6f308e5bea49e71501714e142bb54db6d" Dec 06 06:00:21 crc kubenswrapper[4733]: I1206 06:00:21.879412 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20fd576d8af049aedd30a2225fabfcf6f308e5bea49e71501714e142bb54db6d"} err="failed to get container status \"20fd576d8af049aedd30a2225fabfcf6f308e5bea49e71501714e142bb54db6d\": rpc error: code = NotFound desc = could not find container \"20fd576d8af049aedd30a2225fabfcf6f308e5bea49e71501714e142bb54db6d\": container with ID starting with 20fd576d8af049aedd30a2225fabfcf6f308e5bea49e71501714e142bb54db6d not found: ID does not exist" Dec 06 06:00:21 crc kubenswrapper[4733]: I1206 06:00:21.879455 4733 scope.go:117] "RemoveContainer" containerID="7a58b0c5c0003849fab31ab3e5339914ecc1e0e1f591a8811c2597551284b4ee" Dec 06 06:00:21 crc kubenswrapper[4733]: E1206 06:00:21.879857 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a58b0c5c0003849fab31ab3e5339914ecc1e0e1f591a8811c2597551284b4ee\": container with ID starting with 7a58b0c5c0003849fab31ab3e5339914ecc1e0e1f591a8811c2597551284b4ee not found: ID does not exist" containerID="7a58b0c5c0003849fab31ab3e5339914ecc1e0e1f591a8811c2597551284b4ee" Dec 06 06:00:21 crc kubenswrapper[4733]: I1206 06:00:21.879933 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a58b0c5c0003849fab31ab3e5339914ecc1e0e1f591a8811c2597551284b4ee"} err="failed to get container status \"7a58b0c5c0003849fab31ab3e5339914ecc1e0e1f591a8811c2597551284b4ee\": rpc error: code = NotFound desc = could not find container \"7a58b0c5c0003849fab31ab3e5339914ecc1e0e1f591a8811c2597551284b4ee\": container with ID starting with 
7a58b0c5c0003849fab31ab3e5339914ecc1e0e1f591a8811c2597551284b4ee not found: ID does not exist" Dec 06 06:00:21 crc kubenswrapper[4733]: I1206 06:00:21.879995 4733 scope.go:117] "RemoveContainer" containerID="6effb9b38a055bb68c240785469d2c8a97add4ebf1489376116d3b5682ca1b19" Dec 06 06:00:21 crc kubenswrapper[4733]: E1206 06:00:21.880348 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6effb9b38a055bb68c240785469d2c8a97add4ebf1489376116d3b5682ca1b19\": container with ID starting with 6effb9b38a055bb68c240785469d2c8a97add4ebf1489376116d3b5682ca1b19 not found: ID does not exist" containerID="6effb9b38a055bb68c240785469d2c8a97add4ebf1489376116d3b5682ca1b19" Dec 06 06:00:21 crc kubenswrapper[4733]: I1206 06:00:21.880369 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6effb9b38a055bb68c240785469d2c8a97add4ebf1489376116d3b5682ca1b19"} err="failed to get container status \"6effb9b38a055bb68c240785469d2c8a97add4ebf1489376116d3b5682ca1b19\": rpc error: code = NotFound desc = could not find container \"6effb9b38a055bb68c240785469d2c8a97add4ebf1489376116d3b5682ca1b19\": container with ID starting with 6effb9b38a055bb68c240785469d2c8a97add4ebf1489376116d3b5682ca1b19 not found: ID does not exist" Dec 06 06:00:21 crc kubenswrapper[4733]: I1206 06:00:21.880383 4733 scope.go:117] "RemoveContainer" containerID="f47832e7bb750734a6e152882f9c1bb2fccb93da65bca8c763810c078cebc189" Dec 06 06:00:21 crc kubenswrapper[4733]: E1206 06:00:21.880752 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f47832e7bb750734a6e152882f9c1bb2fccb93da65bca8c763810c078cebc189\": container with ID starting with f47832e7bb750734a6e152882f9c1bb2fccb93da65bca8c763810c078cebc189 not found: ID does not exist" containerID="f47832e7bb750734a6e152882f9c1bb2fccb93da65bca8c763810c078cebc189" Dec 06 06:00:21 crc 
kubenswrapper[4733]: I1206 06:00:21.880807 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f47832e7bb750734a6e152882f9c1bb2fccb93da65bca8c763810c078cebc189"} err="failed to get container status \"f47832e7bb750734a6e152882f9c1bb2fccb93da65bca8c763810c078cebc189\": rpc error: code = NotFound desc = could not find container \"f47832e7bb750734a6e152882f9c1bb2fccb93da65bca8c763810c078cebc189\": container with ID starting with f47832e7bb750734a6e152882f9c1bb2fccb93da65bca8c763810c078cebc189 not found: ID does not exist" Dec 06 06:00:21 crc kubenswrapper[4733]: I1206 06:00:21.949737 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cc57577b-28dc-4c6d-b724-fe20f24d8ff6-log-httpd\") pod \"ceilometer-0\" (UID: \"cc57577b-28dc-4c6d-b724-fe20f24d8ff6\") " pod="openstack/ceilometer-0" Dec 06 06:00:21 crc kubenswrapper[4733]: I1206 06:00:21.949886 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cc57577b-28dc-4c6d-b724-fe20f24d8ff6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cc57577b-28dc-4c6d-b724-fe20f24d8ff6\") " pod="openstack/ceilometer-0" Dec 06 06:00:21 crc kubenswrapper[4733]: I1206 06:00:21.949999 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc57577b-28dc-4c6d-b724-fe20f24d8ff6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cc57577b-28dc-4c6d-b724-fe20f24d8ff6\") " pod="openstack/ceilometer-0" Dec 06 06:00:21 crc kubenswrapper[4733]: I1206 06:00:21.950100 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc57577b-28dc-4c6d-b724-fe20f24d8ff6-config-data\") pod \"ceilometer-0\" (UID: 
\"cc57577b-28dc-4c6d-b724-fe20f24d8ff6\") " pod="openstack/ceilometer-0" Dec 06 06:00:21 crc kubenswrapper[4733]: I1206 06:00:21.950182 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc57577b-28dc-4c6d-b724-fe20f24d8ff6-scripts\") pod \"ceilometer-0\" (UID: \"cc57577b-28dc-4c6d-b724-fe20f24d8ff6\") " pod="openstack/ceilometer-0" Dec 06 06:00:21 crc kubenswrapper[4733]: I1206 06:00:21.950278 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cc57577b-28dc-4c6d-b724-fe20f24d8ff6-run-httpd\") pod \"ceilometer-0\" (UID: \"cc57577b-28dc-4c6d-b724-fe20f24d8ff6\") " pod="openstack/ceilometer-0" Dec 06 06:00:21 crc kubenswrapper[4733]: I1206 06:00:21.950376 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6s94\" (UniqueName: \"kubernetes.io/projected/cc57577b-28dc-4c6d-b724-fe20f24d8ff6-kube-api-access-c6s94\") pod \"ceilometer-0\" (UID: \"cc57577b-28dc-4c6d-b724-fe20f24d8ff6\") " pod="openstack/ceilometer-0" Dec 06 06:00:22 crc kubenswrapper[4733]: I1206 06:00:22.053354 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cc57577b-28dc-4c6d-b724-fe20f24d8ff6-log-httpd\") pod \"ceilometer-0\" (UID: \"cc57577b-28dc-4c6d-b724-fe20f24d8ff6\") " pod="openstack/ceilometer-0" Dec 06 06:00:22 crc kubenswrapper[4733]: I1206 06:00:22.053415 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cc57577b-28dc-4c6d-b724-fe20f24d8ff6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cc57577b-28dc-4c6d-b724-fe20f24d8ff6\") " pod="openstack/ceilometer-0" Dec 06 06:00:22 crc kubenswrapper[4733]: I1206 06:00:22.053470 4733 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc57577b-28dc-4c6d-b724-fe20f24d8ff6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cc57577b-28dc-4c6d-b724-fe20f24d8ff6\") " pod="openstack/ceilometer-0" Dec 06 06:00:22 crc kubenswrapper[4733]: I1206 06:00:22.053498 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc57577b-28dc-4c6d-b724-fe20f24d8ff6-config-data\") pod \"ceilometer-0\" (UID: \"cc57577b-28dc-4c6d-b724-fe20f24d8ff6\") " pod="openstack/ceilometer-0" Dec 06 06:00:22 crc kubenswrapper[4733]: I1206 06:00:22.053518 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc57577b-28dc-4c6d-b724-fe20f24d8ff6-scripts\") pod \"ceilometer-0\" (UID: \"cc57577b-28dc-4c6d-b724-fe20f24d8ff6\") " pod="openstack/ceilometer-0" Dec 06 06:00:22 crc kubenswrapper[4733]: I1206 06:00:22.053544 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cc57577b-28dc-4c6d-b724-fe20f24d8ff6-run-httpd\") pod \"ceilometer-0\" (UID: \"cc57577b-28dc-4c6d-b724-fe20f24d8ff6\") " pod="openstack/ceilometer-0" Dec 06 06:00:22 crc kubenswrapper[4733]: I1206 06:00:22.053573 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6s94\" (UniqueName: \"kubernetes.io/projected/cc57577b-28dc-4c6d-b724-fe20f24d8ff6-kube-api-access-c6s94\") pod \"ceilometer-0\" (UID: \"cc57577b-28dc-4c6d-b724-fe20f24d8ff6\") " pod="openstack/ceilometer-0" Dec 06 06:00:22 crc kubenswrapper[4733]: I1206 06:00:22.053866 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cc57577b-28dc-4c6d-b724-fe20f24d8ff6-log-httpd\") pod \"ceilometer-0\" (UID: \"cc57577b-28dc-4c6d-b724-fe20f24d8ff6\") " 
pod="openstack/ceilometer-0" Dec 06 06:00:22 crc kubenswrapper[4733]: I1206 06:00:22.054504 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cc57577b-28dc-4c6d-b724-fe20f24d8ff6-run-httpd\") pod \"ceilometer-0\" (UID: \"cc57577b-28dc-4c6d-b724-fe20f24d8ff6\") " pod="openstack/ceilometer-0" Dec 06 06:00:22 crc kubenswrapper[4733]: I1206 06:00:22.058293 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cc57577b-28dc-4c6d-b724-fe20f24d8ff6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cc57577b-28dc-4c6d-b724-fe20f24d8ff6\") " pod="openstack/ceilometer-0" Dec 06 06:00:22 crc kubenswrapper[4733]: I1206 06:00:22.058492 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc57577b-28dc-4c6d-b724-fe20f24d8ff6-scripts\") pod \"ceilometer-0\" (UID: \"cc57577b-28dc-4c6d-b724-fe20f24d8ff6\") " pod="openstack/ceilometer-0" Dec 06 06:00:22 crc kubenswrapper[4733]: I1206 06:00:22.058899 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc57577b-28dc-4c6d-b724-fe20f24d8ff6-config-data\") pod \"ceilometer-0\" (UID: \"cc57577b-28dc-4c6d-b724-fe20f24d8ff6\") " pod="openstack/ceilometer-0" Dec 06 06:00:22 crc kubenswrapper[4733]: I1206 06:00:22.059524 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc57577b-28dc-4c6d-b724-fe20f24d8ff6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cc57577b-28dc-4c6d-b724-fe20f24d8ff6\") " pod="openstack/ceilometer-0" Dec 06 06:00:22 crc kubenswrapper[4733]: I1206 06:00:22.069032 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6s94\" (UniqueName: 
\"kubernetes.io/projected/cc57577b-28dc-4c6d-b724-fe20f24d8ff6-kube-api-access-c6s94\") pod \"ceilometer-0\" (UID: \"cc57577b-28dc-4c6d-b724-fe20f24d8ff6\") " pod="openstack/ceilometer-0" Dec 06 06:00:22 crc kubenswrapper[4733]: I1206 06:00:22.164928 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 06:00:22 crc kubenswrapper[4733]: I1206 06:00:22.497168 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="361a8a1d-f083-427f-a625-eca6a714b768" path="/var/lib/kubelet/pods/361a8a1d-f083-427f-a625-eca6a714b768/volumes" Dec 06 06:00:22 crc kubenswrapper[4733]: I1206 06:00:22.577137 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 06:00:22 crc kubenswrapper[4733]: I1206 06:00:22.789986 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cc57577b-28dc-4c6d-b724-fe20f24d8ff6","Type":"ContainerStarted","Data":"1a656de085a657efb3d88eafedb2dcda13dda5d7d108aef24775c1f6a7165734"} Dec 06 06:00:23 crc kubenswrapper[4733]: I1206 06:00:23.248638 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 06 06:00:23 crc kubenswrapper[4733]: I1206 06:00:23.802940 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cc57577b-28dc-4c6d-b724-fe20f24d8ff6","Type":"ContainerStarted","Data":"e4f82661685c9f8d3addf826ec0c653d62cec3368997f48cd2c3b5ac39a487dc"} Dec 06 06:00:24 crc kubenswrapper[4733]: I1206 06:00:24.817476 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cc57577b-28dc-4c6d-b724-fe20f24d8ff6","Type":"ContainerStarted","Data":"edc80ffc78f34d643cb316a10bc9bd8d75f5e5f9cb017f9d3d499227bd47a9d9"} Dec 06 06:00:25 crc kubenswrapper[4733]: I1206 06:00:25.675449 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/placement-8696d9b56-5s4w8" Dec 06 06:00:25 crc kubenswrapper[4733]: I1206 06:00:25.828890 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cc57577b-28dc-4c6d-b724-fe20f24d8ff6","Type":"ContainerStarted","Data":"1ca3a96b5577992ca51233ec50c66cc32d1edebf5d3d52b66feb428819f54345"} Dec 06 06:00:26 crc kubenswrapper[4733]: I1206 06:00:26.676381 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-8696d9b56-5s4w8" Dec 06 06:00:27 crc kubenswrapper[4733]: I1206 06:00:27.855571 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cc57577b-28dc-4c6d-b724-fe20f24d8ff6","Type":"ContainerStarted","Data":"2d9f36b95db6ade44ff77e68bfa475b60b47f4b552fcb191988e3732e582aaa4"} Dec 06 06:00:27 crc kubenswrapper[4733]: I1206 06:00:27.855899 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 06 06:00:28 crc kubenswrapper[4733]: I1206 06:00:28.238458 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-fff9b86f5-qw8vr" Dec 06 06:00:28 crc kubenswrapper[4733]: I1206 06:00:28.262149 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.9848737119999997 podStartE2EDuration="7.262128455s" podCreationTimestamp="2025-12-06 06:00:21 +0000 UTC" firstStartedPulling="2025-12-06 06:00:22.598266374 +0000 UTC m=+1006.463477474" lastFinishedPulling="2025-12-06 06:00:26.875521107 +0000 UTC m=+1010.740732217" observedRunningTime="2025-12-06 06:00:27.877450626 +0000 UTC m=+1011.742661737" watchObservedRunningTime="2025-12-06 06:00:28.262128455 +0000 UTC m=+1012.127339566" Dec 06 06:00:28 crc kubenswrapper[4733]: E1206 06:00:28.781206 4733 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc8e93b6_7230_41f1_98f5_18b252d0d724.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod706be213_5f03_414a_bdeb_98af90de90f4.slice/crio-423a81d606ec6e867b74e7b758975d267201cff97ab41331b44d8e712972e2c0\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod706be213_5f03_414a_bdeb_98af90de90f4.slice\": RecentStats: unable to find data in memory cache]" Dec 06 06:00:31 crc kubenswrapper[4733]: I1206 06:00:31.848509 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 06 06:00:31 crc kubenswrapper[4733]: I1206 06:00:31.894155 4733 generic.go:334] "Generic (PLEG): container finished" podID="0edecbb2-ca0e-46f0-b142-19aaf7aa25ca" containerID="cb4d89c9b33af0d50630dbd68d815ce9b13cf34e9840b8505bf25dfca5556dbc" exitCode=137 Dec 06 06:00:31 crc kubenswrapper[4733]: I1206 06:00:31.894204 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0edecbb2-ca0e-46f0-b142-19aaf7aa25ca","Type":"ContainerDied","Data":"cb4d89c9b33af0d50630dbd68d815ce9b13cf34e9840b8505bf25dfca5556dbc"} Dec 06 06:00:31 crc kubenswrapper[4733]: I1206 06:00:31.894211 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 06 06:00:31 crc kubenswrapper[4733]: I1206 06:00:31.894233 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0edecbb2-ca0e-46f0-b142-19aaf7aa25ca","Type":"ContainerDied","Data":"93443507e189a2cd4cda7583e89efd0925b125f79663d977dddd14d3837f8ab0"} Dec 06 06:00:31 crc kubenswrapper[4733]: I1206 06:00:31.894251 4733 scope.go:117] "RemoveContainer" containerID="cb4d89c9b33af0d50630dbd68d815ce9b13cf34e9840b8505bf25dfca5556dbc" Dec 06 06:00:31 crc kubenswrapper[4733]: I1206 06:00:31.916885 4733 scope.go:117] "RemoveContainer" containerID="485eebaf972e7a786035e5f091a396f32682836448db6f3f9ab51c3a700230dd" Dec 06 06:00:31 crc kubenswrapper[4733]: I1206 06:00:31.942468 4733 scope.go:117] "RemoveContainer" containerID="cb4d89c9b33af0d50630dbd68d815ce9b13cf34e9840b8505bf25dfca5556dbc" Dec 06 06:00:31 crc kubenswrapper[4733]: E1206 06:00:31.943012 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb4d89c9b33af0d50630dbd68d815ce9b13cf34e9840b8505bf25dfca5556dbc\": container with ID starting with cb4d89c9b33af0d50630dbd68d815ce9b13cf34e9840b8505bf25dfca5556dbc not found: ID does not exist" containerID="cb4d89c9b33af0d50630dbd68d815ce9b13cf34e9840b8505bf25dfca5556dbc" Dec 06 06:00:31 crc kubenswrapper[4733]: I1206 06:00:31.943065 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb4d89c9b33af0d50630dbd68d815ce9b13cf34e9840b8505bf25dfca5556dbc"} err="failed to get container status \"cb4d89c9b33af0d50630dbd68d815ce9b13cf34e9840b8505bf25dfca5556dbc\": rpc error: code = NotFound desc = could not find container \"cb4d89c9b33af0d50630dbd68d815ce9b13cf34e9840b8505bf25dfca5556dbc\": container with ID starting with cb4d89c9b33af0d50630dbd68d815ce9b13cf34e9840b8505bf25dfca5556dbc not found: ID does not exist" Dec 06 06:00:31 crc kubenswrapper[4733]: I1206 
06:00:31.943093 4733 scope.go:117] "RemoveContainer" containerID="485eebaf972e7a786035e5f091a396f32682836448db6f3f9ab51c3a700230dd" Dec 06 06:00:31 crc kubenswrapper[4733]: E1206 06:00:31.943604 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"485eebaf972e7a786035e5f091a396f32682836448db6f3f9ab51c3a700230dd\": container with ID starting with 485eebaf972e7a786035e5f091a396f32682836448db6f3f9ab51c3a700230dd not found: ID does not exist" containerID="485eebaf972e7a786035e5f091a396f32682836448db6f3f9ab51c3a700230dd" Dec 06 06:00:31 crc kubenswrapper[4733]: I1206 06:00:31.943645 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"485eebaf972e7a786035e5f091a396f32682836448db6f3f9ab51c3a700230dd"} err="failed to get container status \"485eebaf972e7a786035e5f091a396f32682836448db6f3f9ab51c3a700230dd\": rpc error: code = NotFound desc = could not find container \"485eebaf972e7a786035e5f091a396f32682836448db6f3f9ab51c3a700230dd\": container with ID starting with 485eebaf972e7a786035e5f091a396f32682836448db6f3f9ab51c3a700230dd not found: ID does not exist" Dec 06 06:00:31 crc kubenswrapper[4733]: I1206 06:00:31.955439 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0edecbb2-ca0e-46f0-b142-19aaf7aa25ca-scripts\") pod \"0edecbb2-ca0e-46f0-b142-19aaf7aa25ca\" (UID: \"0edecbb2-ca0e-46f0-b142-19aaf7aa25ca\") " Dec 06 06:00:31 crc kubenswrapper[4733]: I1206 06:00:31.955506 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0edecbb2-ca0e-46f0-b142-19aaf7aa25ca-combined-ca-bundle\") pod \"0edecbb2-ca0e-46f0-b142-19aaf7aa25ca\" (UID: \"0edecbb2-ca0e-46f0-b142-19aaf7aa25ca\") " Dec 06 06:00:31 crc kubenswrapper[4733]: I1206 06:00:31.955577 4733 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0edecbb2-ca0e-46f0-b142-19aaf7aa25ca-config-data-custom\") pod \"0edecbb2-ca0e-46f0-b142-19aaf7aa25ca\" (UID: \"0edecbb2-ca0e-46f0-b142-19aaf7aa25ca\") " Dec 06 06:00:31 crc kubenswrapper[4733]: I1206 06:00:31.955628 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0edecbb2-ca0e-46f0-b142-19aaf7aa25ca-logs\") pod \"0edecbb2-ca0e-46f0-b142-19aaf7aa25ca\" (UID: \"0edecbb2-ca0e-46f0-b142-19aaf7aa25ca\") " Dec 06 06:00:31 crc kubenswrapper[4733]: I1206 06:00:31.955656 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0edecbb2-ca0e-46f0-b142-19aaf7aa25ca-etc-machine-id\") pod \"0edecbb2-ca0e-46f0-b142-19aaf7aa25ca\" (UID: \"0edecbb2-ca0e-46f0-b142-19aaf7aa25ca\") " Dec 06 06:00:31 crc kubenswrapper[4733]: I1206 06:00:31.955859 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0edecbb2-ca0e-46f0-b142-19aaf7aa25ca-config-data\") pod \"0edecbb2-ca0e-46f0-b142-19aaf7aa25ca\" (UID: \"0edecbb2-ca0e-46f0-b142-19aaf7aa25ca\") " Dec 06 06:00:31 crc kubenswrapper[4733]: I1206 06:00:31.956300 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0edecbb2-ca0e-46f0-b142-19aaf7aa25ca-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "0edecbb2-ca0e-46f0-b142-19aaf7aa25ca" (UID: "0edecbb2-ca0e-46f0-b142-19aaf7aa25ca"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 06:00:31 crc kubenswrapper[4733]: I1206 06:00:31.956435 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftv4s\" (UniqueName: \"kubernetes.io/projected/0edecbb2-ca0e-46f0-b142-19aaf7aa25ca-kube-api-access-ftv4s\") pod \"0edecbb2-ca0e-46f0-b142-19aaf7aa25ca\" (UID: \"0edecbb2-ca0e-46f0-b142-19aaf7aa25ca\") " Dec 06 06:00:31 crc kubenswrapper[4733]: I1206 06:00:31.956523 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0edecbb2-ca0e-46f0-b142-19aaf7aa25ca-logs" (OuterVolumeSpecName: "logs") pod "0edecbb2-ca0e-46f0-b142-19aaf7aa25ca" (UID: "0edecbb2-ca0e-46f0-b142-19aaf7aa25ca"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:00:31 crc kubenswrapper[4733]: I1206 06:00:31.957705 4733 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0edecbb2-ca0e-46f0-b142-19aaf7aa25ca-logs\") on node \"crc\" DevicePath \"\"" Dec 06 06:00:31 crc kubenswrapper[4733]: I1206 06:00:31.957726 4733 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0edecbb2-ca0e-46f0-b142-19aaf7aa25ca-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 06 06:00:31 crc kubenswrapper[4733]: I1206 06:00:31.963596 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0edecbb2-ca0e-46f0-b142-19aaf7aa25ca-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0edecbb2-ca0e-46f0-b142-19aaf7aa25ca" (UID: "0edecbb2-ca0e-46f0-b142-19aaf7aa25ca"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:00:31 crc kubenswrapper[4733]: I1206 06:00:31.963628 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0edecbb2-ca0e-46f0-b142-19aaf7aa25ca-scripts" (OuterVolumeSpecName: "scripts") pod "0edecbb2-ca0e-46f0-b142-19aaf7aa25ca" (UID: "0edecbb2-ca0e-46f0-b142-19aaf7aa25ca"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:00:31 crc kubenswrapper[4733]: I1206 06:00:31.965925 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0edecbb2-ca0e-46f0-b142-19aaf7aa25ca-kube-api-access-ftv4s" (OuterVolumeSpecName: "kube-api-access-ftv4s") pod "0edecbb2-ca0e-46f0-b142-19aaf7aa25ca" (UID: "0edecbb2-ca0e-46f0-b142-19aaf7aa25ca"). InnerVolumeSpecName "kube-api-access-ftv4s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:00:31 crc kubenswrapper[4733]: I1206 06:00:31.986911 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0edecbb2-ca0e-46f0-b142-19aaf7aa25ca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0edecbb2-ca0e-46f0-b142-19aaf7aa25ca" (UID: "0edecbb2-ca0e-46f0-b142-19aaf7aa25ca"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.017370 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0edecbb2-ca0e-46f0-b142-19aaf7aa25ca-config-data" (OuterVolumeSpecName: "config-data") pod "0edecbb2-ca0e-46f0-b142-19aaf7aa25ca" (UID: "0edecbb2-ca0e-46f0-b142-19aaf7aa25ca"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.059093 4733 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0edecbb2-ca0e-46f0-b142-19aaf7aa25ca-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.059130 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftv4s\" (UniqueName: \"kubernetes.io/projected/0edecbb2-ca0e-46f0-b142-19aaf7aa25ca-kube-api-access-ftv4s\") on node \"crc\" DevicePath \"\"" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.059141 4733 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0edecbb2-ca0e-46f0-b142-19aaf7aa25ca-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.059149 4733 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0edecbb2-ca0e-46f0-b142-19aaf7aa25ca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.059160 4733 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0edecbb2-ca0e-46f0-b142-19aaf7aa25ca-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.225340 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.233966 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.244162 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 06 06:00:32 crc kubenswrapper[4733]: E1206 06:00:32.244564 4733 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0edecbb2-ca0e-46f0-b142-19aaf7aa25ca" containerName="cinder-api-log" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.244585 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="0edecbb2-ca0e-46f0-b142-19aaf7aa25ca" containerName="cinder-api-log" Dec 06 06:00:32 crc kubenswrapper[4733]: E1206 06:00:32.244593 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0edecbb2-ca0e-46f0-b142-19aaf7aa25ca" containerName="cinder-api" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.244599 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="0edecbb2-ca0e-46f0-b142-19aaf7aa25ca" containerName="cinder-api" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.244813 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="0edecbb2-ca0e-46f0-b142-19aaf7aa25ca" containerName="cinder-api" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.244833 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="0edecbb2-ca0e-46f0-b142-19aaf7aa25ca" containerName="cinder-api-log" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.247717 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.250510 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.250613 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.250520 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.254490 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.364284 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a2a73638-cf54-461c-a23a-db691593febc-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a2a73638-cf54-461c-a23a-db691593febc\") " pod="openstack/cinder-api-0" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.364351 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2a73638-cf54-461c-a23a-db691593febc-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a2a73638-cf54-461c-a23a-db691593febc\") " pod="openstack/cinder-api-0" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.364580 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2a73638-cf54-461c-a23a-db691593febc-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"a2a73638-cf54-461c-a23a-db691593febc\") " pod="openstack/cinder-api-0" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.364917 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-qjffc\" (UniqueName: \"kubernetes.io/projected/a2a73638-cf54-461c-a23a-db691593febc-kube-api-access-qjffc\") pod \"cinder-api-0\" (UID: \"a2a73638-cf54-461c-a23a-db691593febc\") " pod="openstack/cinder-api-0" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.365024 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2a73638-cf54-461c-a23a-db691593febc-scripts\") pod \"cinder-api-0\" (UID: \"a2a73638-cf54-461c-a23a-db691593febc\") " pod="openstack/cinder-api-0" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.365178 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2a73638-cf54-461c-a23a-db691593febc-config-data\") pod \"cinder-api-0\" (UID: \"a2a73638-cf54-461c-a23a-db691593febc\") " pod="openstack/cinder-api-0" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.365285 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2a73638-cf54-461c-a23a-db691593febc-logs\") pod \"cinder-api-0\" (UID: \"a2a73638-cf54-461c-a23a-db691593febc\") " pod="openstack/cinder-api-0" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.365398 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a2a73638-cf54-461c-a23a-db691593febc-config-data-custom\") pod \"cinder-api-0\" (UID: \"a2a73638-cf54-461c-a23a-db691593febc\") " pod="openstack/cinder-api-0" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.365462 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2a73638-cf54-461c-a23a-db691593febc-public-tls-certs\") pod \"cinder-api-0\" 
(UID: \"a2a73638-cf54-461c-a23a-db691593febc\") " pod="openstack/cinder-api-0" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.438827 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-67756896f9-p6bgt"] Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.440159 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-67756896f9-p6bgt" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.442063 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.442165 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.442260 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.452643 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-67756896f9-p6bgt"] Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.466946 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/80fbf061-d2a6-4265-b412-cbbcdc78515f-log-httpd\") pod \"swift-proxy-67756896f9-p6bgt\" (UID: \"80fbf061-d2a6-4265-b412-cbbcdc78515f\") " pod="openstack/swift-proxy-67756896f9-p6bgt" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.466994 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a2a73638-cf54-461c-a23a-db691593febc-config-data-custom\") pod \"cinder-api-0\" (UID: \"a2a73638-cf54-461c-a23a-db691593febc\") " pod="openstack/cinder-api-0" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.467042 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2a73638-cf54-461c-a23a-db691593febc-public-tls-certs\") pod \"cinder-api-0\" (UID: \"a2a73638-cf54-461c-a23a-db691593febc\") " pod="openstack/cinder-api-0" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.467068 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a2a73638-cf54-461c-a23a-db691593febc-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a2a73638-cf54-461c-a23a-db691593febc\") " pod="openstack/cinder-api-0" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.467618 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/80fbf061-d2a6-4265-b412-cbbcdc78515f-public-tls-certs\") pod \"swift-proxy-67756896f9-p6bgt\" (UID: \"80fbf061-d2a6-4265-b412-cbbcdc78515f\") " pod="openstack/swift-proxy-67756896f9-p6bgt" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.467646 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2a73638-cf54-461c-a23a-db691593febc-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a2a73638-cf54-461c-a23a-db691593febc\") " pod="openstack/cinder-api-0" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.467684 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80fbf061-d2a6-4265-b412-cbbcdc78515f-config-data\") pod \"swift-proxy-67756896f9-p6bgt\" (UID: \"80fbf061-d2a6-4265-b412-cbbcdc78515f\") " pod="openstack/swift-proxy-67756896f9-p6bgt" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.467701 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a2a73638-cf54-461c-a23a-db691593febc-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"a2a73638-cf54-461c-a23a-db691593febc\") " pod="openstack/cinder-api-0" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.467738 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/80fbf061-d2a6-4265-b412-cbbcdc78515f-internal-tls-certs\") pod \"swift-proxy-67756896f9-p6bgt\" (UID: \"80fbf061-d2a6-4265-b412-cbbcdc78515f\") " pod="openstack/swift-proxy-67756896f9-p6bgt" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.467756 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/80fbf061-d2a6-4265-b412-cbbcdc78515f-run-httpd\") pod \"swift-proxy-67756896f9-p6bgt\" (UID: \"80fbf061-d2a6-4265-b412-cbbcdc78515f\") " pod="openstack/swift-proxy-67756896f9-p6bgt" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.467773 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/80fbf061-d2a6-4265-b412-cbbcdc78515f-etc-swift\") pod \"swift-proxy-67756896f9-p6bgt\" (UID: \"80fbf061-d2a6-4265-b412-cbbcdc78515f\") " pod="openstack/swift-proxy-67756896f9-p6bgt" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.467797 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjffc\" (UniqueName: \"kubernetes.io/projected/a2a73638-cf54-461c-a23a-db691593febc-kube-api-access-qjffc\") pod \"cinder-api-0\" (UID: \"a2a73638-cf54-461c-a23a-db691593febc\") " pod="openstack/cinder-api-0" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.467820 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/80fbf061-d2a6-4265-b412-cbbcdc78515f-combined-ca-bundle\") pod \"swift-proxy-67756896f9-p6bgt\" (UID: \"80fbf061-d2a6-4265-b412-cbbcdc78515f\") " pod="openstack/swift-proxy-67756896f9-p6bgt" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.467836 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2a73638-cf54-461c-a23a-db691593febc-scripts\") pod \"cinder-api-0\" (UID: \"a2a73638-cf54-461c-a23a-db691593febc\") " pod="openstack/cinder-api-0" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.467865 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2a73638-cf54-461c-a23a-db691593febc-config-data\") pod \"cinder-api-0\" (UID: \"a2a73638-cf54-461c-a23a-db691593febc\") " pod="openstack/cinder-api-0" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.467891 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nb9rg\" (UniqueName: \"kubernetes.io/projected/80fbf061-d2a6-4265-b412-cbbcdc78515f-kube-api-access-nb9rg\") pod \"swift-proxy-67756896f9-p6bgt\" (UID: \"80fbf061-d2a6-4265-b412-cbbcdc78515f\") " pod="openstack/swift-proxy-67756896f9-p6bgt" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.467908 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2a73638-cf54-461c-a23a-db691593febc-logs\") pod \"cinder-api-0\" (UID: \"a2a73638-cf54-461c-a23a-db691593febc\") " pod="openstack/cinder-api-0" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.468185 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2a73638-cf54-461c-a23a-db691593febc-logs\") pod \"cinder-api-0\" (UID: \"a2a73638-cf54-461c-a23a-db691593febc\") " pod="openstack/cinder-api-0" Dec 06 
06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.467180 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a2a73638-cf54-461c-a23a-db691593febc-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a2a73638-cf54-461c-a23a-db691593febc\") " pod="openstack/cinder-api-0" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.483095 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2a73638-cf54-461c-a23a-db691593febc-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a2a73638-cf54-461c-a23a-db691593febc\") " pod="openstack/cinder-api-0" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.484026 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2a73638-cf54-461c-a23a-db691593febc-scripts\") pod \"cinder-api-0\" (UID: \"a2a73638-cf54-461c-a23a-db691593febc\") " pod="openstack/cinder-api-0" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.489392 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2a73638-cf54-461c-a23a-db691593febc-config-data\") pod \"cinder-api-0\" (UID: \"a2a73638-cf54-461c-a23a-db691593febc\") " pod="openstack/cinder-api-0" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.489744 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2a73638-cf54-461c-a23a-db691593febc-public-tls-certs\") pod \"cinder-api-0\" (UID: \"a2a73638-cf54-461c-a23a-db691593febc\") " pod="openstack/cinder-api-0" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.489943 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a2a73638-cf54-461c-a23a-db691593febc-config-data-custom\") pod \"cinder-api-0\" (UID: 
\"a2a73638-cf54-461c-a23a-db691593febc\") " pod="openstack/cinder-api-0" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.494742 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2a73638-cf54-461c-a23a-db691593febc-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"a2a73638-cf54-461c-a23a-db691593febc\") " pod="openstack/cinder-api-0" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.505715 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjffc\" (UniqueName: \"kubernetes.io/projected/a2a73638-cf54-461c-a23a-db691593febc-kube-api-access-qjffc\") pod \"cinder-api-0\" (UID: \"a2a73638-cf54-461c-a23a-db691593febc\") " pod="openstack/cinder-api-0" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.509749 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0edecbb2-ca0e-46f0-b142-19aaf7aa25ca" path="/var/lib/kubelet/pods/0edecbb2-ca0e-46f0-b142-19aaf7aa25ca/volumes" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.564858 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.571506 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nb9rg\" (UniqueName: \"kubernetes.io/projected/80fbf061-d2a6-4265-b412-cbbcdc78515f-kube-api-access-nb9rg\") pod \"swift-proxy-67756896f9-p6bgt\" (UID: \"80fbf061-d2a6-4265-b412-cbbcdc78515f\") " pod="openstack/swift-proxy-67756896f9-p6bgt" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.571554 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/80fbf061-d2a6-4265-b412-cbbcdc78515f-log-httpd\") pod \"swift-proxy-67756896f9-p6bgt\" (UID: \"80fbf061-d2a6-4265-b412-cbbcdc78515f\") " pod="openstack/swift-proxy-67756896f9-p6bgt" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.571600 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/80fbf061-d2a6-4265-b412-cbbcdc78515f-public-tls-certs\") pod \"swift-proxy-67756896f9-p6bgt\" (UID: \"80fbf061-d2a6-4265-b412-cbbcdc78515f\") " pod="openstack/swift-proxy-67756896f9-p6bgt" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.571646 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80fbf061-d2a6-4265-b412-cbbcdc78515f-config-data\") pod \"swift-proxy-67756896f9-p6bgt\" (UID: \"80fbf061-d2a6-4265-b412-cbbcdc78515f\") " pod="openstack/swift-proxy-67756896f9-p6bgt" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.571691 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/80fbf061-d2a6-4265-b412-cbbcdc78515f-internal-tls-certs\") pod \"swift-proxy-67756896f9-p6bgt\" (UID: \"80fbf061-d2a6-4265-b412-cbbcdc78515f\") " pod="openstack/swift-proxy-67756896f9-p6bgt" 
Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.571706 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/80fbf061-d2a6-4265-b412-cbbcdc78515f-run-httpd\") pod \"swift-proxy-67756896f9-p6bgt\" (UID: \"80fbf061-d2a6-4265-b412-cbbcdc78515f\") " pod="openstack/swift-proxy-67756896f9-p6bgt" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.571722 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/80fbf061-d2a6-4265-b412-cbbcdc78515f-etc-swift\") pod \"swift-proxy-67756896f9-p6bgt\" (UID: \"80fbf061-d2a6-4265-b412-cbbcdc78515f\") " pod="openstack/swift-proxy-67756896f9-p6bgt" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.571755 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80fbf061-d2a6-4265-b412-cbbcdc78515f-combined-ca-bundle\") pod \"swift-proxy-67756896f9-p6bgt\" (UID: \"80fbf061-d2a6-4265-b412-cbbcdc78515f\") " pod="openstack/swift-proxy-67756896f9-p6bgt" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.576996 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80fbf061-d2a6-4265-b412-cbbcdc78515f-config-data\") pod \"swift-proxy-67756896f9-p6bgt\" (UID: \"80fbf061-d2a6-4265-b412-cbbcdc78515f\") " pod="openstack/swift-proxy-67756896f9-p6bgt" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.580102 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80fbf061-d2a6-4265-b412-cbbcdc78515f-combined-ca-bundle\") pod \"swift-proxy-67756896f9-p6bgt\" (UID: \"80fbf061-d2a6-4265-b412-cbbcdc78515f\") " pod="openstack/swift-proxy-67756896f9-p6bgt" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.580670 4733 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/80fbf061-d2a6-4265-b412-cbbcdc78515f-run-httpd\") pod \"swift-proxy-67756896f9-p6bgt\" (UID: \"80fbf061-d2a6-4265-b412-cbbcdc78515f\") " pod="openstack/swift-proxy-67756896f9-p6bgt" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.581613 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/80fbf061-d2a6-4265-b412-cbbcdc78515f-log-httpd\") pod \"swift-proxy-67756896f9-p6bgt\" (UID: \"80fbf061-d2a6-4265-b412-cbbcdc78515f\") " pod="openstack/swift-proxy-67756896f9-p6bgt" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.587784 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/80fbf061-d2a6-4265-b412-cbbcdc78515f-internal-tls-certs\") pod \"swift-proxy-67756896f9-p6bgt\" (UID: \"80fbf061-d2a6-4265-b412-cbbcdc78515f\") " pod="openstack/swift-proxy-67756896f9-p6bgt" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.588916 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/80fbf061-d2a6-4265-b412-cbbcdc78515f-public-tls-certs\") pod \"swift-proxy-67756896f9-p6bgt\" (UID: \"80fbf061-d2a6-4265-b412-cbbcdc78515f\") " pod="openstack/swift-proxy-67756896f9-p6bgt" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.590176 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/80fbf061-d2a6-4265-b412-cbbcdc78515f-etc-swift\") pod \"swift-proxy-67756896f9-p6bgt\" (UID: \"80fbf061-d2a6-4265-b412-cbbcdc78515f\") " pod="openstack/swift-proxy-67756896f9-p6bgt" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.636567 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nb9rg\" (UniqueName: 
\"kubernetes.io/projected/80fbf061-d2a6-4265-b412-cbbcdc78515f-kube-api-access-nb9rg\") pod \"swift-proxy-67756896f9-p6bgt\" (UID: \"80fbf061-d2a6-4265-b412-cbbcdc78515f\") " pod="openstack/swift-proxy-67756896f9-p6bgt" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.758725 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-67756896f9-p6bgt" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.933730 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.935364 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.937872 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-4brc7" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.938057 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.943950 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.955208 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.979755 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8b5bd873-9187-4b04-9274-fd413c995524-openstack-config-secret\") pod \"openstackclient\" (UID: \"8b5bd873-9187-4b04-9274-fd413c995524\") " pod="openstack/openstackclient" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.979979 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/8b5bd873-9187-4b04-9274-fd413c995524-openstack-config\") pod \"openstackclient\" (UID: \"8b5bd873-9187-4b04-9274-fd413c995524\") " pod="openstack/openstackclient" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.980013 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b5bd873-9187-4b04-9274-fd413c995524-combined-ca-bundle\") pod \"openstackclient\" (UID: \"8b5bd873-9187-4b04-9274-fd413c995524\") " pod="openstack/openstackclient" Dec 06 06:00:32 crc kubenswrapper[4733]: I1206 06:00:32.980036 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5w42\" (UniqueName: \"kubernetes.io/projected/8b5bd873-9187-4b04-9274-fd413c995524-kube-api-access-r5w42\") pod \"openstackclient\" (UID: \"8b5bd873-9187-4b04-9274-fd413c995524\") " pod="openstack/openstackclient" Dec 06 06:00:33 crc kubenswrapper[4733]: W1206 06:00:33.020033 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2a73638_cf54_461c_a23a_db691593febc.slice/crio-8b906d0b8dd3efa669b2b0657700ef24ae7b6e701d0f0c952cb676e068a2b484 WatchSource:0}: Error finding container 8b906d0b8dd3efa669b2b0657700ef24ae7b6e701d0f0c952cb676e068a2b484: Status 404 returned error can't find the container with id 8b906d0b8dd3efa669b2b0657700ef24ae7b6e701d0f0c952cb676e068a2b484 Dec 06 06:00:33 crc kubenswrapper[4733]: I1206 06:00:33.020582 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 06 06:00:33 crc kubenswrapper[4733]: I1206 06:00:33.082197 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8b5bd873-9187-4b04-9274-fd413c995524-openstack-config-secret\") pod \"openstackclient\" (UID: 
\"8b5bd873-9187-4b04-9274-fd413c995524\") " pod="openstack/openstackclient" Dec 06 06:00:33 crc kubenswrapper[4733]: I1206 06:00:33.082323 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8b5bd873-9187-4b04-9274-fd413c995524-openstack-config\") pod \"openstackclient\" (UID: \"8b5bd873-9187-4b04-9274-fd413c995524\") " pod="openstack/openstackclient" Dec 06 06:00:33 crc kubenswrapper[4733]: I1206 06:00:33.082353 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b5bd873-9187-4b04-9274-fd413c995524-combined-ca-bundle\") pod \"openstackclient\" (UID: \"8b5bd873-9187-4b04-9274-fd413c995524\") " pod="openstack/openstackclient" Dec 06 06:00:33 crc kubenswrapper[4733]: I1206 06:00:33.082373 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5w42\" (UniqueName: \"kubernetes.io/projected/8b5bd873-9187-4b04-9274-fd413c995524-kube-api-access-r5w42\") pod \"openstackclient\" (UID: \"8b5bd873-9187-4b04-9274-fd413c995524\") " pod="openstack/openstackclient" Dec 06 06:00:33 crc kubenswrapper[4733]: I1206 06:00:33.083411 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8b5bd873-9187-4b04-9274-fd413c995524-openstack-config\") pod \"openstackclient\" (UID: \"8b5bd873-9187-4b04-9274-fd413c995524\") " pod="openstack/openstackclient" Dec 06 06:00:33 crc kubenswrapper[4733]: I1206 06:00:33.087634 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b5bd873-9187-4b04-9274-fd413c995524-combined-ca-bundle\") pod \"openstackclient\" (UID: \"8b5bd873-9187-4b04-9274-fd413c995524\") " pod="openstack/openstackclient" Dec 06 06:00:33 crc kubenswrapper[4733]: I1206 06:00:33.087965 4733 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8b5bd873-9187-4b04-9274-fd413c995524-openstack-config-secret\") pod \"openstackclient\" (UID: \"8b5bd873-9187-4b04-9274-fd413c995524\") " pod="openstack/openstackclient" Dec 06 06:00:33 crc kubenswrapper[4733]: I1206 06:00:33.096444 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5w42\" (UniqueName: \"kubernetes.io/projected/8b5bd873-9187-4b04-9274-fd413c995524-kube-api-access-r5w42\") pod \"openstackclient\" (UID: \"8b5bd873-9187-4b04-9274-fd413c995524\") " pod="openstack/openstackclient" Dec 06 06:00:33 crc kubenswrapper[4733]: I1206 06:00:33.255971 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 06 06:00:33 crc kubenswrapper[4733]: I1206 06:00:33.269838 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-67756896f9-p6bgt"] Dec 06 06:00:33 crc kubenswrapper[4733]: W1206 06:00:33.285783 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80fbf061_d2a6_4265_b412_cbbcdc78515f.slice/crio-86bfccb1779f9d6b7b0a9e4894d1f8810c82be6cad923cd6c97ee8ed2f2c1f1f WatchSource:0}: Error finding container 86bfccb1779f9d6b7b0a9e4894d1f8810c82be6cad923cd6c97ee8ed2f2c1f1f: Status 404 returned error can't find the container with id 86bfccb1779f9d6b7b0a9e4894d1f8810c82be6cad923cd6c97ee8ed2f2c1f1f Dec 06 06:00:33 crc kubenswrapper[4733]: I1206 06:00:33.567220 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 06:00:33 crc kubenswrapper[4733]: I1206 06:00:33.567886 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cc57577b-28dc-4c6d-b724-fe20f24d8ff6" containerName="sg-core" 
containerID="cri-o://1ca3a96b5577992ca51233ec50c66cc32d1edebf5d3d52b66feb428819f54345" gracePeriod=30 Dec 06 06:00:33 crc kubenswrapper[4733]: I1206 06:00:33.568042 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cc57577b-28dc-4c6d-b724-fe20f24d8ff6" containerName="proxy-httpd" containerID="cri-o://2d9f36b95db6ade44ff77e68bfa475b60b47f4b552fcb191988e3732e582aaa4" gracePeriod=30 Dec 06 06:00:33 crc kubenswrapper[4733]: I1206 06:00:33.568221 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cc57577b-28dc-4c6d-b724-fe20f24d8ff6" containerName="ceilometer-notification-agent" containerID="cri-o://edc80ffc78f34d643cb316a10bc9bd8d75f5e5f9cb017f9d3d499227bd47a9d9" gracePeriod=30 Dec 06 06:00:33 crc kubenswrapper[4733]: I1206 06:00:33.568997 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cc57577b-28dc-4c6d-b724-fe20f24d8ff6" containerName="ceilometer-central-agent" containerID="cri-o://e4f82661685c9f8d3addf826ec0c653d62cec3368997f48cd2c3b5ac39a487dc" gracePeriod=30 Dec 06 06:00:33 crc kubenswrapper[4733]: I1206 06:00:33.701265 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 06 06:00:33 crc kubenswrapper[4733]: I1206 06:00:33.921820 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"8b5bd873-9187-4b04-9274-fd413c995524","Type":"ContainerStarted","Data":"13557278ba2c7968022eee163aa6609b8df3aeb318c89fab3f52a29180e39e13"} Dec 06 06:00:33 crc kubenswrapper[4733]: I1206 06:00:33.923577 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a2a73638-cf54-461c-a23a-db691593febc","Type":"ContainerStarted","Data":"604b30d4913f337fc5bdbbacad8b3c422379db8b000bc39ae3fa981d90a71843"} Dec 06 06:00:33 crc kubenswrapper[4733]: I1206 06:00:33.923670 4733 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a2a73638-cf54-461c-a23a-db691593febc","Type":"ContainerStarted","Data":"8b906d0b8dd3efa669b2b0657700ef24ae7b6e701d0f0c952cb676e068a2b484"} Dec 06 06:00:33 crc kubenswrapper[4733]: I1206 06:00:33.926031 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-67756896f9-p6bgt" event={"ID":"80fbf061-d2a6-4265-b412-cbbcdc78515f","Type":"ContainerStarted","Data":"af01822a07e7a7971d2da0ac350e4f9c686d7e76a1c4d6d1a13b855736fe3564"} Dec 06 06:00:33 crc kubenswrapper[4733]: I1206 06:00:33.926064 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-67756896f9-p6bgt" event={"ID":"80fbf061-d2a6-4265-b412-cbbcdc78515f","Type":"ContainerStarted","Data":"0b60ffde2a0a196bf009c6d904ea6b5aa03770208a0c0f4e13be49d393c3e706"} Dec 06 06:00:33 crc kubenswrapper[4733]: I1206 06:00:33.926077 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-67756896f9-p6bgt" event={"ID":"80fbf061-d2a6-4265-b412-cbbcdc78515f","Type":"ContainerStarted","Data":"86bfccb1779f9d6b7b0a9e4894d1f8810c82be6cad923cd6c97ee8ed2f2c1f1f"} Dec 06 06:00:33 crc kubenswrapper[4733]: I1206 06:00:33.926513 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-67756896f9-p6bgt" Dec 06 06:00:33 crc kubenswrapper[4733]: I1206 06:00:33.926614 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-67756896f9-p6bgt" Dec 06 06:00:33 crc kubenswrapper[4733]: I1206 06:00:33.929534 4733 generic.go:334] "Generic (PLEG): container finished" podID="cc57577b-28dc-4c6d-b724-fe20f24d8ff6" containerID="2d9f36b95db6ade44ff77e68bfa475b60b47f4b552fcb191988e3732e582aaa4" exitCode=0 Dec 06 06:00:33 crc kubenswrapper[4733]: I1206 06:00:33.929558 4733 generic.go:334] "Generic (PLEG): container finished" podID="cc57577b-28dc-4c6d-b724-fe20f24d8ff6" 
containerID="1ca3a96b5577992ca51233ec50c66cc32d1edebf5d3d52b66feb428819f54345" exitCode=2 Dec 06 06:00:33 crc kubenswrapper[4733]: I1206 06:00:33.929567 4733 generic.go:334] "Generic (PLEG): container finished" podID="cc57577b-28dc-4c6d-b724-fe20f24d8ff6" containerID="e4f82661685c9f8d3addf826ec0c653d62cec3368997f48cd2c3b5ac39a487dc" exitCode=0 Dec 06 06:00:33 crc kubenswrapper[4733]: I1206 06:00:33.929588 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cc57577b-28dc-4c6d-b724-fe20f24d8ff6","Type":"ContainerDied","Data":"2d9f36b95db6ade44ff77e68bfa475b60b47f4b552fcb191988e3732e582aaa4"} Dec 06 06:00:33 crc kubenswrapper[4733]: I1206 06:00:33.929604 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cc57577b-28dc-4c6d-b724-fe20f24d8ff6","Type":"ContainerDied","Data":"1ca3a96b5577992ca51233ec50c66cc32d1edebf5d3d52b66feb428819f54345"} Dec 06 06:00:33 crc kubenswrapper[4733]: I1206 06:00:33.929615 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cc57577b-28dc-4c6d-b724-fe20f24d8ff6","Type":"ContainerDied","Data":"e4f82661685c9f8d3addf826ec0c653d62cec3368997f48cd2c3b5ac39a487dc"} Dec 06 06:00:33 crc kubenswrapper[4733]: I1206 06:00:33.953396 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-67756896f9-p6bgt" podStartSLOduration=1.953378105 podStartE2EDuration="1.953378105s" podCreationTimestamp="2025-12-06 06:00:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:00:33.941297038 +0000 UTC m=+1017.806508148" watchObservedRunningTime="2025-12-06 06:00:33.953378105 +0000 UTC m=+1017.818589216" Dec 06 06:00:34 crc kubenswrapper[4733]: I1206 06:00:34.747342 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 06:00:34 crc kubenswrapper[4733]: I1206 06:00:34.829845 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6s94\" (UniqueName: \"kubernetes.io/projected/cc57577b-28dc-4c6d-b724-fe20f24d8ff6-kube-api-access-c6s94\") pod \"cc57577b-28dc-4c6d-b724-fe20f24d8ff6\" (UID: \"cc57577b-28dc-4c6d-b724-fe20f24d8ff6\") " Dec 06 06:00:34 crc kubenswrapper[4733]: I1206 06:00:34.829931 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc57577b-28dc-4c6d-b724-fe20f24d8ff6-combined-ca-bundle\") pod \"cc57577b-28dc-4c6d-b724-fe20f24d8ff6\" (UID: \"cc57577b-28dc-4c6d-b724-fe20f24d8ff6\") " Dec 06 06:00:34 crc kubenswrapper[4733]: I1206 06:00:34.830023 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cc57577b-28dc-4c6d-b724-fe20f24d8ff6-run-httpd\") pod \"cc57577b-28dc-4c6d-b724-fe20f24d8ff6\" (UID: \"cc57577b-28dc-4c6d-b724-fe20f24d8ff6\") " Dec 06 06:00:34 crc kubenswrapper[4733]: I1206 06:00:34.830102 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cc57577b-28dc-4c6d-b724-fe20f24d8ff6-log-httpd\") pod \"cc57577b-28dc-4c6d-b724-fe20f24d8ff6\" (UID: \"cc57577b-28dc-4c6d-b724-fe20f24d8ff6\") " Dec 06 06:00:34 crc kubenswrapper[4733]: I1206 06:00:34.830164 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc57577b-28dc-4c6d-b724-fe20f24d8ff6-config-data\") pod \"cc57577b-28dc-4c6d-b724-fe20f24d8ff6\" (UID: \"cc57577b-28dc-4c6d-b724-fe20f24d8ff6\") " Dec 06 06:00:34 crc kubenswrapper[4733]: I1206 06:00:34.830253 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/cc57577b-28dc-4c6d-b724-fe20f24d8ff6-sg-core-conf-yaml\") pod \"cc57577b-28dc-4c6d-b724-fe20f24d8ff6\" (UID: \"cc57577b-28dc-4c6d-b724-fe20f24d8ff6\") " Dec 06 06:00:34 crc kubenswrapper[4733]: I1206 06:00:34.830294 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc57577b-28dc-4c6d-b724-fe20f24d8ff6-scripts\") pod \"cc57577b-28dc-4c6d-b724-fe20f24d8ff6\" (UID: \"cc57577b-28dc-4c6d-b724-fe20f24d8ff6\") " Dec 06 06:00:34 crc kubenswrapper[4733]: I1206 06:00:34.831965 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc57577b-28dc-4c6d-b724-fe20f24d8ff6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "cc57577b-28dc-4c6d-b724-fe20f24d8ff6" (UID: "cc57577b-28dc-4c6d-b724-fe20f24d8ff6"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:00:34 crc kubenswrapper[4733]: I1206 06:00:34.832551 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc57577b-28dc-4c6d-b724-fe20f24d8ff6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "cc57577b-28dc-4c6d-b724-fe20f24d8ff6" (UID: "cc57577b-28dc-4c6d-b724-fe20f24d8ff6"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:00:34 crc kubenswrapper[4733]: I1206 06:00:34.837633 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc57577b-28dc-4c6d-b724-fe20f24d8ff6-scripts" (OuterVolumeSpecName: "scripts") pod "cc57577b-28dc-4c6d-b724-fe20f24d8ff6" (UID: "cc57577b-28dc-4c6d-b724-fe20f24d8ff6"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:00:34 crc kubenswrapper[4733]: I1206 06:00:34.862450 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc57577b-28dc-4c6d-b724-fe20f24d8ff6-kube-api-access-c6s94" (OuterVolumeSpecName: "kube-api-access-c6s94") pod "cc57577b-28dc-4c6d-b724-fe20f24d8ff6" (UID: "cc57577b-28dc-4c6d-b724-fe20f24d8ff6"). InnerVolumeSpecName "kube-api-access-c6s94". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:00:34 crc kubenswrapper[4733]: I1206 06:00:34.867521 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc57577b-28dc-4c6d-b724-fe20f24d8ff6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "cc57577b-28dc-4c6d-b724-fe20f24d8ff6" (UID: "cc57577b-28dc-4c6d-b724-fe20f24d8ff6"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:00:34 crc kubenswrapper[4733]: I1206 06:00:34.921879 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc57577b-28dc-4c6d-b724-fe20f24d8ff6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cc57577b-28dc-4c6d-b724-fe20f24d8ff6" (UID: "cc57577b-28dc-4c6d-b724-fe20f24d8ff6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:00:34 crc kubenswrapper[4733]: I1206 06:00:34.932797 4733 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cc57577b-28dc-4c6d-b724-fe20f24d8ff6-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 06 06:00:34 crc kubenswrapper[4733]: I1206 06:00:34.932823 4733 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc57577b-28dc-4c6d-b724-fe20f24d8ff6-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 06:00:34 crc kubenswrapper[4733]: I1206 06:00:34.932834 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6s94\" (UniqueName: \"kubernetes.io/projected/cc57577b-28dc-4c6d-b724-fe20f24d8ff6-kube-api-access-c6s94\") on node \"crc\" DevicePath \"\"" Dec 06 06:00:34 crc kubenswrapper[4733]: I1206 06:00:34.932846 4733 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc57577b-28dc-4c6d-b724-fe20f24d8ff6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 06:00:34 crc kubenswrapper[4733]: I1206 06:00:34.932855 4733 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cc57577b-28dc-4c6d-b724-fe20f24d8ff6-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 06:00:34 crc kubenswrapper[4733]: I1206 06:00:34.932868 4733 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cc57577b-28dc-4c6d-b724-fe20f24d8ff6-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 06:00:34 crc kubenswrapper[4733]: I1206 06:00:34.945691 4733 generic.go:334] "Generic (PLEG): container finished" podID="cc57577b-28dc-4c6d-b724-fe20f24d8ff6" containerID="edc80ffc78f34d643cb316a10bc9bd8d75f5e5f9cb017f9d3d499227bd47a9d9" exitCode=0 Dec 06 06:00:34 crc kubenswrapper[4733]: I1206 06:00:34.945756 4733 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cc57577b-28dc-4c6d-b724-fe20f24d8ff6","Type":"ContainerDied","Data":"edc80ffc78f34d643cb316a10bc9bd8d75f5e5f9cb017f9d3d499227bd47a9d9"} Dec 06 06:00:34 crc kubenswrapper[4733]: I1206 06:00:34.945790 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cc57577b-28dc-4c6d-b724-fe20f24d8ff6","Type":"ContainerDied","Data":"1a656de085a657efb3d88eafedb2dcda13dda5d7d108aef24775c1f6a7165734"} Dec 06 06:00:34 crc kubenswrapper[4733]: I1206 06:00:34.945808 4733 scope.go:117] "RemoveContainer" containerID="2d9f36b95db6ade44ff77e68bfa475b60b47f4b552fcb191988e3732e582aaa4" Dec 06 06:00:34 crc kubenswrapper[4733]: I1206 06:00:34.945957 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 06:00:34 crc kubenswrapper[4733]: I1206 06:00:34.961386 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a2a73638-cf54-461c-a23a-db691593febc","Type":"ContainerStarted","Data":"95a507f8813d36c0728647708de6fc1ba8cd5d1a0cc35e3613ac7ea2a4cd153b"} Dec 06 06:00:34 crc kubenswrapper[4733]: I1206 06:00:34.961584 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 06 06:00:34 crc kubenswrapper[4733]: I1206 06:00:34.965475 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc57577b-28dc-4c6d-b724-fe20f24d8ff6-config-data" (OuterVolumeSpecName: "config-data") pod "cc57577b-28dc-4c6d-b724-fe20f24d8ff6" (UID: "cc57577b-28dc-4c6d-b724-fe20f24d8ff6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:00:34 crc kubenswrapper[4733]: I1206 06:00:34.974623 4733 scope.go:117] "RemoveContainer" containerID="1ca3a96b5577992ca51233ec50c66cc32d1edebf5d3d52b66feb428819f54345" Dec 06 06:00:34 crc kubenswrapper[4733]: I1206 06:00:34.987717 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=2.987696481 podStartE2EDuration="2.987696481s" podCreationTimestamp="2025-12-06 06:00:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:00:34.977709111 +0000 UTC m=+1018.842920222" watchObservedRunningTime="2025-12-06 06:00:34.987696481 +0000 UTC m=+1018.852907592" Dec 06 06:00:34 crc kubenswrapper[4733]: I1206 06:00:34.996465 4733 scope.go:117] "RemoveContainer" containerID="edc80ffc78f34d643cb316a10bc9bd8d75f5e5f9cb017f9d3d499227bd47a9d9" Dec 06 06:00:35 crc kubenswrapper[4733]: I1206 06:00:35.015403 4733 scope.go:117] "RemoveContainer" containerID="e4f82661685c9f8d3addf826ec0c653d62cec3368997f48cd2c3b5ac39a487dc" Dec 06 06:00:35 crc kubenswrapper[4733]: I1206 06:00:35.039061 4733 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc57577b-28dc-4c6d-b724-fe20f24d8ff6-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 06:00:35 crc kubenswrapper[4733]: I1206 06:00:35.039893 4733 scope.go:117] "RemoveContainer" containerID="2d9f36b95db6ade44ff77e68bfa475b60b47f4b552fcb191988e3732e582aaa4" Dec 06 06:00:35 crc kubenswrapper[4733]: E1206 06:00:35.040570 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d9f36b95db6ade44ff77e68bfa475b60b47f4b552fcb191988e3732e582aaa4\": container with ID starting with 2d9f36b95db6ade44ff77e68bfa475b60b47f4b552fcb191988e3732e582aaa4 not found: ID does not exist" 
containerID="2d9f36b95db6ade44ff77e68bfa475b60b47f4b552fcb191988e3732e582aaa4" Dec 06 06:00:35 crc kubenswrapper[4733]: I1206 06:00:35.040608 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d9f36b95db6ade44ff77e68bfa475b60b47f4b552fcb191988e3732e582aaa4"} err="failed to get container status \"2d9f36b95db6ade44ff77e68bfa475b60b47f4b552fcb191988e3732e582aaa4\": rpc error: code = NotFound desc = could not find container \"2d9f36b95db6ade44ff77e68bfa475b60b47f4b552fcb191988e3732e582aaa4\": container with ID starting with 2d9f36b95db6ade44ff77e68bfa475b60b47f4b552fcb191988e3732e582aaa4 not found: ID does not exist" Dec 06 06:00:35 crc kubenswrapper[4733]: I1206 06:00:35.040631 4733 scope.go:117] "RemoveContainer" containerID="1ca3a96b5577992ca51233ec50c66cc32d1edebf5d3d52b66feb428819f54345" Dec 06 06:00:35 crc kubenswrapper[4733]: E1206 06:00:35.048771 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ca3a96b5577992ca51233ec50c66cc32d1edebf5d3d52b66feb428819f54345\": container with ID starting with 1ca3a96b5577992ca51233ec50c66cc32d1edebf5d3d52b66feb428819f54345 not found: ID does not exist" containerID="1ca3a96b5577992ca51233ec50c66cc32d1edebf5d3d52b66feb428819f54345" Dec 06 06:00:35 crc kubenswrapper[4733]: I1206 06:00:35.048804 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ca3a96b5577992ca51233ec50c66cc32d1edebf5d3d52b66feb428819f54345"} err="failed to get container status \"1ca3a96b5577992ca51233ec50c66cc32d1edebf5d3d52b66feb428819f54345\": rpc error: code = NotFound desc = could not find container \"1ca3a96b5577992ca51233ec50c66cc32d1edebf5d3d52b66feb428819f54345\": container with ID starting with 1ca3a96b5577992ca51233ec50c66cc32d1edebf5d3d52b66feb428819f54345 not found: ID does not exist" Dec 06 06:00:35 crc kubenswrapper[4733]: I1206 06:00:35.048824 4733 scope.go:117] 
"RemoveContainer" containerID="edc80ffc78f34d643cb316a10bc9bd8d75f5e5f9cb017f9d3d499227bd47a9d9" Dec 06 06:00:35 crc kubenswrapper[4733]: E1206 06:00:35.050792 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edc80ffc78f34d643cb316a10bc9bd8d75f5e5f9cb017f9d3d499227bd47a9d9\": container with ID starting with edc80ffc78f34d643cb316a10bc9bd8d75f5e5f9cb017f9d3d499227bd47a9d9 not found: ID does not exist" containerID="edc80ffc78f34d643cb316a10bc9bd8d75f5e5f9cb017f9d3d499227bd47a9d9" Dec 06 06:00:35 crc kubenswrapper[4733]: I1206 06:00:35.050819 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edc80ffc78f34d643cb316a10bc9bd8d75f5e5f9cb017f9d3d499227bd47a9d9"} err="failed to get container status \"edc80ffc78f34d643cb316a10bc9bd8d75f5e5f9cb017f9d3d499227bd47a9d9\": rpc error: code = NotFound desc = could not find container \"edc80ffc78f34d643cb316a10bc9bd8d75f5e5f9cb017f9d3d499227bd47a9d9\": container with ID starting with edc80ffc78f34d643cb316a10bc9bd8d75f5e5f9cb017f9d3d499227bd47a9d9 not found: ID does not exist" Dec 06 06:00:35 crc kubenswrapper[4733]: I1206 06:00:35.050836 4733 scope.go:117] "RemoveContainer" containerID="e4f82661685c9f8d3addf826ec0c653d62cec3368997f48cd2c3b5ac39a487dc" Dec 06 06:00:35 crc kubenswrapper[4733]: E1206 06:00:35.051100 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4f82661685c9f8d3addf826ec0c653d62cec3368997f48cd2c3b5ac39a487dc\": container with ID starting with e4f82661685c9f8d3addf826ec0c653d62cec3368997f48cd2c3b5ac39a487dc not found: ID does not exist" containerID="e4f82661685c9f8d3addf826ec0c653d62cec3368997f48cd2c3b5ac39a487dc" Dec 06 06:00:35 crc kubenswrapper[4733]: I1206 06:00:35.051129 4733 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e4f82661685c9f8d3addf826ec0c653d62cec3368997f48cd2c3b5ac39a487dc"} err="failed to get container status \"e4f82661685c9f8d3addf826ec0c653d62cec3368997f48cd2c3b5ac39a487dc\": rpc error: code = NotFound desc = could not find container \"e4f82661685c9f8d3addf826ec0c653d62cec3368997f48cd2c3b5ac39a487dc\": container with ID starting with e4f82661685c9f8d3addf826ec0c653d62cec3368997f48cd2c3b5ac39a487dc not found: ID does not exist" Dec 06 06:00:35 crc kubenswrapper[4733]: I1206 06:00:35.279433 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 06:00:35 crc kubenswrapper[4733]: I1206 06:00:35.287692 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 06 06:00:35 crc kubenswrapper[4733]: I1206 06:00:35.294838 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 06 06:00:35 crc kubenswrapper[4733]: E1206 06:00:35.295212 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc57577b-28dc-4c6d-b724-fe20f24d8ff6" containerName="ceilometer-notification-agent" Dec 06 06:00:35 crc kubenswrapper[4733]: I1206 06:00:35.295230 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc57577b-28dc-4c6d-b724-fe20f24d8ff6" containerName="ceilometer-notification-agent" Dec 06 06:00:35 crc kubenswrapper[4733]: E1206 06:00:35.295249 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc57577b-28dc-4c6d-b724-fe20f24d8ff6" containerName="proxy-httpd" Dec 06 06:00:35 crc kubenswrapper[4733]: I1206 06:00:35.295255 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc57577b-28dc-4c6d-b724-fe20f24d8ff6" containerName="proxy-httpd" Dec 06 06:00:35 crc kubenswrapper[4733]: E1206 06:00:35.295276 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc57577b-28dc-4c6d-b724-fe20f24d8ff6" containerName="sg-core" Dec 06 06:00:35 crc kubenswrapper[4733]: I1206 06:00:35.295281 4733 
state_mem.go:107] "Deleted CPUSet assignment" podUID="cc57577b-28dc-4c6d-b724-fe20f24d8ff6" containerName="sg-core" Dec 06 06:00:35 crc kubenswrapper[4733]: E1206 06:00:35.295292 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc57577b-28dc-4c6d-b724-fe20f24d8ff6" containerName="ceilometer-central-agent" Dec 06 06:00:35 crc kubenswrapper[4733]: I1206 06:00:35.295351 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc57577b-28dc-4c6d-b724-fe20f24d8ff6" containerName="ceilometer-central-agent" Dec 06 06:00:35 crc kubenswrapper[4733]: I1206 06:00:35.295548 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc57577b-28dc-4c6d-b724-fe20f24d8ff6" containerName="proxy-httpd" Dec 06 06:00:35 crc kubenswrapper[4733]: I1206 06:00:35.295575 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc57577b-28dc-4c6d-b724-fe20f24d8ff6" containerName="ceilometer-notification-agent" Dec 06 06:00:35 crc kubenswrapper[4733]: I1206 06:00:35.295586 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc57577b-28dc-4c6d-b724-fe20f24d8ff6" containerName="ceilometer-central-agent" Dec 06 06:00:35 crc kubenswrapper[4733]: I1206 06:00:35.295596 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc57577b-28dc-4c6d-b724-fe20f24d8ff6" containerName="sg-core" Dec 06 06:00:35 crc kubenswrapper[4733]: I1206 06:00:35.297152 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 06:00:35 crc kubenswrapper[4733]: I1206 06:00:35.300942 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 06 06:00:35 crc kubenswrapper[4733]: I1206 06:00:35.301269 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 06 06:00:35 crc kubenswrapper[4733]: I1206 06:00:35.324603 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 06:00:35 crc kubenswrapper[4733]: I1206 06:00:35.345105 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/512fab18-cdab-4502-86dd-33b5f71ddd85-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"512fab18-cdab-4502-86dd-33b5f71ddd85\") " pod="openstack/ceilometer-0" Dec 06 06:00:35 crc kubenswrapper[4733]: I1206 06:00:35.345173 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/512fab18-cdab-4502-86dd-33b5f71ddd85-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"512fab18-cdab-4502-86dd-33b5f71ddd85\") " pod="openstack/ceilometer-0" Dec 06 06:00:35 crc kubenswrapper[4733]: I1206 06:00:35.345213 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/512fab18-cdab-4502-86dd-33b5f71ddd85-config-data\") pod \"ceilometer-0\" (UID: \"512fab18-cdab-4502-86dd-33b5f71ddd85\") " pod="openstack/ceilometer-0" Dec 06 06:00:35 crc kubenswrapper[4733]: I1206 06:00:35.345254 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsb2g\" (UniqueName: \"kubernetes.io/projected/512fab18-cdab-4502-86dd-33b5f71ddd85-kube-api-access-zsb2g\") pod \"ceilometer-0\" (UID: 
\"512fab18-cdab-4502-86dd-33b5f71ddd85\") " pod="openstack/ceilometer-0" Dec 06 06:00:35 crc kubenswrapper[4733]: I1206 06:00:35.345279 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/512fab18-cdab-4502-86dd-33b5f71ddd85-run-httpd\") pod \"ceilometer-0\" (UID: \"512fab18-cdab-4502-86dd-33b5f71ddd85\") " pod="openstack/ceilometer-0" Dec 06 06:00:35 crc kubenswrapper[4733]: I1206 06:00:35.345330 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/512fab18-cdab-4502-86dd-33b5f71ddd85-log-httpd\") pod \"ceilometer-0\" (UID: \"512fab18-cdab-4502-86dd-33b5f71ddd85\") " pod="openstack/ceilometer-0" Dec 06 06:00:35 crc kubenswrapper[4733]: I1206 06:00:35.345360 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/512fab18-cdab-4502-86dd-33b5f71ddd85-scripts\") pod \"ceilometer-0\" (UID: \"512fab18-cdab-4502-86dd-33b5f71ddd85\") " pod="openstack/ceilometer-0" Dec 06 06:00:35 crc kubenswrapper[4733]: I1206 06:00:35.449388 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/512fab18-cdab-4502-86dd-33b5f71ddd85-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"512fab18-cdab-4502-86dd-33b5f71ddd85\") " pod="openstack/ceilometer-0" Dec 06 06:00:35 crc kubenswrapper[4733]: I1206 06:00:35.449546 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/512fab18-cdab-4502-86dd-33b5f71ddd85-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"512fab18-cdab-4502-86dd-33b5f71ddd85\") " pod="openstack/ceilometer-0" Dec 06 06:00:35 crc kubenswrapper[4733]: I1206 06:00:35.449588 4733 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/512fab18-cdab-4502-86dd-33b5f71ddd85-config-data\") pod \"ceilometer-0\" (UID: \"512fab18-cdab-4502-86dd-33b5f71ddd85\") " pod="openstack/ceilometer-0" Dec 06 06:00:35 crc kubenswrapper[4733]: I1206 06:00:35.449647 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsb2g\" (UniqueName: \"kubernetes.io/projected/512fab18-cdab-4502-86dd-33b5f71ddd85-kube-api-access-zsb2g\") pod \"ceilometer-0\" (UID: \"512fab18-cdab-4502-86dd-33b5f71ddd85\") " pod="openstack/ceilometer-0" Dec 06 06:00:35 crc kubenswrapper[4733]: I1206 06:00:35.449703 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/512fab18-cdab-4502-86dd-33b5f71ddd85-run-httpd\") pod \"ceilometer-0\" (UID: \"512fab18-cdab-4502-86dd-33b5f71ddd85\") " pod="openstack/ceilometer-0" Dec 06 06:00:35 crc kubenswrapper[4733]: I1206 06:00:35.449747 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/512fab18-cdab-4502-86dd-33b5f71ddd85-log-httpd\") pod \"ceilometer-0\" (UID: \"512fab18-cdab-4502-86dd-33b5f71ddd85\") " pod="openstack/ceilometer-0" Dec 06 06:00:35 crc kubenswrapper[4733]: I1206 06:00:35.449787 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/512fab18-cdab-4502-86dd-33b5f71ddd85-scripts\") pod \"ceilometer-0\" (UID: \"512fab18-cdab-4502-86dd-33b5f71ddd85\") " pod="openstack/ceilometer-0" Dec 06 06:00:35 crc kubenswrapper[4733]: I1206 06:00:35.450504 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/512fab18-cdab-4502-86dd-33b5f71ddd85-run-httpd\") pod \"ceilometer-0\" (UID: \"512fab18-cdab-4502-86dd-33b5f71ddd85\") " pod="openstack/ceilometer-0" 
Dec 06 06:00:35 crc kubenswrapper[4733]: I1206 06:00:35.450776 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/512fab18-cdab-4502-86dd-33b5f71ddd85-log-httpd\") pod \"ceilometer-0\" (UID: \"512fab18-cdab-4502-86dd-33b5f71ddd85\") " pod="openstack/ceilometer-0" Dec 06 06:00:35 crc kubenswrapper[4733]: I1206 06:00:35.456230 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/512fab18-cdab-4502-86dd-33b5f71ddd85-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"512fab18-cdab-4502-86dd-33b5f71ddd85\") " pod="openstack/ceilometer-0" Dec 06 06:00:35 crc kubenswrapper[4733]: I1206 06:00:35.457525 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/512fab18-cdab-4502-86dd-33b5f71ddd85-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"512fab18-cdab-4502-86dd-33b5f71ddd85\") " pod="openstack/ceilometer-0" Dec 06 06:00:35 crc kubenswrapper[4733]: I1206 06:00:35.459017 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/512fab18-cdab-4502-86dd-33b5f71ddd85-scripts\") pod \"ceilometer-0\" (UID: \"512fab18-cdab-4502-86dd-33b5f71ddd85\") " pod="openstack/ceilometer-0" Dec 06 06:00:35 crc kubenswrapper[4733]: I1206 06:00:35.461446 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/512fab18-cdab-4502-86dd-33b5f71ddd85-config-data\") pod \"ceilometer-0\" (UID: \"512fab18-cdab-4502-86dd-33b5f71ddd85\") " pod="openstack/ceilometer-0" Dec 06 06:00:35 crc kubenswrapper[4733]: I1206 06:00:35.471230 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsb2g\" (UniqueName: \"kubernetes.io/projected/512fab18-cdab-4502-86dd-33b5f71ddd85-kube-api-access-zsb2g\") pod 
\"ceilometer-0\" (UID: \"512fab18-cdab-4502-86dd-33b5f71ddd85\") " pod="openstack/ceilometer-0" Dec 06 06:00:35 crc kubenswrapper[4733]: I1206 06:00:35.614507 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 06:00:36 crc kubenswrapper[4733]: I1206 06:00:36.039599 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 06:00:36 crc kubenswrapper[4733]: W1206 06:00:36.045326 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod512fab18_cdab_4502_86dd_33b5f71ddd85.slice/crio-a0a1785c2c385494e19f61110fddb39281f1b3829cede309d50f8e5c0d4be995 WatchSource:0}: Error finding container a0a1785c2c385494e19f61110fddb39281f1b3829cede309d50f8e5c0d4be995: Status 404 returned error can't find the container with id a0a1785c2c385494e19f61110fddb39281f1b3829cede309d50f8e5c0d4be995 Dec 06 06:00:36 crc kubenswrapper[4733]: I1206 06:00:36.526702 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc57577b-28dc-4c6d-b724-fe20f24d8ff6" path="/var/lib/kubelet/pods/cc57577b-28dc-4c6d-b724-fe20f24d8ff6/volumes" Dec 06 06:00:36 crc kubenswrapper[4733]: I1206 06:00:36.994031 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"512fab18-cdab-4502-86dd-33b5f71ddd85","Type":"ContainerStarted","Data":"f965c114f9c5fabed3bad2c8cee2129f0a93b1ec7311d5e119dbe8bae332e7b3"} Dec 06 06:00:36 crc kubenswrapper[4733]: I1206 06:00:36.994369 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"512fab18-cdab-4502-86dd-33b5f71ddd85","Type":"ContainerStarted","Data":"a0a1785c2c385494e19f61110fddb39281f1b3829cede309d50f8e5c0d4be995"} Dec 06 06:00:38 crc kubenswrapper[4733]: I1206 06:00:38.004468 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"512fab18-cdab-4502-86dd-33b5f71ddd85","Type":"ContainerStarted","Data":"b690f1f85492cdb11530ec769f1d5523fa92ce33cb4dab23e1b8c9491f060417"} Dec 06 06:00:39 crc kubenswrapper[4733]: I1206 06:00:39.053570 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"512fab18-cdab-4502-86dd-33b5f71ddd85","Type":"ContainerStarted","Data":"3a1ec94ed9ecc79408e0c5a79cda7d26e967165cd4a7bed791c3f5c8e06e8a1a"} Dec 06 06:00:40 crc kubenswrapper[4733]: I1206 06:00:40.083551 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"512fab18-cdab-4502-86dd-33b5f71ddd85","Type":"ContainerStarted","Data":"d97111c2461f508aa0dfede96e3862f3a0abae7c0bf1c5d34201d79555ad1bf0"} Dec 06 06:00:40 crc kubenswrapper[4733]: I1206 06:00:40.084374 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 06 06:00:42 crc kubenswrapper[4733]: I1206 06:00:42.786488 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-67756896f9-p6bgt" Dec 06 06:00:42 crc kubenswrapper[4733]: I1206 06:00:42.788339 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-67756896f9-p6bgt" Dec 06 06:00:42 crc kubenswrapper[4733]: I1206 06:00:42.809908 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.186818858 podStartE2EDuration="7.809886403s" podCreationTimestamp="2025-12-06 06:00:35 +0000 UTC" firstStartedPulling="2025-12-06 06:00:36.048069442 +0000 UTC m=+1019.913280552" lastFinishedPulling="2025-12-06 06:00:39.671136986 +0000 UTC m=+1023.536348097" observedRunningTime="2025-12-06 06:00:40.111789082 +0000 UTC m=+1023.977000193" watchObservedRunningTime="2025-12-06 06:00:42.809886403 +0000 UTC m=+1026.675097515" Dec 06 06:00:42 crc kubenswrapper[4733]: I1206 06:00:42.989386 4733 patch_prober.go:28] interesting 
pod/machine-config-daemon-g7qjx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 06:00:42 crc kubenswrapper[4733]: I1206 06:00:42.989452 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 06:00:43 crc kubenswrapper[4733]: I1206 06:00:43.710437 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 06:00:43 crc kubenswrapper[4733]: I1206 06:00:43.710987 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="512fab18-cdab-4502-86dd-33b5f71ddd85" containerName="ceilometer-central-agent" containerID="cri-o://f965c114f9c5fabed3bad2c8cee2129f0a93b1ec7311d5e119dbe8bae332e7b3" gracePeriod=30 Dec 06 06:00:43 crc kubenswrapper[4733]: I1206 06:00:43.711064 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="512fab18-cdab-4502-86dd-33b5f71ddd85" containerName="sg-core" containerID="cri-o://3a1ec94ed9ecc79408e0c5a79cda7d26e967165cd4a7bed791c3f5c8e06e8a1a" gracePeriod=30 Dec 06 06:00:43 crc kubenswrapper[4733]: I1206 06:00:43.711125 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="512fab18-cdab-4502-86dd-33b5f71ddd85" containerName="ceilometer-notification-agent" containerID="cri-o://b690f1f85492cdb11530ec769f1d5523fa92ce33cb4dab23e1b8c9491f060417" gracePeriod=30 Dec 06 06:00:43 crc kubenswrapper[4733]: I1206 06:00:43.711223 4733 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="512fab18-cdab-4502-86dd-33b5f71ddd85" containerName="proxy-httpd" containerID="cri-o://d97111c2461f508aa0dfede96e3862f3a0abae7c0bf1c5d34201d79555ad1bf0" gracePeriod=30 Dec 06 06:00:44 crc kubenswrapper[4733]: I1206 06:00:44.134616 4733 generic.go:334] "Generic (PLEG): container finished" podID="512fab18-cdab-4502-86dd-33b5f71ddd85" containerID="d97111c2461f508aa0dfede96e3862f3a0abae7c0bf1c5d34201d79555ad1bf0" exitCode=0 Dec 06 06:00:44 crc kubenswrapper[4733]: I1206 06:00:44.134655 4733 generic.go:334] "Generic (PLEG): container finished" podID="512fab18-cdab-4502-86dd-33b5f71ddd85" containerID="3a1ec94ed9ecc79408e0c5a79cda7d26e967165cd4a7bed791c3f5c8e06e8a1a" exitCode=2 Dec 06 06:00:44 crc kubenswrapper[4733]: I1206 06:00:44.134676 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"512fab18-cdab-4502-86dd-33b5f71ddd85","Type":"ContainerDied","Data":"d97111c2461f508aa0dfede96e3862f3a0abae7c0bf1c5d34201d79555ad1bf0"} Dec 06 06:00:44 crc kubenswrapper[4733]: I1206 06:00:44.134707 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"512fab18-cdab-4502-86dd-33b5f71ddd85","Type":"ContainerDied","Data":"3a1ec94ed9ecc79408e0c5a79cda7d26e967165cd4a7bed791c3f5c8e06e8a1a"} Dec 06 06:00:44 crc kubenswrapper[4733]: I1206 06:00:44.262247 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 06 06:00:45 crc kubenswrapper[4733]: I1206 06:00:45.145451 4733 generic.go:334] "Generic (PLEG): container finished" podID="512fab18-cdab-4502-86dd-33b5f71ddd85" containerID="b690f1f85492cdb11530ec769f1d5523fa92ce33cb4dab23e1b8c9491f060417" exitCode=0 Dec 06 06:00:45 crc kubenswrapper[4733]: I1206 06:00:45.145717 4733 generic.go:334] "Generic (PLEG): container finished" podID="512fab18-cdab-4502-86dd-33b5f71ddd85" containerID="f965c114f9c5fabed3bad2c8cee2129f0a93b1ec7311d5e119dbe8bae332e7b3" exitCode=0 Dec 06 
06:00:45 crc kubenswrapper[4733]: I1206 06:00:45.145608 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"512fab18-cdab-4502-86dd-33b5f71ddd85","Type":"ContainerDied","Data":"b690f1f85492cdb11530ec769f1d5523fa92ce33cb4dab23e1b8c9491f060417"} Dec 06 06:00:45 crc kubenswrapper[4733]: I1206 06:00:45.145761 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"512fab18-cdab-4502-86dd-33b5f71ddd85","Type":"ContainerDied","Data":"f965c114f9c5fabed3bad2c8cee2129f0a93b1ec7311d5e119dbe8bae332e7b3"} Dec 06 06:00:47 crc kubenswrapper[4733]: I1206 06:00:47.558126 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 06:00:47 crc kubenswrapper[4733]: I1206 06:00:47.676982 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/512fab18-cdab-4502-86dd-33b5f71ddd85-combined-ca-bundle\") pod \"512fab18-cdab-4502-86dd-33b5f71ddd85\" (UID: \"512fab18-cdab-4502-86dd-33b5f71ddd85\") " Dec 06 06:00:47 crc kubenswrapper[4733]: I1206 06:00:47.677100 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/512fab18-cdab-4502-86dd-33b5f71ddd85-log-httpd\") pod \"512fab18-cdab-4502-86dd-33b5f71ddd85\" (UID: \"512fab18-cdab-4502-86dd-33b5f71ddd85\") " Dec 06 06:00:47 crc kubenswrapper[4733]: I1206 06:00:47.677170 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/512fab18-cdab-4502-86dd-33b5f71ddd85-scripts\") pod \"512fab18-cdab-4502-86dd-33b5f71ddd85\" (UID: \"512fab18-cdab-4502-86dd-33b5f71ddd85\") " Dec 06 06:00:47 crc kubenswrapper[4733]: I1206 06:00:47.677231 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsb2g\" (UniqueName: 
\"kubernetes.io/projected/512fab18-cdab-4502-86dd-33b5f71ddd85-kube-api-access-zsb2g\") pod \"512fab18-cdab-4502-86dd-33b5f71ddd85\" (UID: \"512fab18-cdab-4502-86dd-33b5f71ddd85\") " Dec 06 06:00:47 crc kubenswrapper[4733]: I1206 06:00:47.677399 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/512fab18-cdab-4502-86dd-33b5f71ddd85-config-data\") pod \"512fab18-cdab-4502-86dd-33b5f71ddd85\" (UID: \"512fab18-cdab-4502-86dd-33b5f71ddd85\") " Dec 06 06:00:47 crc kubenswrapper[4733]: I1206 06:00:47.677475 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/512fab18-cdab-4502-86dd-33b5f71ddd85-sg-core-conf-yaml\") pod \"512fab18-cdab-4502-86dd-33b5f71ddd85\" (UID: \"512fab18-cdab-4502-86dd-33b5f71ddd85\") " Dec 06 06:00:47 crc kubenswrapper[4733]: I1206 06:00:47.677622 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/512fab18-cdab-4502-86dd-33b5f71ddd85-run-httpd\") pod \"512fab18-cdab-4502-86dd-33b5f71ddd85\" (UID: \"512fab18-cdab-4502-86dd-33b5f71ddd85\") " Dec 06 06:00:47 crc kubenswrapper[4733]: I1206 06:00:47.677983 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/512fab18-cdab-4502-86dd-33b5f71ddd85-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "512fab18-cdab-4502-86dd-33b5f71ddd85" (UID: "512fab18-cdab-4502-86dd-33b5f71ddd85"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:00:47 crc kubenswrapper[4733]: I1206 06:00:47.678287 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/512fab18-cdab-4502-86dd-33b5f71ddd85-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "512fab18-cdab-4502-86dd-33b5f71ddd85" (UID: "512fab18-cdab-4502-86dd-33b5f71ddd85"). 
InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:00:47 crc kubenswrapper[4733]: I1206 06:00:47.678667 4733 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/512fab18-cdab-4502-86dd-33b5f71ddd85-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 06:00:47 crc kubenswrapper[4733]: I1206 06:00:47.678689 4733 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/512fab18-cdab-4502-86dd-33b5f71ddd85-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 06:00:47 crc kubenswrapper[4733]: I1206 06:00:47.684190 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/512fab18-cdab-4502-86dd-33b5f71ddd85-scripts" (OuterVolumeSpecName: "scripts") pod "512fab18-cdab-4502-86dd-33b5f71ddd85" (UID: "512fab18-cdab-4502-86dd-33b5f71ddd85"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:00:47 crc kubenswrapper[4733]: I1206 06:00:47.684200 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/512fab18-cdab-4502-86dd-33b5f71ddd85-kube-api-access-zsb2g" (OuterVolumeSpecName: "kube-api-access-zsb2g") pod "512fab18-cdab-4502-86dd-33b5f71ddd85" (UID: "512fab18-cdab-4502-86dd-33b5f71ddd85"). InnerVolumeSpecName "kube-api-access-zsb2g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:00:47 crc kubenswrapper[4733]: I1206 06:00:47.709260 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/512fab18-cdab-4502-86dd-33b5f71ddd85-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "512fab18-cdab-4502-86dd-33b5f71ddd85" (UID: "512fab18-cdab-4502-86dd-33b5f71ddd85"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:00:47 crc kubenswrapper[4733]: I1206 06:00:47.744347 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/512fab18-cdab-4502-86dd-33b5f71ddd85-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "512fab18-cdab-4502-86dd-33b5f71ddd85" (UID: "512fab18-cdab-4502-86dd-33b5f71ddd85"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:00:47 crc kubenswrapper[4733]: I1206 06:00:47.771605 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/512fab18-cdab-4502-86dd-33b5f71ddd85-config-data" (OuterVolumeSpecName: "config-data") pod "512fab18-cdab-4502-86dd-33b5f71ddd85" (UID: "512fab18-cdab-4502-86dd-33b5f71ddd85"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:00:47 crc kubenswrapper[4733]: I1206 06:00:47.780331 4733 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/512fab18-cdab-4502-86dd-33b5f71ddd85-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 06:00:47 crc kubenswrapper[4733]: I1206 06:00:47.780370 4733 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/512fab18-cdab-4502-86dd-33b5f71ddd85-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 06:00:47 crc kubenswrapper[4733]: I1206 06:00:47.780381 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zsb2g\" (UniqueName: \"kubernetes.io/projected/512fab18-cdab-4502-86dd-33b5f71ddd85-kube-api-access-zsb2g\") on node \"crc\" DevicePath \"\"" Dec 06 06:00:47 crc kubenswrapper[4733]: I1206 06:00:47.780394 4733 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/512fab18-cdab-4502-86dd-33b5f71ddd85-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 
06:00:47 crc kubenswrapper[4733]: I1206 06:00:47.780403 4733 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/512fab18-cdab-4502-86dd-33b5f71ddd85-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 06 06:00:48 crc kubenswrapper[4733]: I1206 06:00:48.181511 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"512fab18-cdab-4502-86dd-33b5f71ddd85","Type":"ContainerDied","Data":"a0a1785c2c385494e19f61110fddb39281f1b3829cede309d50f8e5c0d4be995"} Dec 06 06:00:48 crc kubenswrapper[4733]: I1206 06:00:48.182029 4733 scope.go:117] "RemoveContainer" containerID="d97111c2461f508aa0dfede96e3862f3a0abae7c0bf1c5d34201d79555ad1bf0" Dec 06 06:00:48 crc kubenswrapper[4733]: I1206 06:00:48.181555 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 06:00:48 crc kubenswrapper[4733]: I1206 06:00:48.184084 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"8b5bd873-9187-4b04-9274-fd413c995524","Type":"ContainerStarted","Data":"305b904e89bccac809f84cb82efc75b884fd9afda0e5d4bee6192ce949a852e7"} Dec 06 06:00:48 crc kubenswrapper[4733]: I1206 06:00:48.217792 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.615740936 podStartE2EDuration="16.217774607s" podCreationTimestamp="2025-12-06 06:00:32 +0000 UTC" firstStartedPulling="2025-12-06 06:00:33.725963594 +0000 UTC m=+1017.591174704" lastFinishedPulling="2025-12-06 06:00:47.327997264 +0000 UTC m=+1031.193208375" observedRunningTime="2025-12-06 06:00:48.21497923 +0000 UTC m=+1032.080190342" watchObservedRunningTime="2025-12-06 06:00:48.217774607 +0000 UTC m=+1032.082985718" Dec 06 06:00:48 crc kubenswrapper[4733]: I1206 06:00:48.221049 4733 scope.go:117] "RemoveContainer" 
containerID="3a1ec94ed9ecc79408e0c5a79cda7d26e967165cd4a7bed791c3f5c8e06e8a1a" Dec 06 06:00:48 crc kubenswrapper[4733]: I1206 06:00:48.243451 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 06:00:48 crc kubenswrapper[4733]: I1206 06:00:48.249840 4733 scope.go:117] "RemoveContainer" containerID="b690f1f85492cdb11530ec769f1d5523fa92ce33cb4dab23e1b8c9491f060417" Dec 06 06:00:48 crc kubenswrapper[4733]: I1206 06:00:48.259885 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 06 06:00:48 crc kubenswrapper[4733]: I1206 06:00:48.266971 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 06 06:00:48 crc kubenswrapper[4733]: E1206 06:00:48.267404 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="512fab18-cdab-4502-86dd-33b5f71ddd85" containerName="proxy-httpd" Dec 06 06:00:48 crc kubenswrapper[4733]: I1206 06:00:48.267425 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="512fab18-cdab-4502-86dd-33b5f71ddd85" containerName="proxy-httpd" Dec 06 06:00:48 crc kubenswrapper[4733]: E1206 06:00:48.267457 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="512fab18-cdab-4502-86dd-33b5f71ddd85" containerName="ceilometer-central-agent" Dec 06 06:00:48 crc kubenswrapper[4733]: I1206 06:00:48.267465 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="512fab18-cdab-4502-86dd-33b5f71ddd85" containerName="ceilometer-central-agent" Dec 06 06:00:48 crc kubenswrapper[4733]: E1206 06:00:48.267483 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="512fab18-cdab-4502-86dd-33b5f71ddd85" containerName="ceilometer-notification-agent" Dec 06 06:00:48 crc kubenswrapper[4733]: I1206 06:00:48.267489 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="512fab18-cdab-4502-86dd-33b5f71ddd85" containerName="ceilometer-notification-agent" Dec 06 06:00:48 crc kubenswrapper[4733]: E1206 06:00:48.267504 4733 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="512fab18-cdab-4502-86dd-33b5f71ddd85" containerName="sg-core" Dec 06 06:00:48 crc kubenswrapper[4733]: I1206 06:00:48.267510 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="512fab18-cdab-4502-86dd-33b5f71ddd85" containerName="sg-core" Dec 06 06:00:48 crc kubenswrapper[4733]: I1206 06:00:48.267704 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="512fab18-cdab-4502-86dd-33b5f71ddd85" containerName="sg-core" Dec 06 06:00:48 crc kubenswrapper[4733]: I1206 06:00:48.267722 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="512fab18-cdab-4502-86dd-33b5f71ddd85" containerName="ceilometer-central-agent" Dec 06 06:00:48 crc kubenswrapper[4733]: I1206 06:00:48.267732 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="512fab18-cdab-4502-86dd-33b5f71ddd85" containerName="ceilometer-notification-agent" Dec 06 06:00:48 crc kubenswrapper[4733]: I1206 06:00:48.267756 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="512fab18-cdab-4502-86dd-33b5f71ddd85" containerName="proxy-httpd" Dec 06 06:00:48 crc kubenswrapper[4733]: I1206 06:00:48.278226 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 06:00:48 crc kubenswrapper[4733]: I1206 06:00:48.278375 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 06:00:48 crc kubenswrapper[4733]: I1206 06:00:48.281864 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 06 06:00:48 crc kubenswrapper[4733]: I1206 06:00:48.282017 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 06 06:00:48 crc kubenswrapper[4733]: I1206 06:00:48.283281 4733 scope.go:117] "RemoveContainer" containerID="f965c114f9c5fabed3bad2c8cee2129f0a93b1ec7311d5e119dbe8bae332e7b3" Dec 06 06:00:48 crc kubenswrapper[4733]: I1206 06:00:48.392491 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f05eb385-58e5-4206-8fc3-8fda3efc9457-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f05eb385-58e5-4206-8fc3-8fda3efc9457\") " pod="openstack/ceilometer-0" Dec 06 06:00:48 crc kubenswrapper[4733]: I1206 06:00:48.392601 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f05eb385-58e5-4206-8fc3-8fda3efc9457-run-httpd\") pod \"ceilometer-0\" (UID: \"f05eb385-58e5-4206-8fc3-8fda3efc9457\") " pod="openstack/ceilometer-0" Dec 06 06:00:48 crc kubenswrapper[4733]: I1206 06:00:48.392619 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f05eb385-58e5-4206-8fc3-8fda3efc9457-log-httpd\") pod \"ceilometer-0\" (UID: \"f05eb385-58e5-4206-8fc3-8fda3efc9457\") " pod="openstack/ceilometer-0" Dec 06 06:00:48 crc kubenswrapper[4733]: I1206 06:00:48.392652 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f05eb385-58e5-4206-8fc3-8fda3efc9457-scripts\") pod \"ceilometer-0\" (UID: 
\"f05eb385-58e5-4206-8fc3-8fda3efc9457\") " pod="openstack/ceilometer-0" Dec 06 06:00:48 crc kubenswrapper[4733]: I1206 06:00:48.392679 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5w26q\" (UniqueName: \"kubernetes.io/projected/f05eb385-58e5-4206-8fc3-8fda3efc9457-kube-api-access-5w26q\") pod \"ceilometer-0\" (UID: \"f05eb385-58e5-4206-8fc3-8fda3efc9457\") " pod="openstack/ceilometer-0" Dec 06 06:00:48 crc kubenswrapper[4733]: I1206 06:00:48.392703 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f05eb385-58e5-4206-8fc3-8fda3efc9457-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f05eb385-58e5-4206-8fc3-8fda3efc9457\") " pod="openstack/ceilometer-0" Dec 06 06:00:48 crc kubenswrapper[4733]: I1206 06:00:48.392808 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f05eb385-58e5-4206-8fc3-8fda3efc9457-config-data\") pod \"ceilometer-0\" (UID: \"f05eb385-58e5-4206-8fc3-8fda3efc9457\") " pod="openstack/ceilometer-0" Dec 06 06:00:48 crc kubenswrapper[4733]: I1206 06:00:48.494520 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f05eb385-58e5-4206-8fc3-8fda3efc9457-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f05eb385-58e5-4206-8fc3-8fda3efc9457\") " pod="openstack/ceilometer-0" Dec 06 06:00:48 crc kubenswrapper[4733]: I1206 06:00:48.494585 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f05eb385-58e5-4206-8fc3-8fda3efc9457-run-httpd\") pod \"ceilometer-0\" (UID: \"f05eb385-58e5-4206-8fc3-8fda3efc9457\") " pod="openstack/ceilometer-0" Dec 06 06:00:48 crc kubenswrapper[4733]: I1206 06:00:48.494607 4733 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f05eb385-58e5-4206-8fc3-8fda3efc9457-log-httpd\") pod \"ceilometer-0\" (UID: \"f05eb385-58e5-4206-8fc3-8fda3efc9457\") " pod="openstack/ceilometer-0" Dec 06 06:00:48 crc kubenswrapper[4733]: I1206 06:00:48.494629 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f05eb385-58e5-4206-8fc3-8fda3efc9457-scripts\") pod \"ceilometer-0\" (UID: \"f05eb385-58e5-4206-8fc3-8fda3efc9457\") " pod="openstack/ceilometer-0" Dec 06 06:00:48 crc kubenswrapper[4733]: I1206 06:00:48.494654 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5w26q\" (UniqueName: \"kubernetes.io/projected/f05eb385-58e5-4206-8fc3-8fda3efc9457-kube-api-access-5w26q\") pod \"ceilometer-0\" (UID: \"f05eb385-58e5-4206-8fc3-8fda3efc9457\") " pod="openstack/ceilometer-0" Dec 06 06:00:48 crc kubenswrapper[4733]: I1206 06:00:48.494679 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f05eb385-58e5-4206-8fc3-8fda3efc9457-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f05eb385-58e5-4206-8fc3-8fda3efc9457\") " pod="openstack/ceilometer-0" Dec 06 06:00:48 crc kubenswrapper[4733]: I1206 06:00:48.494729 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f05eb385-58e5-4206-8fc3-8fda3efc9457-config-data\") pod \"ceilometer-0\" (UID: \"f05eb385-58e5-4206-8fc3-8fda3efc9457\") " pod="openstack/ceilometer-0" Dec 06 06:00:48 crc kubenswrapper[4733]: I1206 06:00:48.496506 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="512fab18-cdab-4502-86dd-33b5f71ddd85" path="/var/lib/kubelet/pods/512fab18-cdab-4502-86dd-33b5f71ddd85/volumes" Dec 06 06:00:48 crc kubenswrapper[4733]: I1206 
06:00:48.498193 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f05eb385-58e5-4206-8fc3-8fda3efc9457-log-httpd\") pod \"ceilometer-0\" (UID: \"f05eb385-58e5-4206-8fc3-8fda3efc9457\") " pod="openstack/ceilometer-0" Dec 06 06:00:48 crc kubenswrapper[4733]: I1206 06:00:48.499229 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f05eb385-58e5-4206-8fc3-8fda3efc9457-run-httpd\") pod \"ceilometer-0\" (UID: \"f05eb385-58e5-4206-8fc3-8fda3efc9457\") " pod="openstack/ceilometer-0" Dec 06 06:00:48 crc kubenswrapper[4733]: I1206 06:00:48.501720 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f05eb385-58e5-4206-8fc3-8fda3efc9457-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f05eb385-58e5-4206-8fc3-8fda3efc9457\") " pod="openstack/ceilometer-0" Dec 06 06:00:48 crc kubenswrapper[4733]: I1206 06:00:48.501976 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f05eb385-58e5-4206-8fc3-8fda3efc9457-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f05eb385-58e5-4206-8fc3-8fda3efc9457\") " pod="openstack/ceilometer-0" Dec 06 06:00:48 crc kubenswrapper[4733]: I1206 06:00:48.502046 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f05eb385-58e5-4206-8fc3-8fda3efc9457-config-data\") pod \"ceilometer-0\" (UID: \"f05eb385-58e5-4206-8fc3-8fda3efc9457\") " pod="openstack/ceilometer-0" Dec 06 06:00:48 crc kubenswrapper[4733]: I1206 06:00:48.503041 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f05eb385-58e5-4206-8fc3-8fda3efc9457-scripts\") pod \"ceilometer-0\" (UID: \"f05eb385-58e5-4206-8fc3-8fda3efc9457\") " 
pod="openstack/ceilometer-0" Dec 06 06:00:48 crc kubenswrapper[4733]: I1206 06:00:48.515535 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5w26q\" (UniqueName: \"kubernetes.io/projected/f05eb385-58e5-4206-8fc3-8fda3efc9457-kube-api-access-5w26q\") pod \"ceilometer-0\" (UID: \"f05eb385-58e5-4206-8fc3-8fda3efc9457\") " pod="openstack/ceilometer-0" Dec 06 06:00:48 crc kubenswrapper[4733]: I1206 06:00:48.625801 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 06:00:49 crc kubenswrapper[4733]: I1206 06:00:49.061109 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 06:00:49 crc kubenswrapper[4733]: I1206 06:00:49.197057 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f05eb385-58e5-4206-8fc3-8fda3efc9457","Type":"ContainerStarted","Data":"51c834e020227e4b314ac238c28f1c0207a01958c7df52056c90b4efeac28b6f"} Dec 06 06:00:50 crc kubenswrapper[4733]: I1206 06:00:50.214694 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f05eb385-58e5-4206-8fc3-8fda3efc9457","Type":"ContainerStarted","Data":"6469d6034c6630f0525ef587fb354a3b851c275f433fdd34f06256d07c43adc6"} Dec 06 06:00:51 crc kubenswrapper[4733]: I1206 06:00:51.255334 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f05eb385-58e5-4206-8fc3-8fda3efc9457","Type":"ContainerStarted","Data":"f1aaee294be805a7b4e30901f5a5430bb00f5dd2f7f8a037ee9e3afd06855664"} Dec 06 06:00:52 crc kubenswrapper[4733]: I1206 06:00:52.265563 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f05eb385-58e5-4206-8fc3-8fda3efc9457","Type":"ContainerStarted","Data":"ffdf986b200303f48c103b85d0edfb4471e19a21c47c66224b20f4cf9a8b1e5a"} Dec 06 06:00:53 crc kubenswrapper[4733]: I1206 06:00:53.275158 4733 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f05eb385-58e5-4206-8fc3-8fda3efc9457","Type":"ContainerStarted","Data":"46ed190b7bc2e0f92a5ccb2b4df2eae97ce6a1309a0e331435e267048a8f707a"} Dec 06 06:00:53 crc kubenswrapper[4733]: I1206 06:00:53.275603 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 06 06:00:53 crc kubenswrapper[4733]: I1206 06:00:53.298543 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.48926872 podStartE2EDuration="5.298517089s" podCreationTimestamp="2025-12-06 06:00:48 +0000 UTC" firstStartedPulling="2025-12-06 06:00:49.066756377 +0000 UTC m=+1032.931967487" lastFinishedPulling="2025-12-06 06:00:52.876004745 +0000 UTC m=+1036.741215856" observedRunningTime="2025-12-06 06:00:53.289300016 +0000 UTC m=+1037.154511127" watchObservedRunningTime="2025-12-06 06:00:53.298517089 +0000 UTC m=+1037.163728201" Dec 06 06:00:55 crc kubenswrapper[4733]: I1206 06:00:55.300313 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-pz8w7"] Dec 06 06:00:55 crc kubenswrapper[4733]: I1206 06:00:55.301994 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-pz8w7" Dec 06 06:00:55 crc kubenswrapper[4733]: I1206 06:00:55.319388 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-pz8w7"] Dec 06 06:00:55 crc kubenswrapper[4733]: I1206 06:00:55.333890 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/328eb573-b5aa-494f-bc1d-588c43edcb9f-operator-scripts\") pod \"nova-api-db-create-pz8w7\" (UID: \"328eb573-b5aa-494f-bc1d-588c43edcb9f\") " pod="openstack/nova-api-db-create-pz8w7" Dec 06 06:00:55 crc kubenswrapper[4733]: I1206 06:00:55.333966 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdqtl\" (UniqueName: \"kubernetes.io/projected/328eb573-b5aa-494f-bc1d-588c43edcb9f-kube-api-access-xdqtl\") pod \"nova-api-db-create-pz8w7\" (UID: \"328eb573-b5aa-494f-bc1d-588c43edcb9f\") " pod="openstack/nova-api-db-create-pz8w7" Dec 06 06:00:55 crc kubenswrapper[4733]: I1206 06:00:55.403378 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-csq6r"] Dec 06 06:00:55 crc kubenswrapper[4733]: I1206 06:00:55.404919 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-csq6r" Dec 06 06:00:55 crc kubenswrapper[4733]: I1206 06:00:55.416559 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-7733-account-create-update-tvk6f"] Dec 06 06:00:55 crc kubenswrapper[4733]: I1206 06:00:55.418044 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-7733-account-create-update-tvk6f" Dec 06 06:00:55 crc kubenswrapper[4733]: I1206 06:00:55.419796 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 06 06:00:55 crc kubenswrapper[4733]: I1206 06:00:55.424733 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-csq6r"] Dec 06 06:00:55 crc kubenswrapper[4733]: I1206 06:00:55.440344 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-7733-account-create-update-tvk6f"] Dec 06 06:00:55 crc kubenswrapper[4733]: I1206 06:00:55.441401 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b88b7105-3d6e-44a7-965a-1fa56297537b-operator-scripts\") pod \"nova-api-7733-account-create-update-tvk6f\" (UID: \"b88b7105-3d6e-44a7-965a-1fa56297537b\") " pod="openstack/nova-api-7733-account-create-update-tvk6f" Dec 06 06:00:55 crc kubenswrapper[4733]: I1206 06:00:55.441478 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggmzj\" (UniqueName: \"kubernetes.io/projected/b88b7105-3d6e-44a7-965a-1fa56297537b-kube-api-access-ggmzj\") pod \"nova-api-7733-account-create-update-tvk6f\" (UID: \"b88b7105-3d6e-44a7-965a-1fa56297537b\") " pod="openstack/nova-api-7733-account-create-update-tvk6f" Dec 06 06:00:55 crc kubenswrapper[4733]: I1206 06:00:55.441552 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29sx5\" (UniqueName: \"kubernetes.io/projected/ee2e9091-1bb1-4930-bff8-35c3cced4104-kube-api-access-29sx5\") pod \"nova-cell0-db-create-csq6r\" (UID: \"ee2e9091-1bb1-4930-bff8-35c3cced4104\") " pod="openstack/nova-cell0-db-create-csq6r" Dec 06 06:00:55 crc kubenswrapper[4733]: I1206 06:00:55.441595 4733 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/328eb573-b5aa-494f-bc1d-588c43edcb9f-operator-scripts\") pod \"nova-api-db-create-pz8w7\" (UID: \"328eb573-b5aa-494f-bc1d-588c43edcb9f\") " pod="openstack/nova-api-db-create-pz8w7" Dec 06 06:00:55 crc kubenswrapper[4733]: I1206 06:00:55.441639 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdqtl\" (UniqueName: \"kubernetes.io/projected/328eb573-b5aa-494f-bc1d-588c43edcb9f-kube-api-access-xdqtl\") pod \"nova-api-db-create-pz8w7\" (UID: \"328eb573-b5aa-494f-bc1d-588c43edcb9f\") " pod="openstack/nova-api-db-create-pz8w7" Dec 06 06:00:55 crc kubenswrapper[4733]: I1206 06:00:55.441697 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee2e9091-1bb1-4930-bff8-35c3cced4104-operator-scripts\") pod \"nova-cell0-db-create-csq6r\" (UID: \"ee2e9091-1bb1-4930-bff8-35c3cced4104\") " pod="openstack/nova-cell0-db-create-csq6r" Dec 06 06:00:55 crc kubenswrapper[4733]: I1206 06:00:55.442508 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/328eb573-b5aa-494f-bc1d-588c43edcb9f-operator-scripts\") pod \"nova-api-db-create-pz8w7\" (UID: \"328eb573-b5aa-494f-bc1d-588c43edcb9f\") " pod="openstack/nova-api-db-create-pz8w7" Dec 06 06:00:55 crc kubenswrapper[4733]: I1206 06:00:55.469234 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdqtl\" (UniqueName: \"kubernetes.io/projected/328eb573-b5aa-494f-bc1d-588c43edcb9f-kube-api-access-xdqtl\") pod \"nova-api-db-create-pz8w7\" (UID: \"328eb573-b5aa-494f-bc1d-588c43edcb9f\") " pod="openstack/nova-api-db-create-pz8w7" Dec 06 06:00:55 crc kubenswrapper[4733]: I1206 06:00:55.543515 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-29sx5\" (UniqueName: \"kubernetes.io/projected/ee2e9091-1bb1-4930-bff8-35c3cced4104-kube-api-access-29sx5\") pod \"nova-cell0-db-create-csq6r\" (UID: \"ee2e9091-1bb1-4930-bff8-35c3cced4104\") " pod="openstack/nova-cell0-db-create-csq6r" Dec 06 06:00:55 crc kubenswrapper[4733]: I1206 06:00:55.543619 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee2e9091-1bb1-4930-bff8-35c3cced4104-operator-scripts\") pod \"nova-cell0-db-create-csq6r\" (UID: \"ee2e9091-1bb1-4930-bff8-35c3cced4104\") " pod="openstack/nova-cell0-db-create-csq6r" Dec 06 06:00:55 crc kubenswrapper[4733]: I1206 06:00:55.543702 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b88b7105-3d6e-44a7-965a-1fa56297537b-operator-scripts\") pod \"nova-api-7733-account-create-update-tvk6f\" (UID: \"b88b7105-3d6e-44a7-965a-1fa56297537b\") " pod="openstack/nova-api-7733-account-create-update-tvk6f" Dec 06 06:00:55 crc kubenswrapper[4733]: I1206 06:00:55.543742 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggmzj\" (UniqueName: \"kubernetes.io/projected/b88b7105-3d6e-44a7-965a-1fa56297537b-kube-api-access-ggmzj\") pod \"nova-api-7733-account-create-update-tvk6f\" (UID: \"b88b7105-3d6e-44a7-965a-1fa56297537b\") " pod="openstack/nova-api-7733-account-create-update-tvk6f" Dec 06 06:00:55 crc kubenswrapper[4733]: I1206 06:00:55.545667 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee2e9091-1bb1-4930-bff8-35c3cced4104-operator-scripts\") pod \"nova-cell0-db-create-csq6r\" (UID: \"ee2e9091-1bb1-4930-bff8-35c3cced4104\") " pod="openstack/nova-cell0-db-create-csq6r" Dec 06 06:00:55 crc kubenswrapper[4733]: I1206 06:00:55.546522 4733 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b88b7105-3d6e-44a7-965a-1fa56297537b-operator-scripts\") pod \"nova-api-7733-account-create-update-tvk6f\" (UID: \"b88b7105-3d6e-44a7-965a-1fa56297537b\") " pod="openstack/nova-api-7733-account-create-update-tvk6f" Dec 06 06:00:55 crc kubenswrapper[4733]: I1206 06:00:55.563509 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29sx5\" (UniqueName: \"kubernetes.io/projected/ee2e9091-1bb1-4930-bff8-35c3cced4104-kube-api-access-29sx5\") pod \"nova-cell0-db-create-csq6r\" (UID: \"ee2e9091-1bb1-4930-bff8-35c3cced4104\") " pod="openstack/nova-cell0-db-create-csq6r" Dec 06 06:00:55 crc kubenswrapper[4733]: I1206 06:00:55.564261 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggmzj\" (UniqueName: \"kubernetes.io/projected/b88b7105-3d6e-44a7-965a-1fa56297537b-kube-api-access-ggmzj\") pod \"nova-api-7733-account-create-update-tvk6f\" (UID: \"b88b7105-3d6e-44a7-965a-1fa56297537b\") " pod="openstack/nova-api-7733-account-create-update-tvk6f" Dec 06 06:00:55 crc kubenswrapper[4733]: I1206 06:00:55.616105 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-be38-account-create-update-gbx8w"] Dec 06 06:00:55 crc kubenswrapper[4733]: I1206 06:00:55.617610 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-be38-account-create-update-gbx8w" Dec 06 06:00:55 crc kubenswrapper[4733]: I1206 06:00:55.619728 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 06 06:00:55 crc kubenswrapper[4733]: I1206 06:00:55.620733 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-pz8w7" Dec 06 06:00:55 crc kubenswrapper[4733]: I1206 06:00:55.622741 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-hn5h8"] Dec 06 06:00:55 crc kubenswrapper[4733]: I1206 06:00:55.624163 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-hn5h8" Dec 06 06:00:55 crc kubenswrapper[4733]: I1206 06:00:55.629386 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-be38-account-create-update-gbx8w"] Dec 06 06:00:55 crc kubenswrapper[4733]: I1206 06:00:55.637471 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-hn5h8"] Dec 06 06:00:55 crc kubenswrapper[4733]: I1206 06:00:55.663908 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e651b5d7-8b07-42b1-9189-af0896ea5a16-operator-scripts\") pod \"nova-cell1-db-create-hn5h8\" (UID: \"e651b5d7-8b07-42b1-9189-af0896ea5a16\") " pod="openstack/nova-cell1-db-create-hn5h8" Dec 06 06:00:55 crc kubenswrapper[4733]: I1206 06:00:55.663974 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlsv5\" (UniqueName: \"kubernetes.io/projected/bf31d052-0c6b-4d46-8d72-8c7e247aad00-kube-api-access-hlsv5\") pod \"nova-cell0-be38-account-create-update-gbx8w\" (UID: \"bf31d052-0c6b-4d46-8d72-8c7e247aad00\") " pod="openstack/nova-cell0-be38-account-create-update-gbx8w" Dec 06 06:00:55 crc kubenswrapper[4733]: I1206 06:00:55.664076 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92wjj\" (UniqueName: \"kubernetes.io/projected/e651b5d7-8b07-42b1-9189-af0896ea5a16-kube-api-access-92wjj\") pod \"nova-cell1-db-create-hn5h8\" (UID: \"e651b5d7-8b07-42b1-9189-af0896ea5a16\") " 
pod="openstack/nova-cell1-db-create-hn5h8" Dec 06 06:00:55 crc kubenswrapper[4733]: I1206 06:00:55.665007 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf31d052-0c6b-4d46-8d72-8c7e247aad00-operator-scripts\") pod \"nova-cell0-be38-account-create-update-gbx8w\" (UID: \"bf31d052-0c6b-4d46-8d72-8c7e247aad00\") " pod="openstack/nova-cell0-be38-account-create-update-gbx8w" Dec 06 06:00:55 crc kubenswrapper[4733]: I1206 06:00:55.730117 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-csq6r" Dec 06 06:00:55 crc kubenswrapper[4733]: I1206 06:00:55.743793 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-7733-account-create-update-tvk6f" Dec 06 06:00:55 crc kubenswrapper[4733]: I1206 06:00:55.770505 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e651b5d7-8b07-42b1-9189-af0896ea5a16-operator-scripts\") pod \"nova-cell1-db-create-hn5h8\" (UID: \"e651b5d7-8b07-42b1-9189-af0896ea5a16\") " pod="openstack/nova-cell1-db-create-hn5h8" Dec 06 06:00:55 crc kubenswrapper[4733]: I1206 06:00:55.770716 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlsv5\" (UniqueName: \"kubernetes.io/projected/bf31d052-0c6b-4d46-8d72-8c7e247aad00-kube-api-access-hlsv5\") pod \"nova-cell0-be38-account-create-update-gbx8w\" (UID: \"bf31d052-0c6b-4d46-8d72-8c7e247aad00\") " pod="openstack/nova-cell0-be38-account-create-update-gbx8w" Dec 06 06:00:55 crc kubenswrapper[4733]: I1206 06:00:55.770762 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92wjj\" (UniqueName: \"kubernetes.io/projected/e651b5d7-8b07-42b1-9189-af0896ea5a16-kube-api-access-92wjj\") pod \"nova-cell1-db-create-hn5h8\" (UID: 
\"e651b5d7-8b07-42b1-9189-af0896ea5a16\") " pod="openstack/nova-cell1-db-create-hn5h8" Dec 06 06:00:55 crc kubenswrapper[4733]: I1206 06:00:55.770839 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf31d052-0c6b-4d46-8d72-8c7e247aad00-operator-scripts\") pod \"nova-cell0-be38-account-create-update-gbx8w\" (UID: \"bf31d052-0c6b-4d46-8d72-8c7e247aad00\") " pod="openstack/nova-cell0-be38-account-create-update-gbx8w" Dec 06 06:00:55 crc kubenswrapper[4733]: I1206 06:00:55.771941 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf31d052-0c6b-4d46-8d72-8c7e247aad00-operator-scripts\") pod \"nova-cell0-be38-account-create-update-gbx8w\" (UID: \"bf31d052-0c6b-4d46-8d72-8c7e247aad00\") " pod="openstack/nova-cell0-be38-account-create-update-gbx8w" Dec 06 06:00:55 crc kubenswrapper[4733]: I1206 06:00:55.772416 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e651b5d7-8b07-42b1-9189-af0896ea5a16-operator-scripts\") pod \"nova-cell1-db-create-hn5h8\" (UID: \"e651b5d7-8b07-42b1-9189-af0896ea5a16\") " pod="openstack/nova-cell1-db-create-hn5h8" Dec 06 06:00:55 crc kubenswrapper[4733]: I1206 06:00:55.798263 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92wjj\" (UniqueName: \"kubernetes.io/projected/e651b5d7-8b07-42b1-9189-af0896ea5a16-kube-api-access-92wjj\") pod \"nova-cell1-db-create-hn5h8\" (UID: \"e651b5d7-8b07-42b1-9189-af0896ea5a16\") " pod="openstack/nova-cell1-db-create-hn5h8" Dec 06 06:00:55 crc kubenswrapper[4733]: I1206 06:00:55.803744 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlsv5\" (UniqueName: \"kubernetes.io/projected/bf31d052-0c6b-4d46-8d72-8c7e247aad00-kube-api-access-hlsv5\") pod 
\"nova-cell0-be38-account-create-update-gbx8w\" (UID: \"bf31d052-0c6b-4d46-8d72-8c7e247aad00\") " pod="openstack/nova-cell0-be38-account-create-update-gbx8w" Dec 06 06:00:55 crc kubenswrapper[4733]: I1206 06:00:55.810910 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-0451-account-create-update-kldhp"] Dec 06 06:00:55 crc kubenswrapper[4733]: I1206 06:00:55.814263 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-0451-account-create-update-kldhp" Dec 06 06:00:55 crc kubenswrapper[4733]: I1206 06:00:55.817228 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 06 06:00:55 crc kubenswrapper[4733]: I1206 06:00:55.844376 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-0451-account-create-update-kldhp"] Dec 06 06:00:55 crc kubenswrapper[4733]: I1206 06:00:55.872424 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccwqc\" (UniqueName: \"kubernetes.io/projected/44d743aa-a7d8-4b31-a1f1-35ad3b97b8e8-kube-api-access-ccwqc\") pod \"nova-cell1-0451-account-create-update-kldhp\" (UID: \"44d743aa-a7d8-4b31-a1f1-35ad3b97b8e8\") " pod="openstack/nova-cell1-0451-account-create-update-kldhp" Dec 06 06:00:55 crc kubenswrapper[4733]: I1206 06:00:55.872557 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44d743aa-a7d8-4b31-a1f1-35ad3b97b8e8-operator-scripts\") pod \"nova-cell1-0451-account-create-update-kldhp\" (UID: \"44d743aa-a7d8-4b31-a1f1-35ad3b97b8e8\") " pod="openstack/nova-cell1-0451-account-create-update-kldhp" Dec 06 06:00:55 crc kubenswrapper[4733]: I1206 06:00:55.975524 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/44d743aa-a7d8-4b31-a1f1-35ad3b97b8e8-operator-scripts\") pod \"nova-cell1-0451-account-create-update-kldhp\" (UID: \"44d743aa-a7d8-4b31-a1f1-35ad3b97b8e8\") " pod="openstack/nova-cell1-0451-account-create-update-kldhp" Dec 06 06:00:55 crc kubenswrapper[4733]: I1206 06:00:55.975812 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccwqc\" (UniqueName: \"kubernetes.io/projected/44d743aa-a7d8-4b31-a1f1-35ad3b97b8e8-kube-api-access-ccwqc\") pod \"nova-cell1-0451-account-create-update-kldhp\" (UID: \"44d743aa-a7d8-4b31-a1f1-35ad3b97b8e8\") " pod="openstack/nova-cell1-0451-account-create-update-kldhp" Dec 06 06:00:55 crc kubenswrapper[4733]: I1206 06:00:55.977091 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44d743aa-a7d8-4b31-a1f1-35ad3b97b8e8-operator-scripts\") pod \"nova-cell1-0451-account-create-update-kldhp\" (UID: \"44d743aa-a7d8-4b31-a1f1-35ad3b97b8e8\") " pod="openstack/nova-cell1-0451-account-create-update-kldhp" Dec 06 06:00:55 crc kubenswrapper[4733]: I1206 06:00:55.993182 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccwqc\" (UniqueName: \"kubernetes.io/projected/44d743aa-a7d8-4b31-a1f1-35ad3b97b8e8-kube-api-access-ccwqc\") pod \"nova-cell1-0451-account-create-update-kldhp\" (UID: \"44d743aa-a7d8-4b31-a1f1-35ad3b97b8e8\") " pod="openstack/nova-cell1-0451-account-create-update-kldhp" Dec 06 06:00:56 crc kubenswrapper[4733]: I1206 06:00:56.024018 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-be38-account-create-update-gbx8w" Dec 06 06:00:56 crc kubenswrapper[4733]: I1206 06:00:56.029381 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-hn5h8" Dec 06 06:00:56 crc kubenswrapper[4733]: I1206 06:00:56.122556 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-pz8w7"] Dec 06 06:00:56 crc kubenswrapper[4733]: I1206 06:00:56.219473 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-0451-account-create-update-kldhp" Dec 06 06:00:56 crc kubenswrapper[4733]: I1206 06:00:56.258466 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-7733-account-create-update-tvk6f"] Dec 06 06:00:56 crc kubenswrapper[4733]: W1206 06:00:56.273018 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb88b7105_3d6e_44a7_965a_1fa56297537b.slice/crio-dcd15cea8da3c196bfc72f87bd7043803c1f7704d37e909dd3868e94296f9b98 WatchSource:0}: Error finding container dcd15cea8da3c196bfc72f87bd7043803c1f7704d37e909dd3868e94296f9b98: Status 404 returned error can't find the container with id dcd15cea8da3c196bfc72f87bd7043803c1f7704d37e909dd3868e94296f9b98 Dec 06 06:00:56 crc kubenswrapper[4733]: I1206 06:00:56.302746 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-csq6r"] Dec 06 06:00:56 crc kubenswrapper[4733]: I1206 06:00:56.307113 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-7733-account-create-update-tvk6f" event={"ID":"b88b7105-3d6e-44a7-965a-1fa56297537b","Type":"ContainerStarted","Data":"dcd15cea8da3c196bfc72f87bd7043803c1f7704d37e909dd3868e94296f9b98"} Dec 06 06:00:56 crc kubenswrapper[4733]: I1206 06:00:56.310556 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-pz8w7" event={"ID":"328eb573-b5aa-494f-bc1d-588c43edcb9f","Type":"ContainerStarted","Data":"6b510090b7625e14e853cec12589802540367d91165833d308958cbe33274ac6"} Dec 06 06:00:56 crc kubenswrapper[4733]: W1206 
06:00:56.313774 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee2e9091_1bb1_4930_bff8_35c3cced4104.slice/crio-a0be51de4dc5aea5c9c6a316aee2e08c5a7c79d66c7485dd43b039a90c8fe07e WatchSource:0}: Error finding container a0be51de4dc5aea5c9c6a316aee2e08c5a7c79d66c7485dd43b039a90c8fe07e: Status 404 returned error can't find the container with id a0be51de4dc5aea5c9c6a316aee2e08c5a7c79d66c7485dd43b039a90c8fe07e Dec 06 06:00:56 crc kubenswrapper[4733]: I1206 06:00:56.514484 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-be38-account-create-update-gbx8w"] Dec 06 06:00:56 crc kubenswrapper[4733]: I1206 06:00:56.631794 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-hn5h8"] Dec 06 06:00:56 crc kubenswrapper[4733]: I1206 06:00:56.817977 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-0451-account-create-update-kldhp"] Dec 06 06:00:56 crc kubenswrapper[4733]: W1206 06:00:56.821666 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44d743aa_a7d8_4b31_a1f1_35ad3b97b8e8.slice/crio-68d4a074f3a7a8effb06bc2fb3e753adc3b00305da4a8dba9b2fb7c842c1af19 WatchSource:0}: Error finding container 68d4a074f3a7a8effb06bc2fb3e753adc3b00305da4a8dba9b2fb7c842c1af19: Status 404 returned error can't find the container with id 68d4a074f3a7a8effb06bc2fb3e753adc3b00305da4a8dba9b2fb7c842c1af19 Dec 06 06:00:57 crc kubenswrapper[4733]: I1206 06:00:57.349191 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-be38-account-create-update-gbx8w" event={"ID":"bf31d052-0c6b-4d46-8d72-8c7e247aad00","Type":"ContainerStarted","Data":"7cdc3bcc101844cd24bec020eb7bcf98040a1a309c78d2f519f95a295c78e7b3"} Dec 06 06:00:57 crc kubenswrapper[4733]: I1206 06:00:57.350456 4733 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/nova-cell0-db-create-csq6r" event={"ID":"ee2e9091-1bb1-4930-bff8-35c3cced4104","Type":"ContainerStarted","Data":"a0be51de4dc5aea5c9c6a316aee2e08c5a7c79d66c7485dd43b039a90c8fe07e"} Dec 06 06:00:57 crc kubenswrapper[4733]: I1206 06:00:57.351832 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-0451-account-create-update-kldhp" event={"ID":"44d743aa-a7d8-4b31-a1f1-35ad3b97b8e8","Type":"ContainerStarted","Data":"68d4a074f3a7a8effb06bc2fb3e753adc3b00305da4a8dba9b2fb7c842c1af19"} Dec 06 06:00:57 crc kubenswrapper[4733]: I1206 06:00:57.353258 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-pz8w7" event={"ID":"328eb573-b5aa-494f-bc1d-588c43edcb9f","Type":"ContainerStarted","Data":"7d105fb40273c51290de37d9c39ef018dedc6bb34abbbb1a6b76800e15ad13c1"} Dec 06 06:00:57 crc kubenswrapper[4733]: I1206 06:00:57.354114 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-hn5h8" event={"ID":"e651b5d7-8b07-42b1-9189-af0896ea5a16","Type":"ContainerStarted","Data":"88dbdaa725c3a8255b1570a626bb866e476d8792ce70bdb278e0195fe1d8b05b"} Dec 06 06:00:57 crc kubenswrapper[4733]: I1206 06:00:57.373684 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-pz8w7" podStartSLOduration=2.373673595 podStartE2EDuration="2.373673595s" podCreationTimestamp="2025-12-06 06:00:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:00:57.370092091 +0000 UTC m=+1041.235303202" watchObservedRunningTime="2025-12-06 06:00:57.373673595 +0000 UTC m=+1041.238884706" Dec 06 06:00:58 crc kubenswrapper[4733]: I1206 06:00:58.062852 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 06:00:58 crc kubenswrapper[4733]: I1206 06:00:58.063876 4733 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/ceilometer-0" podUID="f05eb385-58e5-4206-8fc3-8fda3efc9457" containerName="proxy-httpd" containerID="cri-o://46ed190b7bc2e0f92a5ccb2b4df2eae97ce6a1309a0e331435e267048a8f707a" gracePeriod=30 Dec 06 06:00:58 crc kubenswrapper[4733]: I1206 06:00:58.063928 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f05eb385-58e5-4206-8fc3-8fda3efc9457" containerName="sg-core" containerID="cri-o://ffdf986b200303f48c103b85d0edfb4471e19a21c47c66224b20f4cf9a8b1e5a" gracePeriod=30 Dec 06 06:00:58 crc kubenswrapper[4733]: I1206 06:00:58.063953 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f05eb385-58e5-4206-8fc3-8fda3efc9457" containerName="ceilometer-notification-agent" containerID="cri-o://f1aaee294be805a7b4e30901f5a5430bb00f5dd2f7f8a037ee9e3afd06855664" gracePeriod=30 Dec 06 06:00:58 crc kubenswrapper[4733]: I1206 06:00:58.063831 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f05eb385-58e5-4206-8fc3-8fda3efc9457" containerName="ceilometer-central-agent" containerID="cri-o://6469d6034c6630f0525ef587fb354a3b851c275f433fdd34f06256d07c43adc6" gracePeriod=30 Dec 06 06:00:58 crc kubenswrapper[4733]: I1206 06:00:58.365022 4733 generic.go:334] "Generic (PLEG): container finished" podID="f05eb385-58e5-4206-8fc3-8fda3efc9457" containerID="46ed190b7bc2e0f92a5ccb2b4df2eae97ce6a1309a0e331435e267048a8f707a" exitCode=0 Dec 06 06:00:58 crc kubenswrapper[4733]: I1206 06:00:58.365065 4733 generic.go:334] "Generic (PLEG): container finished" podID="f05eb385-58e5-4206-8fc3-8fda3efc9457" containerID="ffdf986b200303f48c103b85d0edfb4471e19a21c47c66224b20f4cf9a8b1e5a" exitCode=2 Dec 06 06:00:58 crc kubenswrapper[4733]: I1206 06:00:58.365119 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"f05eb385-58e5-4206-8fc3-8fda3efc9457","Type":"ContainerDied","Data":"46ed190b7bc2e0f92a5ccb2b4df2eae97ce6a1309a0e331435e267048a8f707a"} Dec 06 06:00:58 crc kubenswrapper[4733]: I1206 06:00:58.365280 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f05eb385-58e5-4206-8fc3-8fda3efc9457","Type":"ContainerDied","Data":"ffdf986b200303f48c103b85d0edfb4471e19a21c47c66224b20f4cf9a8b1e5a"} Dec 06 06:00:58 crc kubenswrapper[4733]: I1206 06:00:58.367036 4733 generic.go:334] "Generic (PLEG): container finished" podID="e651b5d7-8b07-42b1-9189-af0896ea5a16" containerID="936611e4791a5c143add94e144fc299803bc0585f65fd4a155c0cf5a56e13e78" exitCode=0 Dec 06 06:00:58 crc kubenswrapper[4733]: I1206 06:00:58.367134 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-hn5h8" event={"ID":"e651b5d7-8b07-42b1-9189-af0896ea5a16","Type":"ContainerDied","Data":"936611e4791a5c143add94e144fc299803bc0585f65fd4a155c0cf5a56e13e78"} Dec 06 06:00:58 crc kubenswrapper[4733]: I1206 06:00:58.368293 4733 generic.go:334] "Generic (PLEG): container finished" podID="b88b7105-3d6e-44a7-965a-1fa56297537b" containerID="499e589efa01ec0ad5894246125f238587c9654498a502d3346649c989f42fb5" exitCode=0 Dec 06 06:00:58 crc kubenswrapper[4733]: I1206 06:00:58.368391 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-7733-account-create-update-tvk6f" event={"ID":"b88b7105-3d6e-44a7-965a-1fa56297537b","Type":"ContainerDied","Data":"499e589efa01ec0ad5894246125f238587c9654498a502d3346649c989f42fb5"} Dec 06 06:00:58 crc kubenswrapper[4733]: I1206 06:00:58.369840 4733 generic.go:334] "Generic (PLEG): container finished" podID="bf31d052-0c6b-4d46-8d72-8c7e247aad00" containerID="ec5863bb23282b5b1623a103e6b2063fa3ecdb799c7f528c47cef4548d78a72b" exitCode=0 Dec 06 06:00:58 crc kubenswrapper[4733]: I1206 06:00:58.369943 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-be38-account-create-update-gbx8w" event={"ID":"bf31d052-0c6b-4d46-8d72-8c7e247aad00","Type":"ContainerDied","Data":"ec5863bb23282b5b1623a103e6b2063fa3ecdb799c7f528c47cef4548d78a72b"} Dec 06 06:00:58 crc kubenswrapper[4733]: I1206 06:00:58.371216 4733 generic.go:334] "Generic (PLEG): container finished" podID="ee2e9091-1bb1-4930-bff8-35c3cced4104" containerID="55126a0002463037c30d301c1e006f0c3f307c99c7c328c4d063526c14287153" exitCode=0 Dec 06 06:00:58 crc kubenswrapper[4733]: I1206 06:00:58.371276 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-csq6r" event={"ID":"ee2e9091-1bb1-4930-bff8-35c3cced4104","Type":"ContainerDied","Data":"55126a0002463037c30d301c1e006f0c3f307c99c7c328c4d063526c14287153"} Dec 06 06:00:58 crc kubenswrapper[4733]: I1206 06:00:58.372559 4733 generic.go:334] "Generic (PLEG): container finished" podID="44d743aa-a7d8-4b31-a1f1-35ad3b97b8e8" containerID="1d6385669e87ea345500fce141ed8552836bf3b4451aeed09496748a392cb62d" exitCode=0 Dec 06 06:00:58 crc kubenswrapper[4733]: I1206 06:00:58.372615 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-0451-account-create-update-kldhp" event={"ID":"44d743aa-a7d8-4b31-a1f1-35ad3b97b8e8","Type":"ContainerDied","Data":"1d6385669e87ea345500fce141ed8552836bf3b4451aeed09496748a392cb62d"} Dec 06 06:00:58 crc kubenswrapper[4733]: I1206 06:00:58.373894 4733 generic.go:334] "Generic (PLEG): container finished" podID="328eb573-b5aa-494f-bc1d-588c43edcb9f" containerID="7d105fb40273c51290de37d9c39ef018dedc6bb34abbbb1a6b76800e15ad13c1" exitCode=0 Dec 06 06:00:58 crc kubenswrapper[4733]: I1206 06:00:58.373925 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-pz8w7" event={"ID":"328eb573-b5aa-494f-bc1d-588c43edcb9f","Type":"ContainerDied","Data":"7d105fb40273c51290de37d9c39ef018dedc6bb34abbbb1a6b76800e15ad13c1"} Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.128986 4733 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.176449 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5w26q\" (UniqueName: \"kubernetes.io/projected/f05eb385-58e5-4206-8fc3-8fda3efc9457-kube-api-access-5w26q\") pod \"f05eb385-58e5-4206-8fc3-8fda3efc9457\" (UID: \"f05eb385-58e5-4206-8fc3-8fda3efc9457\") " Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.176635 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f05eb385-58e5-4206-8fc3-8fda3efc9457-config-data\") pod \"f05eb385-58e5-4206-8fc3-8fda3efc9457\" (UID: \"f05eb385-58e5-4206-8fc3-8fda3efc9457\") " Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.177240 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f05eb385-58e5-4206-8fc3-8fda3efc9457-combined-ca-bundle\") pod \"f05eb385-58e5-4206-8fc3-8fda3efc9457\" (UID: \"f05eb385-58e5-4206-8fc3-8fda3efc9457\") " Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.183723 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f05eb385-58e5-4206-8fc3-8fda3efc9457-sg-core-conf-yaml\") pod \"f05eb385-58e5-4206-8fc3-8fda3efc9457\" (UID: \"f05eb385-58e5-4206-8fc3-8fda3efc9457\") " Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.184014 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f05eb385-58e5-4206-8fc3-8fda3efc9457-run-httpd\") pod \"f05eb385-58e5-4206-8fc3-8fda3efc9457\" (UID: \"f05eb385-58e5-4206-8fc3-8fda3efc9457\") " Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.184172 4733 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f05eb385-58e5-4206-8fc3-8fda3efc9457-scripts\") pod \"f05eb385-58e5-4206-8fc3-8fda3efc9457\" (UID: \"f05eb385-58e5-4206-8fc3-8fda3efc9457\") " Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.184257 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f05eb385-58e5-4206-8fc3-8fda3efc9457-log-httpd\") pod \"f05eb385-58e5-4206-8fc3-8fda3efc9457\" (UID: \"f05eb385-58e5-4206-8fc3-8fda3efc9457\") " Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.185051 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f05eb385-58e5-4206-8fc3-8fda3efc9457-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f05eb385-58e5-4206-8fc3-8fda3efc9457" (UID: "f05eb385-58e5-4206-8fc3-8fda3efc9457"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.185954 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f05eb385-58e5-4206-8fc3-8fda3efc9457-kube-api-access-5w26q" (OuterVolumeSpecName: "kube-api-access-5w26q") pod "f05eb385-58e5-4206-8fc3-8fda3efc9457" (UID: "f05eb385-58e5-4206-8fc3-8fda3efc9457"). InnerVolumeSpecName "kube-api-access-5w26q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.186205 4733 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f05eb385-58e5-4206-8fc3-8fda3efc9457-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.186318 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5w26q\" (UniqueName: \"kubernetes.io/projected/f05eb385-58e5-4206-8fc3-8fda3efc9457-kube-api-access-5w26q\") on node \"crc\" DevicePath \"\"" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.186240 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f05eb385-58e5-4206-8fc3-8fda3efc9457-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f05eb385-58e5-4206-8fc3-8fda3efc9457" (UID: "f05eb385-58e5-4206-8fc3-8fda3efc9457"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.192926 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f05eb385-58e5-4206-8fc3-8fda3efc9457-scripts" (OuterVolumeSpecName: "scripts") pod "f05eb385-58e5-4206-8fc3-8fda3efc9457" (UID: "f05eb385-58e5-4206-8fc3-8fda3efc9457"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.224408 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f05eb385-58e5-4206-8fc3-8fda3efc9457-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f05eb385-58e5-4206-8fc3-8fda3efc9457" (UID: "f05eb385-58e5-4206-8fc3-8fda3efc9457"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.259227 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f05eb385-58e5-4206-8fc3-8fda3efc9457-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f05eb385-58e5-4206-8fc3-8fda3efc9457" (UID: "f05eb385-58e5-4206-8fc3-8fda3efc9457"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.269032 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f05eb385-58e5-4206-8fc3-8fda3efc9457-config-data" (OuterVolumeSpecName: "config-data") pod "f05eb385-58e5-4206-8fc3-8fda3efc9457" (UID: "f05eb385-58e5-4206-8fc3-8fda3efc9457"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.288037 4733 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f05eb385-58e5-4206-8fc3-8fda3efc9457-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.288067 4733 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f05eb385-58e5-4206-8fc3-8fda3efc9457-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.288080 4733 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f05eb385-58e5-4206-8fc3-8fda3efc9457-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.288090 4733 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f05eb385-58e5-4206-8fc3-8fda3efc9457-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 06:00:59 
crc kubenswrapper[4733]: I1206 06:00:59.288100 4733 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f05eb385-58e5-4206-8fc3-8fda3efc9457-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.389999 4733 generic.go:334] "Generic (PLEG): container finished" podID="f05eb385-58e5-4206-8fc3-8fda3efc9457" containerID="f1aaee294be805a7b4e30901f5a5430bb00f5dd2f7f8a037ee9e3afd06855664" exitCode=0 Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.390041 4733 generic.go:334] "Generic (PLEG): container finished" podID="f05eb385-58e5-4206-8fc3-8fda3efc9457" containerID="6469d6034c6630f0525ef587fb354a3b851c275f433fdd34f06256d07c43adc6" exitCode=0 Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.390099 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.390165 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f05eb385-58e5-4206-8fc3-8fda3efc9457","Type":"ContainerDied","Data":"f1aaee294be805a7b4e30901f5a5430bb00f5dd2f7f8a037ee9e3afd06855664"} Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.390225 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f05eb385-58e5-4206-8fc3-8fda3efc9457","Type":"ContainerDied","Data":"6469d6034c6630f0525ef587fb354a3b851c275f433fdd34f06256d07c43adc6"} Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.390238 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f05eb385-58e5-4206-8fc3-8fda3efc9457","Type":"ContainerDied","Data":"51c834e020227e4b314ac238c28f1c0207a01958c7df52056c90b4efeac28b6f"} Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.390260 4733 scope.go:117] "RemoveContainer" containerID="46ed190b7bc2e0f92a5ccb2b4df2eae97ce6a1309a0e331435e267048a8f707a" Dec 
06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.437029 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.442938 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.456403 4733 scope.go:117] "RemoveContainer" containerID="ffdf986b200303f48c103b85d0edfb4471e19a21c47c66224b20f4cf9a8b1e5a" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.476949 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 06 06:00:59 crc kubenswrapper[4733]: E1206 06:00:59.477454 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f05eb385-58e5-4206-8fc3-8fda3efc9457" containerName="sg-core" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.477474 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="f05eb385-58e5-4206-8fc3-8fda3efc9457" containerName="sg-core" Dec 06 06:00:59 crc kubenswrapper[4733]: E1206 06:00:59.477488 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f05eb385-58e5-4206-8fc3-8fda3efc9457" containerName="ceilometer-central-agent" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.477495 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="f05eb385-58e5-4206-8fc3-8fda3efc9457" containerName="ceilometer-central-agent" Dec 06 06:00:59 crc kubenswrapper[4733]: E1206 06:00:59.477520 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f05eb385-58e5-4206-8fc3-8fda3efc9457" containerName="proxy-httpd" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.477526 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="f05eb385-58e5-4206-8fc3-8fda3efc9457" containerName="proxy-httpd" Dec 06 06:00:59 crc kubenswrapper[4733]: E1206 06:00:59.477544 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f05eb385-58e5-4206-8fc3-8fda3efc9457" 
containerName="ceilometer-notification-agent" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.477551 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="f05eb385-58e5-4206-8fc3-8fda3efc9457" containerName="ceilometer-notification-agent" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.477759 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="f05eb385-58e5-4206-8fc3-8fda3efc9457" containerName="proxy-httpd" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.477774 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="f05eb385-58e5-4206-8fc3-8fda3efc9457" containerName="ceilometer-central-agent" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.477784 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="f05eb385-58e5-4206-8fc3-8fda3efc9457" containerName="ceilometer-notification-agent" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.477802 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="f05eb385-58e5-4206-8fc3-8fda3efc9457" containerName="sg-core" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.479369 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.483769 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.484542 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.499362 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.533569 4733 scope.go:117] "RemoveContainer" containerID="f1aaee294be805a7b4e30901f5a5430bb00f5dd2f7f8a037ee9e3afd06855664" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.555271 4733 scope.go:117] "RemoveContainer" containerID="6469d6034c6630f0525ef587fb354a3b851c275f433fdd34f06256d07c43adc6" Dec 06 06:00:59 crc kubenswrapper[4733]: E1206 06:00:59.568963 4733 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf05eb385_58e5_4206_8fc3_8fda3efc9457.slice/crio-51c834e020227e4b314ac238c28f1c0207a01958c7df52056c90b4efeac28b6f\": RecentStats: unable to find data in memory cache]" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.576444 4733 scope.go:117] "RemoveContainer" containerID="46ed190b7bc2e0f92a5ccb2b4df2eae97ce6a1309a0e331435e267048a8f707a" Dec 06 06:00:59 crc kubenswrapper[4733]: E1206 06:00:59.576904 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46ed190b7bc2e0f92a5ccb2b4df2eae97ce6a1309a0e331435e267048a8f707a\": container with ID starting with 46ed190b7bc2e0f92a5ccb2b4df2eae97ce6a1309a0e331435e267048a8f707a not found: ID does not exist" containerID="46ed190b7bc2e0f92a5ccb2b4df2eae97ce6a1309a0e331435e267048a8f707a" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 
06:00:59.576947 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46ed190b7bc2e0f92a5ccb2b4df2eae97ce6a1309a0e331435e267048a8f707a"} err="failed to get container status \"46ed190b7bc2e0f92a5ccb2b4df2eae97ce6a1309a0e331435e267048a8f707a\": rpc error: code = NotFound desc = could not find container \"46ed190b7bc2e0f92a5ccb2b4df2eae97ce6a1309a0e331435e267048a8f707a\": container with ID starting with 46ed190b7bc2e0f92a5ccb2b4df2eae97ce6a1309a0e331435e267048a8f707a not found: ID does not exist" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.576972 4733 scope.go:117] "RemoveContainer" containerID="ffdf986b200303f48c103b85d0edfb4471e19a21c47c66224b20f4cf9a8b1e5a" Dec 06 06:00:59 crc kubenswrapper[4733]: E1206 06:00:59.577366 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffdf986b200303f48c103b85d0edfb4471e19a21c47c66224b20f4cf9a8b1e5a\": container with ID starting with ffdf986b200303f48c103b85d0edfb4471e19a21c47c66224b20f4cf9a8b1e5a not found: ID does not exist" containerID="ffdf986b200303f48c103b85d0edfb4471e19a21c47c66224b20f4cf9a8b1e5a" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.577410 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffdf986b200303f48c103b85d0edfb4471e19a21c47c66224b20f4cf9a8b1e5a"} err="failed to get container status \"ffdf986b200303f48c103b85d0edfb4471e19a21c47c66224b20f4cf9a8b1e5a\": rpc error: code = NotFound desc = could not find container \"ffdf986b200303f48c103b85d0edfb4471e19a21c47c66224b20f4cf9a8b1e5a\": container with ID starting with ffdf986b200303f48c103b85d0edfb4471e19a21c47c66224b20f4cf9a8b1e5a not found: ID does not exist" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.577446 4733 scope.go:117] "RemoveContainer" containerID="f1aaee294be805a7b4e30901f5a5430bb00f5dd2f7f8a037ee9e3afd06855664" Dec 06 06:00:59 crc 
kubenswrapper[4733]: E1206 06:00:59.577882 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1aaee294be805a7b4e30901f5a5430bb00f5dd2f7f8a037ee9e3afd06855664\": container with ID starting with f1aaee294be805a7b4e30901f5a5430bb00f5dd2f7f8a037ee9e3afd06855664 not found: ID does not exist" containerID="f1aaee294be805a7b4e30901f5a5430bb00f5dd2f7f8a037ee9e3afd06855664" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.577910 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1aaee294be805a7b4e30901f5a5430bb00f5dd2f7f8a037ee9e3afd06855664"} err="failed to get container status \"f1aaee294be805a7b4e30901f5a5430bb00f5dd2f7f8a037ee9e3afd06855664\": rpc error: code = NotFound desc = could not find container \"f1aaee294be805a7b4e30901f5a5430bb00f5dd2f7f8a037ee9e3afd06855664\": container with ID starting with f1aaee294be805a7b4e30901f5a5430bb00f5dd2f7f8a037ee9e3afd06855664 not found: ID does not exist" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.577932 4733 scope.go:117] "RemoveContainer" containerID="6469d6034c6630f0525ef587fb354a3b851c275f433fdd34f06256d07c43adc6" Dec 06 06:00:59 crc kubenswrapper[4733]: E1206 06:00:59.578242 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6469d6034c6630f0525ef587fb354a3b851c275f433fdd34f06256d07c43adc6\": container with ID starting with 6469d6034c6630f0525ef587fb354a3b851c275f433fdd34f06256d07c43adc6 not found: ID does not exist" containerID="6469d6034c6630f0525ef587fb354a3b851c275f433fdd34f06256d07c43adc6" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.578272 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6469d6034c6630f0525ef587fb354a3b851c275f433fdd34f06256d07c43adc6"} err="failed to get container status 
\"6469d6034c6630f0525ef587fb354a3b851c275f433fdd34f06256d07c43adc6\": rpc error: code = NotFound desc = could not find container \"6469d6034c6630f0525ef587fb354a3b851c275f433fdd34f06256d07c43adc6\": container with ID starting with 6469d6034c6630f0525ef587fb354a3b851c275f433fdd34f06256d07c43adc6 not found: ID does not exist" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.578289 4733 scope.go:117] "RemoveContainer" containerID="46ed190b7bc2e0f92a5ccb2b4df2eae97ce6a1309a0e331435e267048a8f707a" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.579406 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46ed190b7bc2e0f92a5ccb2b4df2eae97ce6a1309a0e331435e267048a8f707a"} err="failed to get container status \"46ed190b7bc2e0f92a5ccb2b4df2eae97ce6a1309a0e331435e267048a8f707a\": rpc error: code = NotFound desc = could not find container \"46ed190b7bc2e0f92a5ccb2b4df2eae97ce6a1309a0e331435e267048a8f707a\": container with ID starting with 46ed190b7bc2e0f92a5ccb2b4df2eae97ce6a1309a0e331435e267048a8f707a not found: ID does not exist" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.579425 4733 scope.go:117] "RemoveContainer" containerID="ffdf986b200303f48c103b85d0edfb4471e19a21c47c66224b20f4cf9a8b1e5a" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.580486 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffdf986b200303f48c103b85d0edfb4471e19a21c47c66224b20f4cf9a8b1e5a"} err="failed to get container status \"ffdf986b200303f48c103b85d0edfb4471e19a21c47c66224b20f4cf9a8b1e5a\": rpc error: code = NotFound desc = could not find container \"ffdf986b200303f48c103b85d0edfb4471e19a21c47c66224b20f4cf9a8b1e5a\": container with ID starting with ffdf986b200303f48c103b85d0edfb4471e19a21c47c66224b20f4cf9a8b1e5a not found: ID does not exist" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.580528 4733 scope.go:117] "RemoveContainer" 
containerID="f1aaee294be805a7b4e30901f5a5430bb00f5dd2f7f8a037ee9e3afd06855664" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.580763 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1aaee294be805a7b4e30901f5a5430bb00f5dd2f7f8a037ee9e3afd06855664"} err="failed to get container status \"f1aaee294be805a7b4e30901f5a5430bb00f5dd2f7f8a037ee9e3afd06855664\": rpc error: code = NotFound desc = could not find container \"f1aaee294be805a7b4e30901f5a5430bb00f5dd2f7f8a037ee9e3afd06855664\": container with ID starting with f1aaee294be805a7b4e30901f5a5430bb00f5dd2f7f8a037ee9e3afd06855664 not found: ID does not exist" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.580781 4733 scope.go:117] "RemoveContainer" containerID="6469d6034c6630f0525ef587fb354a3b851c275f433fdd34f06256d07c43adc6" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.580952 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6469d6034c6630f0525ef587fb354a3b851c275f433fdd34f06256d07c43adc6"} err="failed to get container status \"6469d6034c6630f0525ef587fb354a3b851c275f433fdd34f06256d07c43adc6\": rpc error: code = NotFound desc = could not find container \"6469d6034c6630f0525ef587fb354a3b851c275f433fdd34f06256d07c43adc6\": container with ID starting with 6469d6034c6630f0525ef587fb354a3b851c275f433fdd34f06256d07c43adc6 not found: ID does not exist" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.597553 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rgvk\" (UniqueName: \"kubernetes.io/projected/487a854c-dc9b-430b-b00f-ff16ec35d999-kube-api-access-8rgvk\") pod \"ceilometer-0\" (UID: \"487a854c-dc9b-430b-b00f-ff16ec35d999\") " pod="openstack/ceilometer-0" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.598290 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/487a854c-dc9b-430b-b00f-ff16ec35d999-config-data\") pod \"ceilometer-0\" (UID: \"487a854c-dc9b-430b-b00f-ff16ec35d999\") " pod="openstack/ceilometer-0" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.598487 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/487a854c-dc9b-430b-b00f-ff16ec35d999-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"487a854c-dc9b-430b-b00f-ff16ec35d999\") " pod="openstack/ceilometer-0" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.598560 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/487a854c-dc9b-430b-b00f-ff16ec35d999-scripts\") pod \"ceilometer-0\" (UID: \"487a854c-dc9b-430b-b00f-ff16ec35d999\") " pod="openstack/ceilometer-0" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.599013 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/487a854c-dc9b-430b-b00f-ff16ec35d999-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"487a854c-dc9b-430b-b00f-ff16ec35d999\") " pod="openstack/ceilometer-0" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.599123 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/487a854c-dc9b-430b-b00f-ff16ec35d999-log-httpd\") pod \"ceilometer-0\" (UID: \"487a854c-dc9b-430b-b00f-ff16ec35d999\") " pod="openstack/ceilometer-0" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.599163 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/487a854c-dc9b-430b-b00f-ff16ec35d999-run-httpd\") pod \"ceilometer-0\" (UID: 
\"487a854c-dc9b-430b-b00f-ff16ec35d999\") " pod="openstack/ceilometer-0" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.702157 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/487a854c-dc9b-430b-b00f-ff16ec35d999-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"487a854c-dc9b-430b-b00f-ff16ec35d999\") " pod="openstack/ceilometer-0" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.702565 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/487a854c-dc9b-430b-b00f-ff16ec35d999-scripts\") pod \"ceilometer-0\" (UID: \"487a854c-dc9b-430b-b00f-ff16ec35d999\") " pod="openstack/ceilometer-0" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.702717 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/487a854c-dc9b-430b-b00f-ff16ec35d999-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"487a854c-dc9b-430b-b00f-ff16ec35d999\") " pod="openstack/ceilometer-0" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.702784 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/487a854c-dc9b-430b-b00f-ff16ec35d999-log-httpd\") pod \"ceilometer-0\" (UID: \"487a854c-dc9b-430b-b00f-ff16ec35d999\") " pod="openstack/ceilometer-0" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.702823 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/487a854c-dc9b-430b-b00f-ff16ec35d999-run-httpd\") pod \"ceilometer-0\" (UID: \"487a854c-dc9b-430b-b00f-ff16ec35d999\") " pod="openstack/ceilometer-0" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.702927 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rgvk\" 
(UniqueName: \"kubernetes.io/projected/487a854c-dc9b-430b-b00f-ff16ec35d999-kube-api-access-8rgvk\") pod \"ceilometer-0\" (UID: \"487a854c-dc9b-430b-b00f-ff16ec35d999\") " pod="openstack/ceilometer-0" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.703105 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/487a854c-dc9b-430b-b00f-ff16ec35d999-config-data\") pod \"ceilometer-0\" (UID: \"487a854c-dc9b-430b-b00f-ff16ec35d999\") " pod="openstack/ceilometer-0" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.703635 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/487a854c-dc9b-430b-b00f-ff16ec35d999-run-httpd\") pod \"ceilometer-0\" (UID: \"487a854c-dc9b-430b-b00f-ff16ec35d999\") " pod="openstack/ceilometer-0" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.703969 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/487a854c-dc9b-430b-b00f-ff16ec35d999-log-httpd\") pod \"ceilometer-0\" (UID: \"487a854c-dc9b-430b-b00f-ff16ec35d999\") " pod="openstack/ceilometer-0" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.714062 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/487a854c-dc9b-430b-b00f-ff16ec35d999-scripts\") pod \"ceilometer-0\" (UID: \"487a854c-dc9b-430b-b00f-ff16ec35d999\") " pod="openstack/ceilometer-0" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.714631 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/487a854c-dc9b-430b-b00f-ff16ec35d999-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"487a854c-dc9b-430b-b00f-ff16ec35d999\") " pod="openstack/ceilometer-0" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.715240 4733 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/487a854c-dc9b-430b-b00f-ff16ec35d999-config-data\") pod \"ceilometer-0\" (UID: \"487a854c-dc9b-430b-b00f-ff16ec35d999\") " pod="openstack/ceilometer-0" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.716587 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/487a854c-dc9b-430b-b00f-ff16ec35d999-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"487a854c-dc9b-430b-b00f-ff16ec35d999\") " pod="openstack/ceilometer-0" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.726820 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rgvk\" (UniqueName: \"kubernetes.io/projected/487a854c-dc9b-430b-b00f-ff16ec35d999-kube-api-access-8rgvk\") pod \"ceilometer-0\" (UID: \"487a854c-dc9b-430b-b00f-ff16ec35d999\") " pod="openstack/ceilometer-0" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.759229 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.759476 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="1dfba234-4e13-4f22-96cf-7f945f11d36e" containerName="glance-log" containerID="cri-o://4a0f64d3ec2bba190a1052de1a5091d00a0730e065efe11899f16fb2b395d1e3" gracePeriod=30 Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.759632 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="1dfba234-4e13-4f22-96cf-7f945f11d36e" containerName="glance-httpd" containerID="cri-o://3f7a463e72992757246426f5aefa2b7fc372f23d7026624a258d101eb90648e2" gracePeriod=30 Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.794831 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-7733-account-create-update-tvk6f" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.811918 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.909909 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggmzj\" (UniqueName: \"kubernetes.io/projected/b88b7105-3d6e-44a7-965a-1fa56297537b-kube-api-access-ggmzj\") pod \"b88b7105-3d6e-44a7-965a-1fa56297537b\" (UID: \"b88b7105-3d6e-44a7-965a-1fa56297537b\") " Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.910892 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b88b7105-3d6e-44a7-965a-1fa56297537b-operator-scripts\") pod \"b88b7105-3d6e-44a7-965a-1fa56297537b\" (UID: \"b88b7105-3d6e-44a7-965a-1fa56297537b\") " Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.914358 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b88b7105-3d6e-44a7-965a-1fa56297537b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b88b7105-3d6e-44a7-965a-1fa56297537b" (UID: "b88b7105-3d6e-44a7-965a-1fa56297537b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.920848 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b88b7105-3d6e-44a7-965a-1fa56297537b-kube-api-access-ggmzj" (OuterVolumeSpecName: "kube-api-access-ggmzj") pod "b88b7105-3d6e-44a7-965a-1fa56297537b" (UID: "b88b7105-3d6e-44a7-965a-1fa56297537b"). InnerVolumeSpecName "kube-api-access-ggmzj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.967518 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-0451-account-create-update-kldhp" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.982162 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-pz8w7" Dec 06 06:00:59 crc kubenswrapper[4733]: I1206 06:00:59.996131 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-hn5h8" Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.011595 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-be38-account-create-update-gbx8w" Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.013278 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44d743aa-a7d8-4b31-a1f1-35ad3b97b8e8-operator-scripts\") pod \"44d743aa-a7d8-4b31-a1f1-35ad3b97b8e8\" (UID: \"44d743aa-a7d8-4b31-a1f1-35ad3b97b8e8\") " Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.013416 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ccwqc\" (UniqueName: \"kubernetes.io/projected/44d743aa-a7d8-4b31-a1f1-35ad3b97b8e8-kube-api-access-ccwqc\") pod \"44d743aa-a7d8-4b31-a1f1-35ad3b97b8e8\" (UID: \"44d743aa-a7d8-4b31-a1f1-35ad3b97b8e8\") " Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.013847 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44d743aa-a7d8-4b31-a1f1-35ad3b97b8e8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "44d743aa-a7d8-4b31-a1f1-35ad3b97b8e8" (UID: "44d743aa-a7d8-4b31-a1f1-35ad3b97b8e8"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.014171 4733 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b88b7105-3d6e-44a7-965a-1fa56297537b-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.014187 4733 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44d743aa-a7d8-4b31-a1f1-35ad3b97b8e8-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.014198 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ggmzj\" (UniqueName: \"kubernetes.io/projected/b88b7105-3d6e-44a7-965a-1fa56297537b-kube-api-access-ggmzj\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.020029 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44d743aa-a7d8-4b31-a1f1-35ad3b97b8e8-kube-api-access-ccwqc" (OuterVolumeSpecName: "kube-api-access-ccwqc") pod "44d743aa-a7d8-4b31-a1f1-35ad3b97b8e8" (UID: "44d743aa-a7d8-4b31-a1f1-35ad3b97b8e8"). InnerVolumeSpecName "kube-api-access-ccwqc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.024110 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-csq6r" Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.116637 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/328eb573-b5aa-494f-bc1d-588c43edcb9f-operator-scripts\") pod \"328eb573-b5aa-494f-bc1d-588c43edcb9f\" (UID: \"328eb573-b5aa-494f-bc1d-588c43edcb9f\") " Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.117201 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e651b5d7-8b07-42b1-9189-af0896ea5a16-operator-scripts\") pod \"e651b5d7-8b07-42b1-9189-af0896ea5a16\" (UID: \"e651b5d7-8b07-42b1-9189-af0896ea5a16\") " Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.117278 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee2e9091-1bb1-4930-bff8-35c3cced4104-operator-scripts\") pod \"ee2e9091-1bb1-4930-bff8-35c3cced4104\" (UID: \"ee2e9091-1bb1-4930-bff8-35c3cced4104\") " Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.117417 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf31d052-0c6b-4d46-8d72-8c7e247aad00-operator-scripts\") pod \"bf31d052-0c6b-4d46-8d72-8c7e247aad00\" (UID: \"bf31d052-0c6b-4d46-8d72-8c7e247aad00\") " Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.117779 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlsv5\" (UniqueName: \"kubernetes.io/projected/bf31d052-0c6b-4d46-8d72-8c7e247aad00-kube-api-access-hlsv5\") pod \"bf31d052-0c6b-4d46-8d72-8c7e247aad00\" (UID: \"bf31d052-0c6b-4d46-8d72-8c7e247aad00\") " Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.117814 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-xdqtl\" (UniqueName: \"kubernetes.io/projected/328eb573-b5aa-494f-bc1d-588c43edcb9f-kube-api-access-xdqtl\") pod \"328eb573-b5aa-494f-bc1d-588c43edcb9f\" (UID: \"328eb573-b5aa-494f-bc1d-588c43edcb9f\") " Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.117883 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29sx5\" (UniqueName: \"kubernetes.io/projected/ee2e9091-1bb1-4930-bff8-35c3cced4104-kube-api-access-29sx5\") pod \"ee2e9091-1bb1-4930-bff8-35c3cced4104\" (UID: \"ee2e9091-1bb1-4930-bff8-35c3cced4104\") " Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.117903 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92wjj\" (UniqueName: \"kubernetes.io/projected/e651b5d7-8b07-42b1-9189-af0896ea5a16-kube-api-access-92wjj\") pod \"e651b5d7-8b07-42b1-9189-af0896ea5a16\" (UID: \"e651b5d7-8b07-42b1-9189-af0896ea5a16\") " Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.118244 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee2e9091-1bb1-4930-bff8-35c3cced4104-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ee2e9091-1bb1-4930-bff8-35c3cced4104" (UID: "ee2e9091-1bb1-4930-bff8-35c3cced4104"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.118277 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf31d052-0c6b-4d46-8d72-8c7e247aad00-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bf31d052-0c6b-4d46-8d72-8c7e247aad00" (UID: "bf31d052-0c6b-4d46-8d72-8c7e247aad00"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.118287 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e651b5d7-8b07-42b1-9189-af0896ea5a16-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e651b5d7-8b07-42b1-9189-af0896ea5a16" (UID: "e651b5d7-8b07-42b1-9189-af0896ea5a16"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.118870 4733 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e651b5d7-8b07-42b1-9189-af0896ea5a16-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.118894 4733 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee2e9091-1bb1-4930-bff8-35c3cced4104-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.118906 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ccwqc\" (UniqueName: \"kubernetes.io/projected/44d743aa-a7d8-4b31-a1f1-35ad3b97b8e8-kube-api-access-ccwqc\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.118917 4733 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf31d052-0c6b-4d46-8d72-8c7e247aad00-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.120777 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/328eb573-b5aa-494f-bc1d-588c43edcb9f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "328eb573-b5aa-494f-bc1d-588c43edcb9f" (UID: "328eb573-b5aa-494f-bc1d-588c43edcb9f"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.123350 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf31d052-0c6b-4d46-8d72-8c7e247aad00-kube-api-access-hlsv5" (OuterVolumeSpecName: "kube-api-access-hlsv5") pod "bf31d052-0c6b-4d46-8d72-8c7e247aad00" (UID: "bf31d052-0c6b-4d46-8d72-8c7e247aad00"). InnerVolumeSpecName "kube-api-access-hlsv5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.128441 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/328eb573-b5aa-494f-bc1d-588c43edcb9f-kube-api-access-xdqtl" (OuterVolumeSpecName: "kube-api-access-xdqtl") pod "328eb573-b5aa-494f-bc1d-588c43edcb9f" (UID: "328eb573-b5aa-494f-bc1d-588c43edcb9f"). InnerVolumeSpecName "kube-api-access-xdqtl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.132133 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e651b5d7-8b07-42b1-9189-af0896ea5a16-kube-api-access-92wjj" (OuterVolumeSpecName: "kube-api-access-92wjj") pod "e651b5d7-8b07-42b1-9189-af0896ea5a16" (UID: "e651b5d7-8b07-42b1-9189-af0896ea5a16"). InnerVolumeSpecName "kube-api-access-92wjj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.138600 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee2e9091-1bb1-4930-bff8-35c3cced4104-kube-api-access-29sx5" (OuterVolumeSpecName: "kube-api-access-29sx5") pod "ee2e9091-1bb1-4930-bff8-35c3cced4104" (UID: "ee2e9091-1bb1-4930-bff8-35c3cced4104"). InnerVolumeSpecName "kube-api-access-29sx5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.147349 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29416681-vtzbw"] Dec 06 06:01:00 crc kubenswrapper[4733]: E1206 06:01:00.147800 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee2e9091-1bb1-4930-bff8-35c3cced4104" containerName="mariadb-database-create" Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.147819 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee2e9091-1bb1-4930-bff8-35c3cced4104" containerName="mariadb-database-create" Dec 06 06:01:00 crc kubenswrapper[4733]: E1206 06:01:00.147839 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b88b7105-3d6e-44a7-965a-1fa56297537b" containerName="mariadb-account-create-update" Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.147846 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="b88b7105-3d6e-44a7-965a-1fa56297537b" containerName="mariadb-account-create-update" Dec 06 06:01:00 crc kubenswrapper[4733]: E1206 06:01:00.147856 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44d743aa-a7d8-4b31-a1f1-35ad3b97b8e8" containerName="mariadb-account-create-update" Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.147862 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="44d743aa-a7d8-4b31-a1f1-35ad3b97b8e8" containerName="mariadb-account-create-update" Dec 06 06:01:00 crc kubenswrapper[4733]: E1206 06:01:00.147879 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e651b5d7-8b07-42b1-9189-af0896ea5a16" containerName="mariadb-database-create" Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.147884 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="e651b5d7-8b07-42b1-9189-af0896ea5a16" containerName="mariadb-database-create" Dec 06 06:01:00 crc kubenswrapper[4733]: E1206 06:01:00.147897 4733 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="bf31d052-0c6b-4d46-8d72-8c7e247aad00" containerName="mariadb-account-create-update" Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.147904 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf31d052-0c6b-4d46-8d72-8c7e247aad00" containerName="mariadb-account-create-update" Dec 06 06:01:00 crc kubenswrapper[4733]: E1206 06:01:00.147922 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="328eb573-b5aa-494f-bc1d-588c43edcb9f" containerName="mariadb-database-create" Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.147928 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="328eb573-b5aa-494f-bc1d-588c43edcb9f" containerName="mariadb-database-create" Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.148103 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="328eb573-b5aa-494f-bc1d-588c43edcb9f" containerName="mariadb-database-create" Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.148112 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee2e9091-1bb1-4930-bff8-35c3cced4104" containerName="mariadb-database-create" Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.148124 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf31d052-0c6b-4d46-8d72-8c7e247aad00" containerName="mariadb-account-create-update" Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.148137 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="e651b5d7-8b07-42b1-9189-af0896ea5a16" containerName="mariadb-database-create" Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.148144 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="b88b7105-3d6e-44a7-965a-1fa56297537b" containerName="mariadb-account-create-update" Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.148154 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="44d743aa-a7d8-4b31-a1f1-35ad3b97b8e8" 
containerName="mariadb-account-create-update" Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.148853 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29416681-vtzbw" Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.163743 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29416681-vtzbw"] Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.221515 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/56c9830c-0996-408f-bb43-8d6e2d0eaa2a-fernet-keys\") pod \"keystone-cron-29416681-vtzbw\" (UID: \"56c9830c-0996-408f-bb43-8d6e2d0eaa2a\") " pod="openstack/keystone-cron-29416681-vtzbw" Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.221584 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56c9830c-0996-408f-bb43-8d6e2d0eaa2a-combined-ca-bundle\") pod \"keystone-cron-29416681-vtzbw\" (UID: \"56c9830c-0996-408f-bb43-8d6e2d0eaa2a\") " pod="openstack/keystone-cron-29416681-vtzbw" Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.221757 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56c9830c-0996-408f-bb43-8d6e2d0eaa2a-config-data\") pod \"keystone-cron-29416681-vtzbw\" (UID: \"56c9830c-0996-408f-bb43-8d6e2d0eaa2a\") " pod="openstack/keystone-cron-29416681-vtzbw" Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.221969 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sb4sf\" (UniqueName: \"kubernetes.io/projected/56c9830c-0996-408f-bb43-8d6e2d0eaa2a-kube-api-access-sb4sf\") pod \"keystone-cron-29416681-vtzbw\" (UID: \"56c9830c-0996-408f-bb43-8d6e2d0eaa2a\") " 
pod="openstack/keystone-cron-29416681-vtzbw" Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.222489 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hlsv5\" (UniqueName: \"kubernetes.io/projected/bf31d052-0c6b-4d46-8d72-8c7e247aad00-kube-api-access-hlsv5\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.222512 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdqtl\" (UniqueName: \"kubernetes.io/projected/328eb573-b5aa-494f-bc1d-588c43edcb9f-kube-api-access-xdqtl\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.222523 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29sx5\" (UniqueName: \"kubernetes.io/projected/ee2e9091-1bb1-4930-bff8-35c3cced4104-kube-api-access-29sx5\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.222535 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92wjj\" (UniqueName: \"kubernetes.io/projected/e651b5d7-8b07-42b1-9189-af0896ea5a16-kube-api-access-92wjj\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.222549 4733 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/328eb573-b5aa-494f-bc1d-588c43edcb9f-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.325128 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/56c9830c-0996-408f-bb43-8d6e2d0eaa2a-fernet-keys\") pod \"keystone-cron-29416681-vtzbw\" (UID: \"56c9830c-0996-408f-bb43-8d6e2d0eaa2a\") " pod="openstack/keystone-cron-29416681-vtzbw" Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.325201 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/56c9830c-0996-408f-bb43-8d6e2d0eaa2a-combined-ca-bundle\") pod \"keystone-cron-29416681-vtzbw\" (UID: \"56c9830c-0996-408f-bb43-8d6e2d0eaa2a\") " pod="openstack/keystone-cron-29416681-vtzbw" Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.325402 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56c9830c-0996-408f-bb43-8d6e2d0eaa2a-config-data\") pod \"keystone-cron-29416681-vtzbw\" (UID: \"56c9830c-0996-408f-bb43-8d6e2d0eaa2a\") " pod="openstack/keystone-cron-29416681-vtzbw" Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.325498 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sb4sf\" (UniqueName: \"kubernetes.io/projected/56c9830c-0996-408f-bb43-8d6e2d0eaa2a-kube-api-access-sb4sf\") pod \"keystone-cron-29416681-vtzbw\" (UID: \"56c9830c-0996-408f-bb43-8d6e2d0eaa2a\") " pod="openstack/keystone-cron-29416681-vtzbw" Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.331111 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56c9830c-0996-408f-bb43-8d6e2d0eaa2a-config-data\") pod \"keystone-cron-29416681-vtzbw\" (UID: \"56c9830c-0996-408f-bb43-8d6e2d0eaa2a\") " pod="openstack/keystone-cron-29416681-vtzbw" Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.332589 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/56c9830c-0996-408f-bb43-8d6e2d0eaa2a-fernet-keys\") pod \"keystone-cron-29416681-vtzbw\" (UID: \"56c9830c-0996-408f-bb43-8d6e2d0eaa2a\") " pod="openstack/keystone-cron-29416681-vtzbw" Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.333750 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/56c9830c-0996-408f-bb43-8d6e2d0eaa2a-combined-ca-bundle\") pod \"keystone-cron-29416681-vtzbw\" (UID: \"56c9830c-0996-408f-bb43-8d6e2d0eaa2a\") " pod="openstack/keystone-cron-29416681-vtzbw" Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.338097 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.343330 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sb4sf\" (UniqueName: \"kubernetes.io/projected/56c9830c-0996-408f-bb43-8d6e2d0eaa2a-kube-api-access-sb4sf\") pod \"keystone-cron-29416681-vtzbw\" (UID: \"56c9830c-0996-408f-bb43-8d6e2d0eaa2a\") " pod="openstack/keystone-cron-29416681-vtzbw" Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.399326 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"487a854c-dc9b-430b-b00f-ff16ec35d999","Type":"ContainerStarted","Data":"f82374468787ca5715533c5258e782cfc98c75a63c22b3e6e6281cffb2d5ce82"} Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.401645 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-csq6r" event={"ID":"ee2e9091-1bb1-4930-bff8-35c3cced4104","Type":"ContainerDied","Data":"a0be51de4dc5aea5c9c6a316aee2e08c5a7c79d66c7485dd43b039a90c8fe07e"} Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.401682 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-csq6r" Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.401691 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0be51de4dc5aea5c9c6a316aee2e08c5a7c79d66c7485dd43b039a90c8fe07e" Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.403727 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-pz8w7" Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.403668 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-pz8w7" event={"ID":"328eb573-b5aa-494f-bc1d-588c43edcb9f","Type":"ContainerDied","Data":"6b510090b7625e14e853cec12589802540367d91165833d308958cbe33274ac6"} Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.403974 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b510090b7625e14e853cec12589802540367d91165833d308958cbe33274ac6" Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.405409 4733 generic.go:334] "Generic (PLEG): container finished" podID="1dfba234-4e13-4f22-96cf-7f945f11d36e" containerID="4a0f64d3ec2bba190a1052de1a5091d00a0730e065efe11899f16fb2b395d1e3" exitCode=143 Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.405446 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1dfba234-4e13-4f22-96cf-7f945f11d36e","Type":"ContainerDied","Data":"4a0f64d3ec2bba190a1052de1a5091d00a0730e065efe11899f16fb2b395d1e3"} Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.407627 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-hn5h8" Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.407627 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-hn5h8" event={"ID":"e651b5d7-8b07-42b1-9189-af0896ea5a16","Type":"ContainerDied","Data":"88dbdaa725c3a8255b1570a626bb866e476d8792ce70bdb278e0195fe1d8b05b"} Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.407841 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88dbdaa725c3a8255b1570a626bb866e476d8792ce70bdb278e0195fe1d8b05b" Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.409406 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-7733-account-create-update-tvk6f" event={"ID":"b88b7105-3d6e-44a7-965a-1fa56297537b","Type":"ContainerDied","Data":"dcd15cea8da3c196bfc72f87bd7043803c1f7704d37e909dd3868e94296f9b98"} Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.409455 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dcd15cea8da3c196bfc72f87bd7043803c1f7704d37e909dd3868e94296f9b98" Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.409547 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-7733-account-create-update-tvk6f" Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.410788 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-be38-account-create-update-gbx8w" Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.410780 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-be38-account-create-update-gbx8w" event={"ID":"bf31d052-0c6b-4d46-8d72-8c7e247aad00","Type":"ContainerDied","Data":"7cdc3bcc101844cd24bec020eb7bcf98040a1a309c78d2f519f95a295c78e7b3"} Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.411150 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7cdc3bcc101844cd24bec020eb7bcf98040a1a309c78d2f519f95a295c78e7b3" Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.411831 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-0451-account-create-update-kldhp" event={"ID":"44d743aa-a7d8-4b31-a1f1-35ad3b97b8e8","Type":"ContainerDied","Data":"68d4a074f3a7a8effb06bc2fb3e753adc3b00305da4a8dba9b2fb7c842c1af19"} Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.411860 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68d4a074f3a7a8effb06bc2fb3e753adc3b00305da4a8dba9b2fb7c842c1af19" Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.411883 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-0451-account-create-update-kldhp" Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.470807 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29416681-vtzbw" Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.507836 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f05eb385-58e5-4206-8fc3-8fda3efc9457" path="/var/lib/kubelet/pods/f05eb385-58e5-4206-8fc3-8fda3efc9457/volumes" Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.621226 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.833750 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.834029 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="a0063321-2625-4e9d-a536-38104f7d5879" containerName="glance-log" containerID="cri-o://495e4e31d32d088c287120beeba2fca7f8c1caa0e3ca134c8216be79a4714852" gracePeriod=30 Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.834542 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="a0063321-2625-4e9d-a536-38104f7d5879" containerName="glance-httpd" containerID="cri-o://584852523d5d205e570e24ef9ab94df6ea3bc964b7750689522895caac2f9eb1" gracePeriod=30 Dec 06 06:01:00 crc kubenswrapper[4733]: W1206 06:01:00.917666 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod56c9830c_0996_408f_bb43_8d6e2d0eaa2a.slice/crio-87849f1769634c17dccbded406429e1985560d521c6bd9ed654dcf370994dd16 WatchSource:0}: Error finding container 87849f1769634c17dccbded406429e1985560d521c6bd9ed654dcf370994dd16: Status 404 returned error can't find the container with id 87849f1769634c17dccbded406429e1985560d521c6bd9ed654dcf370994dd16 Dec 06 06:01:00 crc kubenswrapper[4733]: I1206 06:01:00.926352 4733 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/keystone-cron-29416681-vtzbw"] Dec 06 06:01:01 crc kubenswrapper[4733]: I1206 06:01:01.423793 4733 generic.go:334] "Generic (PLEG): container finished" podID="a0063321-2625-4e9d-a536-38104f7d5879" containerID="495e4e31d32d088c287120beeba2fca7f8c1caa0e3ca134c8216be79a4714852" exitCode=143 Dec 06 06:01:01 crc kubenswrapper[4733]: I1206 06:01:01.423875 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a0063321-2625-4e9d-a536-38104f7d5879","Type":"ContainerDied","Data":"495e4e31d32d088c287120beeba2fca7f8c1caa0e3ca134c8216be79a4714852"} Dec 06 06:01:01 crc kubenswrapper[4733]: I1206 06:01:01.425729 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29416681-vtzbw" event={"ID":"56c9830c-0996-408f-bb43-8d6e2d0eaa2a","Type":"ContainerStarted","Data":"7407b840e38175bfab982409d42bcd949c3a7ddcdb0e84d72f45f00f0249dc38"} Dec 06 06:01:01 crc kubenswrapper[4733]: I1206 06:01:01.425832 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29416681-vtzbw" event={"ID":"56c9830c-0996-408f-bb43-8d6e2d0eaa2a","Type":"ContainerStarted","Data":"87849f1769634c17dccbded406429e1985560d521c6bd9ed654dcf370994dd16"} Dec 06 06:01:01 crc kubenswrapper[4733]: I1206 06:01:01.427190 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"487a854c-dc9b-430b-b00f-ff16ec35d999","Type":"ContainerStarted","Data":"f85149eb08adece19573229d1ab4f9ae859cd47d30d89fa7f4d446361d48fa00"} Dec 06 06:01:01 crc kubenswrapper[4733]: I1206 06:01:01.442747 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29416681-vtzbw" podStartSLOduration=1.442728458 podStartE2EDuration="1.442728458s" podCreationTimestamp="2025-12-06 06:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 
06:01:01.438931638 +0000 UTC m=+1045.304142749" watchObservedRunningTime="2025-12-06 06:01:01.442728458 +0000 UTC m=+1045.307939569" Dec 06 06:01:02 crc kubenswrapper[4733]: I1206 06:01:02.440915 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"487a854c-dc9b-430b-b00f-ff16ec35d999","Type":"ContainerStarted","Data":"baa7bc5dd7d95261eb369128ab23592953af6691f118cf3d522c4e366b1b4cf7"} Dec 06 06:01:03 crc kubenswrapper[4733]: I1206 06:01:03.416506 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 06 06:01:03 crc kubenswrapper[4733]: I1206 06:01:03.452409 4733 generic.go:334] "Generic (PLEG): container finished" podID="56c9830c-0996-408f-bb43-8d6e2d0eaa2a" containerID="7407b840e38175bfab982409d42bcd949c3a7ddcdb0e84d72f45f00f0249dc38" exitCode=0 Dec 06 06:01:03 crc kubenswrapper[4733]: I1206 06:01:03.452500 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29416681-vtzbw" event={"ID":"56c9830c-0996-408f-bb43-8d6e2d0eaa2a","Type":"ContainerDied","Data":"7407b840e38175bfab982409d42bcd949c3a7ddcdb0e84d72f45f00f0249dc38"} Dec 06 06:01:03 crc kubenswrapper[4733]: I1206 06:01:03.467902 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"487a854c-dc9b-430b-b00f-ff16ec35d999","Type":"ContainerStarted","Data":"57f2bba81913d27e2df95c5926c466ae30fc56bccc90dbcb7528c0a1219efea0"} Dec 06 06:01:03 crc kubenswrapper[4733]: I1206 06:01:03.470099 4733 generic.go:334] "Generic (PLEG): container finished" podID="1dfba234-4e13-4f22-96cf-7f945f11d36e" containerID="3f7a463e72992757246426f5aefa2b7fc372f23d7026624a258d101eb90648e2" exitCode=0 Dec 06 06:01:03 crc kubenswrapper[4733]: I1206 06:01:03.470136 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"1dfba234-4e13-4f22-96cf-7f945f11d36e","Type":"ContainerDied","Data":"3f7a463e72992757246426f5aefa2b7fc372f23d7026624a258d101eb90648e2"} Dec 06 06:01:03 crc kubenswrapper[4733]: I1206 06:01:03.470159 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1dfba234-4e13-4f22-96cf-7f945f11d36e","Type":"ContainerDied","Data":"f78c0a5db911f9b841b6c6f6186b6fc41b2b9e4bc3cd3dda8cb55d74a098abc0"} Dec 06 06:01:03 crc kubenswrapper[4733]: I1206 06:01:03.470175 4733 scope.go:117] "RemoveContainer" containerID="3f7a463e72992757246426f5aefa2b7fc372f23d7026624a258d101eb90648e2" Dec 06 06:01:03 crc kubenswrapper[4733]: I1206 06:01:03.470289 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 06 06:01:03 crc kubenswrapper[4733]: I1206 06:01:03.492467 4733 scope.go:117] "RemoveContainer" containerID="4a0f64d3ec2bba190a1052de1a5091d00a0730e065efe11899f16fb2b395d1e3" Dec 06 06:01:03 crc kubenswrapper[4733]: I1206 06:01:03.507957 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1dfba234-4e13-4f22-96cf-7f945f11d36e-logs\") pod \"1dfba234-4e13-4f22-96cf-7f945f11d36e\" (UID: \"1dfba234-4e13-4f22-96cf-7f945f11d36e\") " Dec 06 06:01:03 crc kubenswrapper[4733]: I1206 06:01:03.507982 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"1dfba234-4e13-4f22-96cf-7f945f11d36e\" (UID: \"1dfba234-4e13-4f22-96cf-7f945f11d36e\") " Dec 06 06:01:03 crc kubenswrapper[4733]: I1206 06:01:03.508013 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1dfba234-4e13-4f22-96cf-7f945f11d36e-httpd-run\") pod \"1dfba234-4e13-4f22-96cf-7f945f11d36e\" (UID: 
\"1dfba234-4e13-4f22-96cf-7f945f11d36e\") " Dec 06 06:01:03 crc kubenswrapper[4733]: I1206 06:01:03.508034 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dfba234-4e13-4f22-96cf-7f945f11d36e-config-data\") pod \"1dfba234-4e13-4f22-96cf-7f945f11d36e\" (UID: \"1dfba234-4e13-4f22-96cf-7f945f11d36e\") " Dec 06 06:01:03 crc kubenswrapper[4733]: I1206 06:01:03.508051 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdhrk\" (UniqueName: \"kubernetes.io/projected/1dfba234-4e13-4f22-96cf-7f945f11d36e-kube-api-access-mdhrk\") pod \"1dfba234-4e13-4f22-96cf-7f945f11d36e\" (UID: \"1dfba234-4e13-4f22-96cf-7f945f11d36e\") " Dec 06 06:01:03 crc kubenswrapper[4733]: I1206 06:01:03.508476 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1dfba234-4e13-4f22-96cf-7f945f11d36e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "1dfba234-4e13-4f22-96cf-7f945f11d36e" (UID: "1dfba234-4e13-4f22-96cf-7f945f11d36e"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:01:03 crc kubenswrapper[4733]: I1206 06:01:03.508298 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dfba234-4e13-4f22-96cf-7f945f11d36e-combined-ca-bundle\") pod \"1dfba234-4e13-4f22-96cf-7f945f11d36e\" (UID: \"1dfba234-4e13-4f22-96cf-7f945f11d36e\") " Dec 06 06:01:03 crc kubenswrapper[4733]: I1206 06:01:03.508741 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1dfba234-4e13-4f22-96cf-7f945f11d36e-scripts\") pod \"1dfba234-4e13-4f22-96cf-7f945f11d36e\" (UID: \"1dfba234-4e13-4f22-96cf-7f945f11d36e\") " Dec 06 06:01:03 crc kubenswrapper[4733]: I1206 06:01:03.508825 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1dfba234-4e13-4f22-96cf-7f945f11d36e-public-tls-certs\") pod \"1dfba234-4e13-4f22-96cf-7f945f11d36e\" (UID: \"1dfba234-4e13-4f22-96cf-7f945f11d36e\") " Dec 06 06:01:03 crc kubenswrapper[4733]: I1206 06:01:03.508838 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1dfba234-4e13-4f22-96cf-7f945f11d36e-logs" (OuterVolumeSpecName: "logs") pod "1dfba234-4e13-4f22-96cf-7f945f11d36e" (UID: "1dfba234-4e13-4f22-96cf-7f945f11d36e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:01:03 crc kubenswrapper[4733]: I1206 06:01:03.509738 4733 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1dfba234-4e13-4f22-96cf-7f945f11d36e-logs\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:03 crc kubenswrapper[4733]: I1206 06:01:03.509758 4733 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1dfba234-4e13-4f22-96cf-7f945f11d36e-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:03 crc kubenswrapper[4733]: I1206 06:01:03.515653 4733 scope.go:117] "RemoveContainer" containerID="3f7a463e72992757246426f5aefa2b7fc372f23d7026624a258d101eb90648e2" Dec 06 06:01:03 crc kubenswrapper[4733]: E1206 06:01:03.515968 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f7a463e72992757246426f5aefa2b7fc372f23d7026624a258d101eb90648e2\": container with ID starting with 3f7a463e72992757246426f5aefa2b7fc372f23d7026624a258d101eb90648e2 not found: ID does not exist" containerID="3f7a463e72992757246426f5aefa2b7fc372f23d7026624a258d101eb90648e2" Dec 06 06:01:03 crc kubenswrapper[4733]: I1206 06:01:03.515995 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f7a463e72992757246426f5aefa2b7fc372f23d7026624a258d101eb90648e2"} err="failed to get container status \"3f7a463e72992757246426f5aefa2b7fc372f23d7026624a258d101eb90648e2\": rpc error: code = NotFound desc = could not find container \"3f7a463e72992757246426f5aefa2b7fc372f23d7026624a258d101eb90648e2\": container with ID starting with 3f7a463e72992757246426f5aefa2b7fc372f23d7026624a258d101eb90648e2 not found: ID does not exist" Dec 06 06:01:03 crc kubenswrapper[4733]: I1206 06:01:03.516011 4733 scope.go:117] "RemoveContainer" containerID="4a0f64d3ec2bba190a1052de1a5091d00a0730e065efe11899f16fb2b395d1e3" Dec 06 06:01:03 crc 
kubenswrapper[4733]: E1206 06:01:03.516259 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a0f64d3ec2bba190a1052de1a5091d00a0730e065efe11899f16fb2b395d1e3\": container with ID starting with 4a0f64d3ec2bba190a1052de1a5091d00a0730e065efe11899f16fb2b395d1e3 not found: ID does not exist" containerID="4a0f64d3ec2bba190a1052de1a5091d00a0730e065efe11899f16fb2b395d1e3" Dec 06 06:01:03 crc kubenswrapper[4733]: I1206 06:01:03.516281 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a0f64d3ec2bba190a1052de1a5091d00a0730e065efe11899f16fb2b395d1e3"} err="failed to get container status \"4a0f64d3ec2bba190a1052de1a5091d00a0730e065efe11899f16fb2b395d1e3\": rpc error: code = NotFound desc = could not find container \"4a0f64d3ec2bba190a1052de1a5091d00a0730e065efe11899f16fb2b395d1e3\": container with ID starting with 4a0f64d3ec2bba190a1052de1a5091d00a0730e065efe11899f16fb2b395d1e3 not found: ID does not exist" Dec 06 06:01:03 crc kubenswrapper[4733]: I1206 06:01:03.520118 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "1dfba234-4e13-4f22-96cf-7f945f11d36e" (UID: "1dfba234-4e13-4f22-96cf-7f945f11d36e"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 06 06:01:03 crc kubenswrapper[4733]: I1206 06:01:03.523954 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dfba234-4e13-4f22-96cf-7f945f11d36e-kube-api-access-mdhrk" (OuterVolumeSpecName: "kube-api-access-mdhrk") pod "1dfba234-4e13-4f22-96cf-7f945f11d36e" (UID: "1dfba234-4e13-4f22-96cf-7f945f11d36e"). InnerVolumeSpecName "kube-api-access-mdhrk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:01:03 crc kubenswrapper[4733]: I1206 06:01:03.527118 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dfba234-4e13-4f22-96cf-7f945f11d36e-scripts" (OuterVolumeSpecName: "scripts") pod "1dfba234-4e13-4f22-96cf-7f945f11d36e" (UID: "1dfba234-4e13-4f22-96cf-7f945f11d36e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:01:03 crc kubenswrapper[4733]: I1206 06:01:03.555417 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dfba234-4e13-4f22-96cf-7f945f11d36e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1dfba234-4e13-4f22-96cf-7f945f11d36e" (UID: "1dfba234-4e13-4f22-96cf-7f945f11d36e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:01:03 crc kubenswrapper[4733]: I1206 06:01:03.568834 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dfba234-4e13-4f22-96cf-7f945f11d36e-config-data" (OuterVolumeSpecName: "config-data") pod "1dfba234-4e13-4f22-96cf-7f945f11d36e" (UID: "1dfba234-4e13-4f22-96cf-7f945f11d36e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:01:03 crc kubenswrapper[4733]: I1206 06:01:03.587144 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dfba234-4e13-4f22-96cf-7f945f11d36e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1dfba234-4e13-4f22-96cf-7f945f11d36e" (UID: "1dfba234-4e13-4f22-96cf-7f945f11d36e"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:01:03 crc kubenswrapper[4733]: I1206 06:01:03.618826 4733 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Dec 06 06:01:03 crc kubenswrapper[4733]: I1206 06:01:03.618856 4733 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dfba234-4e13-4f22-96cf-7f945f11d36e-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:03 crc kubenswrapper[4733]: I1206 06:01:03.618869 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdhrk\" (UniqueName: \"kubernetes.io/projected/1dfba234-4e13-4f22-96cf-7f945f11d36e-kube-api-access-mdhrk\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:03 crc kubenswrapper[4733]: I1206 06:01:03.618882 4733 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dfba234-4e13-4f22-96cf-7f945f11d36e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:03 crc kubenswrapper[4733]: I1206 06:01:03.618891 4733 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1dfba234-4e13-4f22-96cf-7f945f11d36e-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:03 crc kubenswrapper[4733]: I1206 06:01:03.618899 4733 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1dfba234-4e13-4f22-96cf-7f945f11d36e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:03 crc kubenswrapper[4733]: I1206 06:01:03.635209 4733 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Dec 06 06:01:03 crc kubenswrapper[4733]: I1206 06:01:03.720794 4733 reconciler_common.go:293] "Volume detached for volume 
\"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:03 crc kubenswrapper[4733]: I1206 06:01:03.796351 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 06:01:03 crc kubenswrapper[4733]: I1206 06:01:03.804059 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 06:01:03 crc kubenswrapper[4733]: I1206 06:01:03.815532 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 06:01:03 crc kubenswrapper[4733]: E1206 06:01:03.815950 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dfba234-4e13-4f22-96cf-7f945f11d36e" containerName="glance-httpd" Dec 06 06:01:03 crc kubenswrapper[4733]: I1206 06:01:03.815968 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dfba234-4e13-4f22-96cf-7f945f11d36e" containerName="glance-httpd" Dec 06 06:01:03 crc kubenswrapper[4733]: E1206 06:01:03.816005 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dfba234-4e13-4f22-96cf-7f945f11d36e" containerName="glance-log" Dec 06 06:01:03 crc kubenswrapper[4733]: I1206 06:01:03.816011 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dfba234-4e13-4f22-96cf-7f945f11d36e" containerName="glance-log" Dec 06 06:01:03 crc kubenswrapper[4733]: I1206 06:01:03.816205 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dfba234-4e13-4f22-96cf-7f945f11d36e" containerName="glance-httpd" Dec 06 06:01:03 crc kubenswrapper[4733]: I1206 06:01:03.816239 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dfba234-4e13-4f22-96cf-7f945f11d36e" containerName="glance-log" Dec 06 06:01:03 crc kubenswrapper[4733]: I1206 06:01:03.821318 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 06 06:01:03 crc kubenswrapper[4733]: I1206 06:01:03.823259 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 06 06:01:03 crc kubenswrapper[4733]: I1206 06:01:03.823476 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 06 06:01:03 crc kubenswrapper[4733]: I1206 06:01:03.835075 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 06:01:03 crc kubenswrapper[4733]: I1206 06:01:03.924922 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d54b6c3c-a2f1-45a1-97f3-a9e95b37f075-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d54b6c3c-a2f1-45a1-97f3-a9e95b37f075\") " pod="openstack/glance-default-external-api-0" Dec 06 06:01:03 crc kubenswrapper[4733]: I1206 06:01:03.925182 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d54b6c3c-a2f1-45a1-97f3-a9e95b37f075-config-data\") pod \"glance-default-external-api-0\" (UID: \"d54b6c3c-a2f1-45a1-97f3-a9e95b37f075\") " pod="openstack/glance-default-external-api-0" Dec 06 06:01:03 crc kubenswrapper[4733]: I1206 06:01:03.925231 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d54b6c3c-a2f1-45a1-97f3-a9e95b37f075-scripts\") pod \"glance-default-external-api-0\" (UID: \"d54b6c3c-a2f1-45a1-97f3-a9e95b37f075\") " pod="openstack/glance-default-external-api-0" Dec 06 06:01:03 crc kubenswrapper[4733]: I1206 06:01:03.925497 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"d54b6c3c-a2f1-45a1-97f3-a9e95b37f075\") " pod="openstack/glance-default-external-api-0" Dec 06 06:01:03 crc kubenswrapper[4733]: I1206 06:01:03.925566 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d54b6c3c-a2f1-45a1-97f3-a9e95b37f075-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d54b6c3c-a2f1-45a1-97f3-a9e95b37f075\") " pod="openstack/glance-default-external-api-0" Dec 06 06:01:03 crc kubenswrapper[4733]: I1206 06:01:03.925628 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d54b6c3c-a2f1-45a1-97f3-a9e95b37f075-logs\") pod \"glance-default-external-api-0\" (UID: \"d54b6c3c-a2f1-45a1-97f3-a9e95b37f075\") " pod="openstack/glance-default-external-api-0" Dec 06 06:01:03 crc kubenswrapper[4733]: I1206 06:01:03.925713 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d54b6c3c-a2f1-45a1-97f3-a9e95b37f075-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d54b6c3c-a2f1-45a1-97f3-a9e95b37f075\") " pod="openstack/glance-default-external-api-0" Dec 06 06:01:03 crc kubenswrapper[4733]: I1206 06:01:03.925766 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhw94\" (UniqueName: \"kubernetes.io/projected/d54b6c3c-a2f1-45a1-97f3-a9e95b37f075-kube-api-access-qhw94\") pod \"glance-default-external-api-0\" (UID: \"d54b6c3c-a2f1-45a1-97f3-a9e95b37f075\") " pod="openstack/glance-default-external-api-0" Dec 06 06:01:04 crc kubenswrapper[4733]: I1206 06:01:04.028103 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d54b6c3c-a2f1-45a1-97f3-a9e95b37f075-config-data\") pod \"glance-default-external-api-0\" (UID: \"d54b6c3c-a2f1-45a1-97f3-a9e95b37f075\") " pod="openstack/glance-default-external-api-0" Dec 06 06:01:04 crc kubenswrapper[4733]: I1206 06:01:04.028195 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d54b6c3c-a2f1-45a1-97f3-a9e95b37f075-scripts\") pod \"glance-default-external-api-0\" (UID: \"d54b6c3c-a2f1-45a1-97f3-a9e95b37f075\") " pod="openstack/glance-default-external-api-0" Dec 06 06:01:04 crc kubenswrapper[4733]: I1206 06:01:04.028375 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"d54b6c3c-a2f1-45a1-97f3-a9e95b37f075\") " pod="openstack/glance-default-external-api-0" Dec 06 06:01:04 crc kubenswrapper[4733]: I1206 06:01:04.028416 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d54b6c3c-a2f1-45a1-97f3-a9e95b37f075-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d54b6c3c-a2f1-45a1-97f3-a9e95b37f075\") " pod="openstack/glance-default-external-api-0" Dec 06 06:01:04 crc kubenswrapper[4733]: I1206 06:01:04.028472 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d54b6c3c-a2f1-45a1-97f3-a9e95b37f075-logs\") pod \"glance-default-external-api-0\" (UID: \"d54b6c3c-a2f1-45a1-97f3-a9e95b37f075\") " pod="openstack/glance-default-external-api-0" Dec 06 06:01:04 crc kubenswrapper[4733]: I1206 06:01:04.028526 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d54b6c3c-a2f1-45a1-97f3-a9e95b37f075-public-tls-certs\") pod \"glance-default-external-api-0\" 
(UID: \"d54b6c3c-a2f1-45a1-97f3-a9e95b37f075\") " pod="openstack/glance-default-external-api-0" Dec 06 06:01:04 crc kubenswrapper[4733]: I1206 06:01:04.028572 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhw94\" (UniqueName: \"kubernetes.io/projected/d54b6c3c-a2f1-45a1-97f3-a9e95b37f075-kube-api-access-qhw94\") pod \"glance-default-external-api-0\" (UID: \"d54b6c3c-a2f1-45a1-97f3-a9e95b37f075\") " pod="openstack/glance-default-external-api-0" Dec 06 06:01:04 crc kubenswrapper[4733]: I1206 06:01:04.028604 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d54b6c3c-a2f1-45a1-97f3-a9e95b37f075-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d54b6c3c-a2f1-45a1-97f3-a9e95b37f075\") " pod="openstack/glance-default-external-api-0" Dec 06 06:01:04 crc kubenswrapper[4733]: I1206 06:01:04.029638 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d54b6c3c-a2f1-45a1-97f3-a9e95b37f075-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d54b6c3c-a2f1-45a1-97f3-a9e95b37f075\") " pod="openstack/glance-default-external-api-0" Dec 06 06:01:04 crc kubenswrapper[4733]: I1206 06:01:04.030501 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d54b6c3c-a2f1-45a1-97f3-a9e95b37f075-logs\") pod \"glance-default-external-api-0\" (UID: \"d54b6c3c-a2f1-45a1-97f3-a9e95b37f075\") " pod="openstack/glance-default-external-api-0" Dec 06 06:01:04 crc kubenswrapper[4733]: I1206 06:01:04.030500 4733 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"d54b6c3c-a2f1-45a1-97f3-a9e95b37f075\") device mount path \"/mnt/openstack/pv06\"" 
pod="openstack/glance-default-external-api-0" Dec 06 06:01:04 crc kubenswrapper[4733]: I1206 06:01:04.033638 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d54b6c3c-a2f1-45a1-97f3-a9e95b37f075-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d54b6c3c-a2f1-45a1-97f3-a9e95b37f075\") " pod="openstack/glance-default-external-api-0" Dec 06 06:01:04 crc kubenswrapper[4733]: I1206 06:01:04.036029 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d54b6c3c-a2f1-45a1-97f3-a9e95b37f075-scripts\") pod \"glance-default-external-api-0\" (UID: \"d54b6c3c-a2f1-45a1-97f3-a9e95b37f075\") " pod="openstack/glance-default-external-api-0" Dec 06 06:01:04 crc kubenswrapper[4733]: I1206 06:01:04.038919 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d54b6c3c-a2f1-45a1-97f3-a9e95b37f075-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d54b6c3c-a2f1-45a1-97f3-a9e95b37f075\") " pod="openstack/glance-default-external-api-0" Dec 06 06:01:04 crc kubenswrapper[4733]: I1206 06:01:04.042186 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d54b6c3c-a2f1-45a1-97f3-a9e95b37f075-config-data\") pod \"glance-default-external-api-0\" (UID: \"d54b6c3c-a2f1-45a1-97f3-a9e95b37f075\") " pod="openstack/glance-default-external-api-0" Dec 06 06:01:04 crc kubenswrapper[4733]: I1206 06:01:04.048242 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhw94\" (UniqueName: \"kubernetes.io/projected/d54b6c3c-a2f1-45a1-97f3-a9e95b37f075-kube-api-access-qhw94\") pod \"glance-default-external-api-0\" (UID: \"d54b6c3c-a2f1-45a1-97f3-a9e95b37f075\") " pod="openstack/glance-default-external-api-0" Dec 06 06:01:04 crc kubenswrapper[4733]: 
I1206 06:01:04.065646 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"d54b6c3c-a2f1-45a1-97f3-a9e95b37f075\") " pod="openstack/glance-default-external-api-0" Dec 06 06:01:04 crc kubenswrapper[4733]: I1206 06:01:04.135298 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 06 06:01:04 crc kubenswrapper[4733]: I1206 06:01:04.489337 4733 generic.go:334] "Generic (PLEG): container finished" podID="a0063321-2625-4e9d-a536-38104f7d5879" containerID="584852523d5d205e570e24ef9ab94df6ea3bc964b7750689522895caac2f9eb1" exitCode=0 Dec 06 06:01:04 crc kubenswrapper[4733]: I1206 06:01:04.494958 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1dfba234-4e13-4f22-96cf-7f945f11d36e" path="/var/lib/kubelet/pods/1dfba234-4e13-4f22-96cf-7f945f11d36e/volumes" Dec 06 06:01:04 crc kubenswrapper[4733]: I1206 06:01:04.495587 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a0063321-2625-4e9d-a536-38104f7d5879","Type":"ContainerDied","Data":"584852523d5d205e570e24ef9ab94df6ea3bc964b7750689522895caac2f9eb1"} Dec 06 06:01:04 crc kubenswrapper[4733]: I1206 06:01:04.723180 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29416681-vtzbw" Dec 06 06:01:04 crc kubenswrapper[4733]: I1206 06:01:04.842534 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56c9830c-0996-408f-bb43-8d6e2d0eaa2a-combined-ca-bundle\") pod \"56c9830c-0996-408f-bb43-8d6e2d0eaa2a\" (UID: \"56c9830c-0996-408f-bb43-8d6e2d0eaa2a\") " Dec 06 06:01:04 crc kubenswrapper[4733]: I1206 06:01:04.842602 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56c9830c-0996-408f-bb43-8d6e2d0eaa2a-config-data\") pod \"56c9830c-0996-408f-bb43-8d6e2d0eaa2a\" (UID: \"56c9830c-0996-408f-bb43-8d6e2d0eaa2a\") " Dec 06 06:01:04 crc kubenswrapper[4733]: I1206 06:01:04.842693 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/56c9830c-0996-408f-bb43-8d6e2d0eaa2a-fernet-keys\") pod \"56c9830c-0996-408f-bb43-8d6e2d0eaa2a\" (UID: \"56c9830c-0996-408f-bb43-8d6e2d0eaa2a\") " Dec 06 06:01:04 crc kubenswrapper[4733]: I1206 06:01:04.842942 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb4sf\" (UniqueName: \"kubernetes.io/projected/56c9830c-0996-408f-bb43-8d6e2d0eaa2a-kube-api-access-sb4sf\") pod \"56c9830c-0996-408f-bb43-8d6e2d0eaa2a\" (UID: \"56c9830c-0996-408f-bb43-8d6e2d0eaa2a\") " Dec 06 06:01:04 crc kubenswrapper[4733]: I1206 06:01:04.846353 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56c9830c-0996-408f-bb43-8d6e2d0eaa2a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "56c9830c-0996-408f-bb43-8d6e2d0eaa2a" (UID: "56c9830c-0996-408f-bb43-8d6e2d0eaa2a"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:01:04 crc kubenswrapper[4733]: I1206 06:01:04.847114 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56c9830c-0996-408f-bb43-8d6e2d0eaa2a-kube-api-access-sb4sf" (OuterVolumeSpecName: "kube-api-access-sb4sf") pod "56c9830c-0996-408f-bb43-8d6e2d0eaa2a" (UID: "56c9830c-0996-408f-bb43-8d6e2d0eaa2a"). InnerVolumeSpecName "kube-api-access-sb4sf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:01:04 crc kubenswrapper[4733]: I1206 06:01:04.862775 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56c9830c-0996-408f-bb43-8d6e2d0eaa2a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "56c9830c-0996-408f-bb43-8d6e2d0eaa2a" (UID: "56c9830c-0996-408f-bb43-8d6e2d0eaa2a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:01:04 crc kubenswrapper[4733]: I1206 06:01:04.877292 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56c9830c-0996-408f-bb43-8d6e2d0eaa2a-config-data" (OuterVolumeSpecName: "config-data") pod "56c9830c-0996-408f-bb43-8d6e2d0eaa2a" (UID: "56c9830c-0996-408f-bb43-8d6e2d0eaa2a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:01:04 crc kubenswrapper[4733]: I1206 06:01:04.924998 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 06 06:01:04 crc kubenswrapper[4733]: I1206 06:01:04.944878 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb4sf\" (UniqueName: \"kubernetes.io/projected/56c9830c-0996-408f-bb43-8d6e2d0eaa2a-kube-api-access-sb4sf\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:04 crc kubenswrapper[4733]: I1206 06:01:04.944908 4733 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56c9830c-0996-408f-bb43-8d6e2d0eaa2a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:04 crc kubenswrapper[4733]: I1206 06:01:04.944921 4733 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56c9830c-0996-408f-bb43-8d6e2d0eaa2a-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:04 crc kubenswrapper[4733]: I1206 06:01:04.944931 4733 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/56c9830c-0996-408f-bb43-8d6e2d0eaa2a-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.046462 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0063321-2625-4e9d-a536-38104f7d5879-internal-tls-certs\") pod \"a0063321-2625-4e9d-a536-38104f7d5879\" (UID: \"a0063321-2625-4e9d-a536-38104f7d5879\") " Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.046856 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0063321-2625-4e9d-a536-38104f7d5879-config-data\") pod \"a0063321-2625-4e9d-a536-38104f7d5879\" (UID: \"a0063321-2625-4e9d-a536-38104f7d5879\") " Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.046892 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0063321-2625-4e9d-a536-38104f7d5879-logs\") pod \"a0063321-2625-4e9d-a536-38104f7d5879\" (UID: \"a0063321-2625-4e9d-a536-38104f7d5879\") " Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.047124 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"a0063321-2625-4e9d-a536-38104f7d5879\" (UID: \"a0063321-2625-4e9d-a536-38104f7d5879\") " Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.047168 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a0063321-2625-4e9d-a536-38104f7d5879-httpd-run\") pod \"a0063321-2625-4e9d-a536-38104f7d5879\" (UID: \"a0063321-2625-4e9d-a536-38104f7d5879\") " Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.047247 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtvdt\" (UniqueName: \"kubernetes.io/projected/a0063321-2625-4e9d-a536-38104f7d5879-kube-api-access-vtvdt\") pod \"a0063321-2625-4e9d-a536-38104f7d5879\" (UID: \"a0063321-2625-4e9d-a536-38104f7d5879\") " Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.047353 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0063321-2625-4e9d-a536-38104f7d5879-scripts\") pod \"a0063321-2625-4e9d-a536-38104f7d5879\" (UID: \"a0063321-2625-4e9d-a536-38104f7d5879\") " Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.047394 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0063321-2625-4e9d-a536-38104f7d5879-combined-ca-bundle\") pod \"a0063321-2625-4e9d-a536-38104f7d5879\" (UID: \"a0063321-2625-4e9d-a536-38104f7d5879\") " Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.048712 4733 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0063321-2625-4e9d-a536-38104f7d5879-logs" (OuterVolumeSpecName: "logs") pod "a0063321-2625-4e9d-a536-38104f7d5879" (UID: "a0063321-2625-4e9d-a536-38104f7d5879"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.049106 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0063321-2625-4e9d-a536-38104f7d5879-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a0063321-2625-4e9d-a536-38104f7d5879" (UID: "a0063321-2625-4e9d-a536-38104f7d5879"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.056025 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "a0063321-2625-4e9d-a536-38104f7d5879" (UID: "a0063321-2625-4e9d-a536-38104f7d5879"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.056257 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0063321-2625-4e9d-a536-38104f7d5879-kube-api-access-vtvdt" (OuterVolumeSpecName: "kube-api-access-vtvdt") pod "a0063321-2625-4e9d-a536-38104f7d5879" (UID: "a0063321-2625-4e9d-a536-38104f7d5879"). InnerVolumeSpecName "kube-api-access-vtvdt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.059566 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0063321-2625-4e9d-a536-38104f7d5879-scripts" (OuterVolumeSpecName: "scripts") pod "a0063321-2625-4e9d-a536-38104f7d5879" (UID: "a0063321-2625-4e9d-a536-38104f7d5879"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.081017 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0063321-2625-4e9d-a536-38104f7d5879-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a0063321-2625-4e9d-a536-38104f7d5879" (UID: "a0063321-2625-4e9d-a536-38104f7d5879"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.101854 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0063321-2625-4e9d-a536-38104f7d5879-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a0063321-2625-4e9d-a536-38104f7d5879" (UID: "a0063321-2625-4e9d-a536-38104f7d5879"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.104471 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0063321-2625-4e9d-a536-38104f7d5879-config-data" (OuterVolumeSpecName: "config-data") pod "a0063321-2625-4e9d-a536-38104f7d5879" (UID: "a0063321-2625-4e9d-a536-38104f7d5879"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.151694 4733 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.151727 4733 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a0063321-2625-4e9d-a536-38104f7d5879-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.151740 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vtvdt\" (UniqueName: \"kubernetes.io/projected/a0063321-2625-4e9d-a536-38104f7d5879-kube-api-access-vtvdt\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.151754 4733 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0063321-2625-4e9d-a536-38104f7d5879-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.151765 4733 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0063321-2625-4e9d-a536-38104f7d5879-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.151774 4733 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0063321-2625-4e9d-a536-38104f7d5879-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.151785 4733 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0063321-2625-4e9d-a536-38104f7d5879-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.151793 4733 reconciler_common.go:293] 
"Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0063321-2625-4e9d-a536-38104f7d5879-logs\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.168860 4733 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.254198 4733 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.451160 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 06:01:05 crc kubenswrapper[4733]: W1206 06:01:05.455577 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd54b6c3c_a2f1_45a1_97f3_a9e95b37f075.slice/crio-be2a631722730a277393eab37efbe66cdf8e2d21b6b27ef4ac683a4bb8599ad9 WatchSource:0}: Error finding container be2a631722730a277393eab37efbe66cdf8e2d21b6b27ef4ac683a4bb8599ad9: Status 404 returned error can't find the container with id be2a631722730a277393eab37efbe66cdf8e2d21b6b27ef4ac683a4bb8599ad9 Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.506241 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a0063321-2625-4e9d-a536-38104f7d5879","Type":"ContainerDied","Data":"2b0ce20663c6a658b0355e15034e57f7723b14fd4e5aa6eee46ca3382bd868c2"} Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.506293 4733 scope.go:117] "RemoveContainer" containerID="584852523d5d205e570e24ef9ab94df6ea3bc964b7750689522895caac2f9eb1" Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.506414 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.513451 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d54b6c3c-a2f1-45a1-97f3-a9e95b37f075","Type":"ContainerStarted","Data":"be2a631722730a277393eab37efbe66cdf8e2d21b6b27ef4ac683a4bb8599ad9"} Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.515425 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29416681-vtzbw" event={"ID":"56c9830c-0996-408f-bb43-8d6e2d0eaa2a","Type":"ContainerDied","Data":"87849f1769634c17dccbded406429e1985560d521c6bd9ed654dcf370994dd16"} Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.515465 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87849f1769634c17dccbded406429e1985560d521c6bd9ed654dcf370994dd16" Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.515507 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29416681-vtzbw" Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.524487 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"487a854c-dc9b-430b-b00f-ff16ec35d999","Type":"ContainerStarted","Data":"6a4b94379d2b26f2fb35a86ca47f888ce9e1cbd6a1d05ec9527c974b907c5f03"} Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.524610 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="487a854c-dc9b-430b-b00f-ff16ec35d999" containerName="ceilometer-central-agent" containerID="cri-o://f85149eb08adece19573229d1ab4f9ae859cd47d30d89fa7f4d446361d48fa00" gracePeriod=30 Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.524850 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.524899 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="487a854c-dc9b-430b-b00f-ff16ec35d999" containerName="proxy-httpd" containerID="cri-o://6a4b94379d2b26f2fb35a86ca47f888ce9e1cbd6a1d05ec9527c974b907c5f03" gracePeriod=30 Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.524945 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="487a854c-dc9b-430b-b00f-ff16ec35d999" containerName="sg-core" containerID="cri-o://57f2bba81913d27e2df95c5926c466ae30fc56bccc90dbcb7528c0a1219efea0" gracePeriod=30 Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.524980 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="487a854c-dc9b-430b-b00f-ff16ec35d999" containerName="ceilometer-notification-agent" containerID="cri-o://baa7bc5dd7d95261eb369128ab23592953af6691f118cf3d522c4e366b1b4cf7" gracePeriod=30 Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.551244 4733 
scope.go:117] "RemoveContainer" containerID="495e4e31d32d088c287120beeba2fca7f8c1caa0e3ca134c8216be79a4714852" Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.569371 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.583227 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.590896 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.133665149 podStartE2EDuration="6.590882642s" podCreationTimestamp="2025-12-06 06:00:59 +0000 UTC" firstStartedPulling="2025-12-06 06:01:00.340427664 +0000 UTC m=+1044.205638775" lastFinishedPulling="2025-12-06 06:01:04.797645157 +0000 UTC m=+1048.662856268" observedRunningTime="2025-12-06 06:01:05.56152689 +0000 UTC m=+1049.426738001" watchObservedRunningTime="2025-12-06 06:01:05.590882642 +0000 UTC m=+1049.456093752" Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.591212 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 06:01:05 crc kubenswrapper[4733]: E1206 06:01:05.591646 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56c9830c-0996-408f-bb43-8d6e2d0eaa2a" containerName="keystone-cron" Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.592002 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="56c9830c-0996-408f-bb43-8d6e2d0eaa2a" containerName="keystone-cron" Dec 06 06:01:05 crc kubenswrapper[4733]: E1206 06:01:05.592039 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0063321-2625-4e9d-a536-38104f7d5879" containerName="glance-log" Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.592047 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0063321-2625-4e9d-a536-38104f7d5879" containerName="glance-log" Dec 06 
06:01:05 crc kubenswrapper[4733]: E1206 06:01:05.592060 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0063321-2625-4e9d-a536-38104f7d5879" containerName="glance-httpd" Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.592067 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0063321-2625-4e9d-a536-38104f7d5879" containerName="glance-httpd" Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.592242 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0063321-2625-4e9d-a536-38104f7d5879" containerName="glance-log" Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.592258 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="56c9830c-0996-408f-bb43-8d6e2d0eaa2a" containerName="keystone-cron" Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.592276 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0063321-2625-4e9d-a536-38104f7d5879" containerName="glance-httpd" Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.593713 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.599028 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.599247 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.608967 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.661078 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e808010-50b4-4eb5-8dcb-5fa7f7cd7abe-logs\") pod \"glance-default-internal-api-0\" (UID: \"4e808010-50b4-4eb5-8dcb-5fa7f7cd7abe\") " pod="openstack/glance-default-internal-api-0" Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.661136 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e808010-50b4-4eb5-8dcb-5fa7f7cd7abe-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4e808010-50b4-4eb5-8dcb-5fa7f7cd7abe\") " pod="openstack/glance-default-internal-api-0" Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.661212 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e808010-50b4-4eb5-8dcb-5fa7f7cd7abe-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"4e808010-50b4-4eb5-8dcb-5fa7f7cd7abe\") " pod="openstack/glance-default-internal-api-0" Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.661321 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tnh2\" 
(UniqueName: \"kubernetes.io/projected/4e808010-50b4-4eb5-8dcb-5fa7f7cd7abe-kube-api-access-5tnh2\") pod \"glance-default-internal-api-0\" (UID: \"4e808010-50b4-4eb5-8dcb-5fa7f7cd7abe\") " pod="openstack/glance-default-internal-api-0" Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.661355 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e808010-50b4-4eb5-8dcb-5fa7f7cd7abe-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4e808010-50b4-4eb5-8dcb-5fa7f7cd7abe\") " pod="openstack/glance-default-internal-api-0" Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.661404 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e808010-50b4-4eb5-8dcb-5fa7f7cd7abe-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4e808010-50b4-4eb5-8dcb-5fa7f7cd7abe\") " pod="openstack/glance-default-internal-api-0" Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.661467 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"4e808010-50b4-4eb5-8dcb-5fa7f7cd7abe\") " pod="openstack/glance-default-internal-api-0" Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.661564 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4e808010-50b4-4eb5-8dcb-5fa7f7cd7abe-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4e808010-50b4-4eb5-8dcb-5fa7f7cd7abe\") " pod="openstack/glance-default-internal-api-0" Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.763681 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/4e808010-50b4-4eb5-8dcb-5fa7f7cd7abe-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4e808010-50b4-4eb5-8dcb-5fa7f7cd7abe\") " pod="openstack/glance-default-internal-api-0" Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.763820 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e808010-50b4-4eb5-8dcb-5fa7f7cd7abe-logs\") pod \"glance-default-internal-api-0\" (UID: \"4e808010-50b4-4eb5-8dcb-5fa7f7cd7abe\") " pod="openstack/glance-default-internal-api-0" Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.763854 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e808010-50b4-4eb5-8dcb-5fa7f7cd7abe-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4e808010-50b4-4eb5-8dcb-5fa7f7cd7abe\") " pod="openstack/glance-default-internal-api-0" Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.763887 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e808010-50b4-4eb5-8dcb-5fa7f7cd7abe-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"4e808010-50b4-4eb5-8dcb-5fa7f7cd7abe\") " pod="openstack/glance-default-internal-api-0" Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.763973 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tnh2\" (UniqueName: \"kubernetes.io/projected/4e808010-50b4-4eb5-8dcb-5fa7f7cd7abe-kube-api-access-5tnh2\") pod \"glance-default-internal-api-0\" (UID: \"4e808010-50b4-4eb5-8dcb-5fa7f7cd7abe\") " pod="openstack/glance-default-internal-api-0" Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.763999 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/4e808010-50b4-4eb5-8dcb-5fa7f7cd7abe-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4e808010-50b4-4eb5-8dcb-5fa7f7cd7abe\") " pod="openstack/glance-default-internal-api-0" Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.764042 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e808010-50b4-4eb5-8dcb-5fa7f7cd7abe-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4e808010-50b4-4eb5-8dcb-5fa7f7cd7abe\") " pod="openstack/glance-default-internal-api-0" Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.764091 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"4e808010-50b4-4eb5-8dcb-5fa7f7cd7abe\") " pod="openstack/glance-default-internal-api-0" Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.764159 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4e808010-50b4-4eb5-8dcb-5fa7f7cd7abe-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4e808010-50b4-4eb5-8dcb-5fa7f7cd7abe\") " pod="openstack/glance-default-internal-api-0" Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.764381 4733 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"4e808010-50b4-4eb5-8dcb-5fa7f7cd7abe\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-internal-api-0" Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.764394 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e808010-50b4-4eb5-8dcb-5fa7f7cd7abe-logs\") pod 
\"glance-default-internal-api-0\" (UID: \"4e808010-50b4-4eb5-8dcb-5fa7f7cd7abe\") " pod="openstack/glance-default-internal-api-0" Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.772957 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e808010-50b4-4eb5-8dcb-5fa7f7cd7abe-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"4e808010-50b4-4eb5-8dcb-5fa7f7cd7abe\") " pod="openstack/glance-default-internal-api-0" Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.773773 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e808010-50b4-4eb5-8dcb-5fa7f7cd7abe-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4e808010-50b4-4eb5-8dcb-5fa7f7cd7abe\") " pod="openstack/glance-default-internal-api-0" Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.775125 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e808010-50b4-4eb5-8dcb-5fa7f7cd7abe-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4e808010-50b4-4eb5-8dcb-5fa7f7cd7abe\") " pod="openstack/glance-default-internal-api-0" Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.775207 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e808010-50b4-4eb5-8dcb-5fa7f7cd7abe-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4e808010-50b4-4eb5-8dcb-5fa7f7cd7abe\") " pod="openstack/glance-default-internal-api-0" Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.785121 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tnh2\" (UniqueName: \"kubernetes.io/projected/4e808010-50b4-4eb5-8dcb-5fa7f7cd7abe-kube-api-access-5tnh2\") pod \"glance-default-internal-api-0\" (UID: \"4e808010-50b4-4eb5-8dcb-5fa7f7cd7abe\") " 
pod="openstack/glance-default-internal-api-0" Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.789232 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"4e808010-50b4-4eb5-8dcb-5fa7f7cd7abe\") " pod="openstack/glance-default-internal-api-0" Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.876539 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-gn769"] Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.877821 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-gn769" Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.879899 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.879931 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.881724 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-vdgq5" Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.892677 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-gn769"] Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.925393 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.968068 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8505186d-6d93-4df5-8f82-4b7747029d79-scripts\") pod \"nova-cell0-conductor-db-sync-gn769\" (UID: \"8505186d-6d93-4df5-8f82-4b7747029d79\") " pod="openstack/nova-cell0-conductor-db-sync-gn769" Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.968410 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2kgd\" (UniqueName: \"kubernetes.io/projected/8505186d-6d93-4df5-8f82-4b7747029d79-kube-api-access-s2kgd\") pod \"nova-cell0-conductor-db-sync-gn769\" (UID: \"8505186d-6d93-4df5-8f82-4b7747029d79\") " pod="openstack/nova-cell0-conductor-db-sync-gn769" Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.968600 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8505186d-6d93-4df5-8f82-4b7747029d79-config-data\") pod \"nova-cell0-conductor-db-sync-gn769\" (UID: \"8505186d-6d93-4df5-8f82-4b7747029d79\") " pod="openstack/nova-cell0-conductor-db-sync-gn769" Dec 06 06:01:05 crc kubenswrapper[4733]: I1206 06:01:05.968849 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8505186d-6d93-4df5-8f82-4b7747029d79-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-gn769\" (UID: \"8505186d-6d93-4df5-8f82-4b7747029d79\") " pod="openstack/nova-cell0-conductor-db-sync-gn769" Dec 06 06:01:06 crc kubenswrapper[4733]: I1206 06:01:06.071356 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8505186d-6d93-4df5-8f82-4b7747029d79-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-gn769\" (UID: \"8505186d-6d93-4df5-8f82-4b7747029d79\") " pod="openstack/nova-cell0-conductor-db-sync-gn769" Dec 06 06:01:06 crc kubenswrapper[4733]: I1206 06:01:06.071472 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8505186d-6d93-4df5-8f82-4b7747029d79-scripts\") pod \"nova-cell0-conductor-db-sync-gn769\" (UID: \"8505186d-6d93-4df5-8f82-4b7747029d79\") " pod="openstack/nova-cell0-conductor-db-sync-gn769" Dec 06 06:01:06 crc kubenswrapper[4733]: I1206 06:01:06.071524 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kgd\" (UniqueName: \"kubernetes.io/projected/8505186d-6d93-4df5-8f82-4b7747029d79-kube-api-access-s2kgd\") pod \"nova-cell0-conductor-db-sync-gn769\" (UID: \"8505186d-6d93-4df5-8f82-4b7747029d79\") " pod="openstack/nova-cell0-conductor-db-sync-gn769" Dec 06 06:01:06 crc kubenswrapper[4733]: I1206 06:01:06.071639 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8505186d-6d93-4df5-8f82-4b7747029d79-config-data\") pod \"nova-cell0-conductor-db-sync-gn769\" (UID: \"8505186d-6d93-4df5-8f82-4b7747029d79\") " pod="openstack/nova-cell0-conductor-db-sync-gn769" Dec 06 06:01:06 crc kubenswrapper[4733]: I1206 06:01:06.075958 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8505186d-6d93-4df5-8f82-4b7747029d79-config-data\") pod \"nova-cell0-conductor-db-sync-gn769\" (UID: \"8505186d-6d93-4df5-8f82-4b7747029d79\") " pod="openstack/nova-cell0-conductor-db-sync-gn769" Dec 06 06:01:06 crc kubenswrapper[4733]: I1206 06:01:06.077979 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8505186d-6d93-4df5-8f82-4b7747029d79-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-gn769\" (UID: \"8505186d-6d93-4df5-8f82-4b7747029d79\") " pod="openstack/nova-cell0-conductor-db-sync-gn769" Dec 06 06:01:06 crc kubenswrapper[4733]: I1206 06:01:06.082852 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8505186d-6d93-4df5-8f82-4b7747029d79-scripts\") pod \"nova-cell0-conductor-db-sync-gn769\" (UID: \"8505186d-6d93-4df5-8f82-4b7747029d79\") " pod="openstack/nova-cell0-conductor-db-sync-gn769" Dec 06 06:01:06 crc kubenswrapper[4733]: I1206 06:01:06.088034 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kgd\" (UniqueName: \"kubernetes.io/projected/8505186d-6d93-4df5-8f82-4b7747029d79-kube-api-access-s2kgd\") pod \"nova-cell0-conductor-db-sync-gn769\" (UID: \"8505186d-6d93-4df5-8f82-4b7747029d79\") " pod="openstack/nova-cell0-conductor-db-sync-gn769" Dec 06 06:01:06 crc kubenswrapper[4733]: I1206 06:01:06.194819 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-gn769" Dec 06 06:01:06 crc kubenswrapper[4733]: I1206 06:01:06.432570 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 06:01:06 crc kubenswrapper[4733]: I1206 06:01:06.511154 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0063321-2625-4e9d-a536-38104f7d5879" path="/var/lib/kubelet/pods/a0063321-2625-4e9d-a536-38104f7d5879/volumes" Dec 06 06:01:06 crc kubenswrapper[4733]: I1206 06:01:06.558865 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4e808010-50b4-4eb5-8dcb-5fa7f7cd7abe","Type":"ContainerStarted","Data":"6ff925c8c7467a081385a07827221082d87de9aff00d0ee51f133c3350e787a2"} Dec 06 06:01:06 crc kubenswrapper[4733]: I1206 06:01:06.562707 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d54b6c3c-a2f1-45a1-97f3-a9e95b37f075","Type":"ContainerStarted","Data":"a0233669d8a6060fa0b94c442fed72a38f90e02dc20ffa0a6f015e06206a1eb8"} Dec 06 06:01:06 crc kubenswrapper[4733]: I1206 06:01:06.567537 4733 generic.go:334] "Generic (PLEG): container finished" podID="487a854c-dc9b-430b-b00f-ff16ec35d999" containerID="6a4b94379d2b26f2fb35a86ca47f888ce9e1cbd6a1d05ec9527c974b907c5f03" exitCode=0 Dec 06 06:01:06 crc kubenswrapper[4733]: I1206 06:01:06.567569 4733 generic.go:334] "Generic (PLEG): container finished" podID="487a854c-dc9b-430b-b00f-ff16ec35d999" containerID="57f2bba81913d27e2df95c5926c466ae30fc56bccc90dbcb7528c0a1219efea0" exitCode=2 Dec 06 06:01:06 crc kubenswrapper[4733]: I1206 06:01:06.567576 4733 generic.go:334] "Generic (PLEG): container finished" podID="487a854c-dc9b-430b-b00f-ff16ec35d999" containerID="baa7bc5dd7d95261eb369128ab23592953af6691f118cf3d522c4e366b1b4cf7" exitCode=0 Dec 06 06:01:06 crc kubenswrapper[4733]: I1206 06:01:06.567621 4733 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/ceilometer-0" event={"ID":"487a854c-dc9b-430b-b00f-ff16ec35d999","Type":"ContainerDied","Data":"6a4b94379d2b26f2fb35a86ca47f888ce9e1cbd6a1d05ec9527c974b907c5f03"} Dec 06 06:01:06 crc kubenswrapper[4733]: I1206 06:01:06.567692 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"487a854c-dc9b-430b-b00f-ff16ec35d999","Type":"ContainerDied","Data":"57f2bba81913d27e2df95c5926c466ae30fc56bccc90dbcb7528c0a1219efea0"} Dec 06 06:01:06 crc kubenswrapper[4733]: I1206 06:01:06.567704 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"487a854c-dc9b-430b-b00f-ff16ec35d999","Type":"ContainerDied","Data":"baa7bc5dd7d95261eb369128ab23592953af6691f118cf3d522c4e366b1b4cf7"} Dec 06 06:01:06 crc kubenswrapper[4733]: I1206 06:01:06.633968 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-gn769"] Dec 06 06:01:07 crc kubenswrapper[4733]: I1206 06:01:07.590800 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4e808010-50b4-4eb5-8dcb-5fa7f7cd7abe","Type":"ContainerStarted","Data":"f993088dbfcf15a6fc66ea38d736608b96c7845f72043198896284bfbaab1b71"} Dec 06 06:01:07 crc kubenswrapper[4733]: I1206 06:01:07.591046 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4e808010-50b4-4eb5-8dcb-5fa7f7cd7abe","Type":"ContainerStarted","Data":"d0345821ec1d655566c36e97792210fa6659e111a8ea3f9ddb07af03a4194d0b"} Dec 06 06:01:07 crc kubenswrapper[4733]: I1206 06:01:07.593425 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d54b6c3c-a2f1-45a1-97f3-a9e95b37f075","Type":"ContainerStarted","Data":"e02a80d78be06ce30a2884962a71f15e984d01f32e90dc6f5eb009b1cafaad97"} Dec 06 06:01:07 crc kubenswrapper[4733]: I1206 06:01:07.595999 4733 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/nova-cell0-conductor-db-sync-gn769" event={"ID":"8505186d-6d93-4df5-8f82-4b7747029d79","Type":"ContainerStarted","Data":"199940e2a26f3715198cac89139fcd4beef52543c22af1a8038f3a06928b9e03"} Dec 06 06:01:07 crc kubenswrapper[4733]: I1206 06:01:07.617449 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=2.617422472 podStartE2EDuration="2.617422472s" podCreationTimestamp="2025-12-06 06:01:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:01:07.612563195 +0000 UTC m=+1051.477774306" watchObservedRunningTime="2025-12-06 06:01:07.617422472 +0000 UTC m=+1051.482633584" Dec 06 06:01:07 crc kubenswrapper[4733]: I1206 06:01:07.640636 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.640607301 podStartE2EDuration="4.640607301s" podCreationTimestamp="2025-12-06 06:01:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:01:07.62944438 +0000 UTC m=+1051.494655490" watchObservedRunningTime="2025-12-06 06:01:07.640607301 +0000 UTC m=+1051.505818412" Dec 06 06:01:10 crc kubenswrapper[4733]: I1206 06:01:10.466031 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 06:01:10 crc kubenswrapper[4733]: I1206 06:01:10.575793 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/487a854c-dc9b-430b-b00f-ff16ec35d999-run-httpd\") pod \"487a854c-dc9b-430b-b00f-ff16ec35d999\" (UID: \"487a854c-dc9b-430b-b00f-ff16ec35d999\") " Dec 06 06:01:10 crc kubenswrapper[4733]: I1206 06:01:10.576054 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/487a854c-dc9b-430b-b00f-ff16ec35d999-log-httpd\") pod \"487a854c-dc9b-430b-b00f-ff16ec35d999\" (UID: \"487a854c-dc9b-430b-b00f-ff16ec35d999\") " Dec 06 06:01:10 crc kubenswrapper[4733]: I1206 06:01:10.576186 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/487a854c-dc9b-430b-b00f-ff16ec35d999-config-data\") pod \"487a854c-dc9b-430b-b00f-ff16ec35d999\" (UID: \"487a854c-dc9b-430b-b00f-ff16ec35d999\") " Dec 06 06:01:10 crc kubenswrapper[4733]: I1206 06:01:10.576316 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rgvk\" (UniqueName: \"kubernetes.io/projected/487a854c-dc9b-430b-b00f-ff16ec35d999-kube-api-access-8rgvk\") pod \"487a854c-dc9b-430b-b00f-ff16ec35d999\" (UID: \"487a854c-dc9b-430b-b00f-ff16ec35d999\") " Dec 06 06:01:10 crc kubenswrapper[4733]: I1206 06:01:10.576423 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/487a854c-dc9b-430b-b00f-ff16ec35d999-combined-ca-bundle\") pod \"487a854c-dc9b-430b-b00f-ff16ec35d999\" (UID: \"487a854c-dc9b-430b-b00f-ff16ec35d999\") " Dec 06 06:01:10 crc kubenswrapper[4733]: I1206 06:01:10.576571 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/487a854c-dc9b-430b-b00f-ff16ec35d999-scripts\") pod \"487a854c-dc9b-430b-b00f-ff16ec35d999\" (UID: \"487a854c-dc9b-430b-b00f-ff16ec35d999\") " Dec 06 06:01:10 crc kubenswrapper[4733]: I1206 06:01:10.576672 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/487a854c-dc9b-430b-b00f-ff16ec35d999-sg-core-conf-yaml\") pod \"487a854c-dc9b-430b-b00f-ff16ec35d999\" (UID: \"487a854c-dc9b-430b-b00f-ff16ec35d999\") " Dec 06 06:01:10 crc kubenswrapper[4733]: I1206 06:01:10.576430 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/487a854c-dc9b-430b-b00f-ff16ec35d999-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "487a854c-dc9b-430b-b00f-ff16ec35d999" (UID: "487a854c-dc9b-430b-b00f-ff16ec35d999"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:01:10 crc kubenswrapper[4733]: I1206 06:01:10.577599 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/487a854c-dc9b-430b-b00f-ff16ec35d999-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "487a854c-dc9b-430b-b00f-ff16ec35d999" (UID: "487a854c-dc9b-430b-b00f-ff16ec35d999"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:01:10 crc kubenswrapper[4733]: I1206 06:01:10.577885 4733 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/487a854c-dc9b-430b-b00f-ff16ec35d999-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:10 crc kubenswrapper[4733]: I1206 06:01:10.596234 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/487a854c-dc9b-430b-b00f-ff16ec35d999-kube-api-access-8rgvk" (OuterVolumeSpecName: "kube-api-access-8rgvk") pod "487a854c-dc9b-430b-b00f-ff16ec35d999" (UID: "487a854c-dc9b-430b-b00f-ff16ec35d999"). 
InnerVolumeSpecName "kube-api-access-8rgvk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:01:10 crc kubenswrapper[4733]: I1206 06:01:10.596945 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/487a854c-dc9b-430b-b00f-ff16ec35d999-scripts" (OuterVolumeSpecName: "scripts") pod "487a854c-dc9b-430b-b00f-ff16ec35d999" (UID: "487a854c-dc9b-430b-b00f-ff16ec35d999"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:01:10 crc kubenswrapper[4733]: I1206 06:01:10.606015 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/487a854c-dc9b-430b-b00f-ff16ec35d999-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "487a854c-dc9b-430b-b00f-ff16ec35d999" (UID: "487a854c-dc9b-430b-b00f-ff16ec35d999"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:01:10 crc kubenswrapper[4733]: I1206 06:01:10.639600 4733 generic.go:334] "Generic (PLEG): container finished" podID="487a854c-dc9b-430b-b00f-ff16ec35d999" containerID="f85149eb08adece19573229d1ab4f9ae859cd47d30d89fa7f4d446361d48fa00" exitCode=0 Dec 06 06:01:10 crc kubenswrapper[4733]: I1206 06:01:10.639659 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"487a854c-dc9b-430b-b00f-ff16ec35d999","Type":"ContainerDied","Data":"f85149eb08adece19573229d1ab4f9ae859cd47d30d89fa7f4d446361d48fa00"} Dec 06 06:01:10 crc kubenswrapper[4733]: I1206 06:01:10.639698 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 06:01:10 crc kubenswrapper[4733]: I1206 06:01:10.639718 4733 scope.go:117] "RemoveContainer" containerID="6a4b94379d2b26f2fb35a86ca47f888ce9e1cbd6a1d05ec9527c974b907c5f03" Dec 06 06:01:10 crc kubenswrapper[4733]: I1206 06:01:10.639703 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"487a854c-dc9b-430b-b00f-ff16ec35d999","Type":"ContainerDied","Data":"f82374468787ca5715533c5258e782cfc98c75a63c22b3e6e6281cffb2d5ce82"} Dec 06 06:01:10 crc kubenswrapper[4733]: I1206 06:01:10.647798 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/487a854c-dc9b-430b-b00f-ff16ec35d999-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "487a854c-dc9b-430b-b00f-ff16ec35d999" (UID: "487a854c-dc9b-430b-b00f-ff16ec35d999"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:01:10 crc kubenswrapper[4733]: I1206 06:01:10.674091 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/487a854c-dc9b-430b-b00f-ff16ec35d999-config-data" (OuterVolumeSpecName: "config-data") pod "487a854c-dc9b-430b-b00f-ff16ec35d999" (UID: "487a854c-dc9b-430b-b00f-ff16ec35d999"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:01:10 crc kubenswrapper[4733]: I1206 06:01:10.678995 4733 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/487a854c-dc9b-430b-b00f-ff16ec35d999-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:10 crc kubenswrapper[4733]: I1206 06:01:10.679026 4733 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/487a854c-dc9b-430b-b00f-ff16ec35d999-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:10 crc kubenswrapper[4733]: I1206 06:01:10.679040 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rgvk\" (UniqueName: \"kubernetes.io/projected/487a854c-dc9b-430b-b00f-ff16ec35d999-kube-api-access-8rgvk\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:10 crc kubenswrapper[4733]: I1206 06:01:10.679053 4733 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/487a854c-dc9b-430b-b00f-ff16ec35d999-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:10 crc kubenswrapper[4733]: I1206 06:01:10.679062 4733 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/487a854c-dc9b-430b-b00f-ff16ec35d999-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:10 crc kubenswrapper[4733]: I1206 06:01:10.679071 4733 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/487a854c-dc9b-430b-b00f-ff16ec35d999-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:10 crc kubenswrapper[4733]: I1206 06:01:10.680525 4733 scope.go:117] "RemoveContainer" containerID="57f2bba81913d27e2df95c5926c466ae30fc56bccc90dbcb7528c0a1219efea0" Dec 06 06:01:10 crc kubenswrapper[4733]: I1206 06:01:10.704220 4733 scope.go:117] "RemoveContainer" 
containerID="baa7bc5dd7d95261eb369128ab23592953af6691f118cf3d522c4e366b1b4cf7" Dec 06 06:01:10 crc kubenswrapper[4733]: I1206 06:01:10.725458 4733 scope.go:117] "RemoveContainer" containerID="f85149eb08adece19573229d1ab4f9ae859cd47d30d89fa7f4d446361d48fa00" Dec 06 06:01:10 crc kubenswrapper[4733]: I1206 06:01:10.745694 4733 scope.go:117] "RemoveContainer" containerID="6a4b94379d2b26f2fb35a86ca47f888ce9e1cbd6a1d05ec9527c974b907c5f03" Dec 06 06:01:10 crc kubenswrapper[4733]: E1206 06:01:10.746454 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a4b94379d2b26f2fb35a86ca47f888ce9e1cbd6a1d05ec9527c974b907c5f03\": container with ID starting with 6a4b94379d2b26f2fb35a86ca47f888ce9e1cbd6a1d05ec9527c974b907c5f03 not found: ID does not exist" containerID="6a4b94379d2b26f2fb35a86ca47f888ce9e1cbd6a1d05ec9527c974b907c5f03" Dec 06 06:01:10 crc kubenswrapper[4733]: I1206 06:01:10.746515 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a4b94379d2b26f2fb35a86ca47f888ce9e1cbd6a1d05ec9527c974b907c5f03"} err="failed to get container status \"6a4b94379d2b26f2fb35a86ca47f888ce9e1cbd6a1d05ec9527c974b907c5f03\": rpc error: code = NotFound desc = could not find container \"6a4b94379d2b26f2fb35a86ca47f888ce9e1cbd6a1d05ec9527c974b907c5f03\": container with ID starting with 6a4b94379d2b26f2fb35a86ca47f888ce9e1cbd6a1d05ec9527c974b907c5f03 not found: ID does not exist" Dec 06 06:01:10 crc kubenswrapper[4733]: I1206 06:01:10.746584 4733 scope.go:117] "RemoveContainer" containerID="57f2bba81913d27e2df95c5926c466ae30fc56bccc90dbcb7528c0a1219efea0" Dec 06 06:01:10 crc kubenswrapper[4733]: E1206 06:01:10.747004 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57f2bba81913d27e2df95c5926c466ae30fc56bccc90dbcb7528c0a1219efea0\": container with ID starting with 
57f2bba81913d27e2df95c5926c466ae30fc56bccc90dbcb7528c0a1219efea0 not found: ID does not exist" containerID="57f2bba81913d27e2df95c5926c466ae30fc56bccc90dbcb7528c0a1219efea0" Dec 06 06:01:10 crc kubenswrapper[4733]: I1206 06:01:10.747039 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57f2bba81913d27e2df95c5926c466ae30fc56bccc90dbcb7528c0a1219efea0"} err="failed to get container status \"57f2bba81913d27e2df95c5926c466ae30fc56bccc90dbcb7528c0a1219efea0\": rpc error: code = NotFound desc = could not find container \"57f2bba81913d27e2df95c5926c466ae30fc56bccc90dbcb7528c0a1219efea0\": container with ID starting with 57f2bba81913d27e2df95c5926c466ae30fc56bccc90dbcb7528c0a1219efea0 not found: ID does not exist" Dec 06 06:01:10 crc kubenswrapper[4733]: I1206 06:01:10.747081 4733 scope.go:117] "RemoveContainer" containerID="baa7bc5dd7d95261eb369128ab23592953af6691f118cf3d522c4e366b1b4cf7" Dec 06 06:01:10 crc kubenswrapper[4733]: E1206 06:01:10.747603 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"baa7bc5dd7d95261eb369128ab23592953af6691f118cf3d522c4e366b1b4cf7\": container with ID starting with baa7bc5dd7d95261eb369128ab23592953af6691f118cf3d522c4e366b1b4cf7 not found: ID does not exist" containerID="baa7bc5dd7d95261eb369128ab23592953af6691f118cf3d522c4e366b1b4cf7" Dec 06 06:01:10 crc kubenswrapper[4733]: I1206 06:01:10.747658 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"baa7bc5dd7d95261eb369128ab23592953af6691f118cf3d522c4e366b1b4cf7"} err="failed to get container status \"baa7bc5dd7d95261eb369128ab23592953af6691f118cf3d522c4e366b1b4cf7\": rpc error: code = NotFound desc = could not find container \"baa7bc5dd7d95261eb369128ab23592953af6691f118cf3d522c4e366b1b4cf7\": container with ID starting with baa7bc5dd7d95261eb369128ab23592953af6691f118cf3d522c4e366b1b4cf7 not found: ID does not 
exist" Dec 06 06:01:10 crc kubenswrapper[4733]: I1206 06:01:10.747691 4733 scope.go:117] "RemoveContainer" containerID="f85149eb08adece19573229d1ab4f9ae859cd47d30d89fa7f4d446361d48fa00" Dec 06 06:01:10 crc kubenswrapper[4733]: E1206 06:01:10.748043 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f85149eb08adece19573229d1ab4f9ae859cd47d30d89fa7f4d446361d48fa00\": container with ID starting with f85149eb08adece19573229d1ab4f9ae859cd47d30d89fa7f4d446361d48fa00 not found: ID does not exist" containerID="f85149eb08adece19573229d1ab4f9ae859cd47d30d89fa7f4d446361d48fa00" Dec 06 06:01:10 crc kubenswrapper[4733]: I1206 06:01:10.748076 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f85149eb08adece19573229d1ab4f9ae859cd47d30d89fa7f4d446361d48fa00"} err="failed to get container status \"f85149eb08adece19573229d1ab4f9ae859cd47d30d89fa7f4d446361d48fa00\": rpc error: code = NotFound desc = could not find container \"f85149eb08adece19573229d1ab4f9ae859cd47d30d89fa7f4d446361d48fa00\": container with ID starting with f85149eb08adece19573229d1ab4f9ae859cd47d30d89fa7f4d446361d48fa00 not found: ID does not exist" Dec 06 06:01:10 crc kubenswrapper[4733]: I1206 06:01:10.969501 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 06:01:10 crc kubenswrapper[4733]: I1206 06:01:10.977511 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 06 06:01:10 crc kubenswrapper[4733]: I1206 06:01:10.989787 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 06 06:01:10 crc kubenswrapper[4733]: E1206 06:01:10.990166 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="487a854c-dc9b-430b-b00f-ff16ec35d999" containerName="proxy-httpd" Dec 06 06:01:10 crc kubenswrapper[4733]: I1206 06:01:10.990186 4733 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="487a854c-dc9b-430b-b00f-ff16ec35d999" containerName="proxy-httpd" Dec 06 06:01:10 crc kubenswrapper[4733]: E1206 06:01:10.990217 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="487a854c-dc9b-430b-b00f-ff16ec35d999" containerName="ceilometer-central-agent" Dec 06 06:01:10 crc kubenswrapper[4733]: I1206 06:01:10.990224 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="487a854c-dc9b-430b-b00f-ff16ec35d999" containerName="ceilometer-central-agent" Dec 06 06:01:10 crc kubenswrapper[4733]: E1206 06:01:10.990247 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="487a854c-dc9b-430b-b00f-ff16ec35d999" containerName="sg-core" Dec 06 06:01:10 crc kubenswrapper[4733]: I1206 06:01:10.990253 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="487a854c-dc9b-430b-b00f-ff16ec35d999" containerName="sg-core" Dec 06 06:01:10 crc kubenswrapper[4733]: E1206 06:01:10.990259 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="487a854c-dc9b-430b-b00f-ff16ec35d999" containerName="ceilometer-notification-agent" Dec 06 06:01:10 crc kubenswrapper[4733]: I1206 06:01:10.990266 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="487a854c-dc9b-430b-b00f-ff16ec35d999" containerName="ceilometer-notification-agent" Dec 06 06:01:10 crc kubenswrapper[4733]: I1206 06:01:10.990522 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="487a854c-dc9b-430b-b00f-ff16ec35d999" containerName="ceilometer-central-agent" Dec 06 06:01:10 crc kubenswrapper[4733]: I1206 06:01:10.990542 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="487a854c-dc9b-430b-b00f-ff16ec35d999" containerName="proxy-httpd" Dec 06 06:01:10 crc kubenswrapper[4733]: I1206 06:01:10.990554 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="487a854c-dc9b-430b-b00f-ff16ec35d999" containerName="ceilometer-notification-agent" Dec 06 06:01:10 crc kubenswrapper[4733]: I1206 06:01:10.990567 4733 
memory_manager.go:354] "RemoveStaleState removing state" podUID="487a854c-dc9b-430b-b00f-ff16ec35d999" containerName="sg-core" Dec 06 06:01:10 crc kubenswrapper[4733]: I1206 06:01:10.992120 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 06:01:10 crc kubenswrapper[4733]: I1206 06:01:10.993810 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 06 06:01:10 crc kubenswrapper[4733]: I1206 06:01:10.994503 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 06 06:01:10 crc kubenswrapper[4733]: I1206 06:01:10.997534 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 06:01:11 crc kubenswrapper[4733]: I1206 06:01:11.086824 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/78f90d56-a644-41b1-96e1-444f2e9f33a6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"78f90d56-a644-41b1-96e1-444f2e9f33a6\") " pod="openstack/ceilometer-0" Dec 06 06:01:11 crc kubenswrapper[4733]: I1206 06:01:11.087217 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78f90d56-a644-41b1-96e1-444f2e9f33a6-scripts\") pod \"ceilometer-0\" (UID: \"78f90d56-a644-41b1-96e1-444f2e9f33a6\") " pod="openstack/ceilometer-0" Dec 06 06:01:11 crc kubenswrapper[4733]: I1206 06:01:11.087357 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78f90d56-a644-41b1-96e1-444f2e9f33a6-log-httpd\") pod \"ceilometer-0\" (UID: \"78f90d56-a644-41b1-96e1-444f2e9f33a6\") " pod="openstack/ceilometer-0" Dec 06 06:01:11 crc kubenswrapper[4733]: I1206 06:01:11.087544 4733 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78f90d56-a644-41b1-96e1-444f2e9f33a6-run-httpd\") pod \"ceilometer-0\" (UID: \"78f90d56-a644-41b1-96e1-444f2e9f33a6\") " pod="openstack/ceilometer-0" Dec 06 06:01:11 crc kubenswrapper[4733]: I1206 06:01:11.087647 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78f90d56-a644-41b1-96e1-444f2e9f33a6-config-data\") pod \"ceilometer-0\" (UID: \"78f90d56-a644-41b1-96e1-444f2e9f33a6\") " pod="openstack/ceilometer-0" Dec 06 06:01:11 crc kubenswrapper[4733]: I1206 06:01:11.087724 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78f90d56-a644-41b1-96e1-444f2e9f33a6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"78f90d56-a644-41b1-96e1-444f2e9f33a6\") " pod="openstack/ceilometer-0" Dec 06 06:01:11 crc kubenswrapper[4733]: I1206 06:01:11.087761 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxzk2\" (UniqueName: \"kubernetes.io/projected/78f90d56-a644-41b1-96e1-444f2e9f33a6-kube-api-access-kxzk2\") pod \"ceilometer-0\" (UID: \"78f90d56-a644-41b1-96e1-444f2e9f33a6\") " pod="openstack/ceilometer-0" Dec 06 06:01:11 crc kubenswrapper[4733]: I1206 06:01:11.190463 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78f90d56-a644-41b1-96e1-444f2e9f33a6-log-httpd\") pod \"ceilometer-0\" (UID: \"78f90d56-a644-41b1-96e1-444f2e9f33a6\") " pod="openstack/ceilometer-0" Dec 06 06:01:11 crc kubenswrapper[4733]: I1206 06:01:11.190528 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/78f90d56-a644-41b1-96e1-444f2e9f33a6-run-httpd\") pod \"ceilometer-0\" (UID: \"78f90d56-a644-41b1-96e1-444f2e9f33a6\") " pod="openstack/ceilometer-0" Dec 06 06:01:11 crc kubenswrapper[4733]: I1206 06:01:11.190566 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78f90d56-a644-41b1-96e1-444f2e9f33a6-config-data\") pod \"ceilometer-0\" (UID: \"78f90d56-a644-41b1-96e1-444f2e9f33a6\") " pod="openstack/ceilometer-0" Dec 06 06:01:11 crc kubenswrapper[4733]: I1206 06:01:11.190596 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78f90d56-a644-41b1-96e1-444f2e9f33a6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"78f90d56-a644-41b1-96e1-444f2e9f33a6\") " pod="openstack/ceilometer-0" Dec 06 06:01:11 crc kubenswrapper[4733]: I1206 06:01:11.190616 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxzk2\" (UniqueName: \"kubernetes.io/projected/78f90d56-a644-41b1-96e1-444f2e9f33a6-kube-api-access-kxzk2\") pod \"ceilometer-0\" (UID: \"78f90d56-a644-41b1-96e1-444f2e9f33a6\") " pod="openstack/ceilometer-0" Dec 06 06:01:11 crc kubenswrapper[4733]: I1206 06:01:11.190660 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/78f90d56-a644-41b1-96e1-444f2e9f33a6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"78f90d56-a644-41b1-96e1-444f2e9f33a6\") " pod="openstack/ceilometer-0" Dec 06 06:01:11 crc kubenswrapper[4733]: I1206 06:01:11.190700 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78f90d56-a644-41b1-96e1-444f2e9f33a6-scripts\") pod \"ceilometer-0\" (UID: \"78f90d56-a644-41b1-96e1-444f2e9f33a6\") " pod="openstack/ceilometer-0" Dec 06 06:01:11 crc kubenswrapper[4733]: 
I1206 06:01:11.191221 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78f90d56-a644-41b1-96e1-444f2e9f33a6-run-httpd\") pod \"ceilometer-0\" (UID: \"78f90d56-a644-41b1-96e1-444f2e9f33a6\") " pod="openstack/ceilometer-0" Dec 06 06:01:11 crc kubenswrapper[4733]: I1206 06:01:11.191894 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78f90d56-a644-41b1-96e1-444f2e9f33a6-log-httpd\") pod \"ceilometer-0\" (UID: \"78f90d56-a644-41b1-96e1-444f2e9f33a6\") " pod="openstack/ceilometer-0" Dec 06 06:01:11 crc kubenswrapper[4733]: I1206 06:01:11.196162 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/78f90d56-a644-41b1-96e1-444f2e9f33a6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"78f90d56-a644-41b1-96e1-444f2e9f33a6\") " pod="openstack/ceilometer-0" Dec 06 06:01:11 crc kubenswrapper[4733]: I1206 06:01:11.196189 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78f90d56-a644-41b1-96e1-444f2e9f33a6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"78f90d56-a644-41b1-96e1-444f2e9f33a6\") " pod="openstack/ceilometer-0" Dec 06 06:01:11 crc kubenswrapper[4733]: I1206 06:01:11.196508 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78f90d56-a644-41b1-96e1-444f2e9f33a6-config-data\") pod \"ceilometer-0\" (UID: \"78f90d56-a644-41b1-96e1-444f2e9f33a6\") " pod="openstack/ceilometer-0" Dec 06 06:01:11 crc kubenswrapper[4733]: I1206 06:01:11.196712 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78f90d56-a644-41b1-96e1-444f2e9f33a6-scripts\") pod \"ceilometer-0\" (UID: \"78f90d56-a644-41b1-96e1-444f2e9f33a6\") " 
pod="openstack/ceilometer-0" Dec 06 06:01:11 crc kubenswrapper[4733]: I1206 06:01:11.204573 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxzk2\" (UniqueName: \"kubernetes.io/projected/78f90d56-a644-41b1-96e1-444f2e9f33a6-kube-api-access-kxzk2\") pod \"ceilometer-0\" (UID: \"78f90d56-a644-41b1-96e1-444f2e9f33a6\") " pod="openstack/ceilometer-0" Dec 06 06:01:11 crc kubenswrapper[4733]: I1206 06:01:11.308611 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 06:01:11 crc kubenswrapper[4733]: I1206 06:01:11.717983 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 06:01:11 crc kubenswrapper[4733]: W1206 06:01:11.730414 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod78f90d56_a644_41b1_96e1_444f2e9f33a6.slice/crio-e65647ca881ec1c92d6b3cb1e97151d57dda6174a94070876c1a17a3701750b6 WatchSource:0}: Error finding container e65647ca881ec1c92d6b3cb1e97151d57dda6174a94070876c1a17a3701750b6: Status 404 returned error can't find the container with id e65647ca881ec1c92d6b3cb1e97151d57dda6174a94070876c1a17a3701750b6 Dec 06 06:01:12 crc kubenswrapper[4733]: I1206 06:01:12.496730 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="487a854c-dc9b-430b-b00f-ff16ec35d999" path="/var/lib/kubelet/pods/487a854c-dc9b-430b-b00f-ff16ec35d999/volumes" Dec 06 06:01:12 crc kubenswrapper[4733]: I1206 06:01:12.666968 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"78f90d56-a644-41b1-96e1-444f2e9f33a6","Type":"ContainerStarted","Data":"471ad16305ff7dd11d6ff8a1300f9ebb587bbd5a2add173f27c850452b80fbca"} Dec 06 06:01:12 crc kubenswrapper[4733]: I1206 06:01:12.667018 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"78f90d56-a644-41b1-96e1-444f2e9f33a6","Type":"ContainerStarted","Data":"e65647ca881ec1c92d6b3cb1e97151d57dda6174a94070876c1a17a3701750b6"} Dec 06 06:01:12 crc kubenswrapper[4733]: I1206 06:01:12.989148 4733 patch_prober.go:28] interesting pod/machine-config-daemon-g7qjx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 06:01:12 crc kubenswrapper[4733]: I1206 06:01:12.989512 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 06:01:12 crc kubenswrapper[4733]: I1206 06:01:12.989569 4733 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" Dec 06 06:01:12 crc kubenswrapper[4733]: I1206 06:01:12.990455 4733 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6cf0b6c52f78a1f3c9cd0937561802a5aad13c9f84f0305358100261c2849c9f"} pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 06:01:12 crc kubenswrapper[4733]: I1206 06:01:12.990525 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerName="machine-config-daemon" containerID="cri-o://6cf0b6c52f78a1f3c9cd0937561802a5aad13c9f84f0305358100261c2849c9f" gracePeriod=600 Dec 06 06:01:13 crc kubenswrapper[4733]: I1206 06:01:13.690615 
4733 generic.go:334] "Generic (PLEG): container finished" podID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerID="6cf0b6c52f78a1f3c9cd0937561802a5aad13c9f84f0305358100261c2849c9f" exitCode=0 Dec 06 06:01:13 crc kubenswrapper[4733]: I1206 06:01:13.690677 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" event={"ID":"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448","Type":"ContainerDied","Data":"6cf0b6c52f78a1f3c9cd0937561802a5aad13c9f84f0305358100261c2849c9f"} Dec 06 06:01:13 crc kubenswrapper[4733]: I1206 06:01:13.691403 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" event={"ID":"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448","Type":"ContainerStarted","Data":"8b9c1cf2e876683db3ffd82323d6fa24cf8a792c4bcd35bbf88bc00105165298"} Dec 06 06:01:13 crc kubenswrapper[4733]: I1206 06:01:13.691452 4733 scope.go:117] "RemoveContainer" containerID="b3765b8a99d4ffd713a8095a13f219f1dd20e90b6c9d92ac7d89fa928662bfb0" Dec 06 06:01:13 crc kubenswrapper[4733]: I1206 06:01:13.695459 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"78f90d56-a644-41b1-96e1-444f2e9f33a6","Type":"ContainerStarted","Data":"6442d8f4872c0128cf51b414602a7ab9a38b80ac9f07449e578729732a454d60"} Dec 06 06:01:14 crc kubenswrapper[4733]: I1206 06:01:14.136345 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 06 06:01:14 crc kubenswrapper[4733]: I1206 06:01:14.136401 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 06 06:01:14 crc kubenswrapper[4733]: I1206 06:01:14.168247 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 06 06:01:14 crc kubenswrapper[4733]: I1206 06:01:14.176490 4733 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 06 06:01:14 crc kubenswrapper[4733]: I1206 06:01:14.713973 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 06 06:01:14 crc kubenswrapper[4733]: I1206 06:01:14.715033 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 06 06:01:15 crc kubenswrapper[4733]: I1206 06:01:15.925813 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 06 06:01:15 crc kubenswrapper[4733]: I1206 06:01:15.926101 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 06 06:01:15 crc kubenswrapper[4733]: I1206 06:01:15.955915 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 06 06:01:15 crc kubenswrapper[4733]: I1206 06:01:15.959320 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 06 06:01:16 crc kubenswrapper[4733]: I1206 06:01:16.283504 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 06 06:01:16 crc kubenswrapper[4733]: I1206 06:01:16.298376 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 06 06:01:16 crc kubenswrapper[4733]: I1206 06:01:16.733097 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 06 06:01:16 crc kubenswrapper[4733]: I1206 06:01:16.733143 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 06 06:01:18 crc kubenswrapper[4733]: I1206 06:01:18.321168 4733 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 06 06:01:18 crc kubenswrapper[4733]: I1206 06:01:18.325359 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 06 06:01:19 crc kubenswrapper[4733]: I1206 06:01:19.763466 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-gn769" event={"ID":"8505186d-6d93-4df5-8f82-4b7747029d79","Type":"ContainerStarted","Data":"bb806fa724e81e863e82b0d9a73d29bde6697f64a38350cb9df031d384a0183e"} Dec 06 06:01:19 crc kubenswrapper[4733]: I1206 06:01:19.767155 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"78f90d56-a644-41b1-96e1-444f2e9f33a6","Type":"ContainerStarted","Data":"028d82c407eb17260f64bccee9790bc25eb726d59b9dfe842c91fc41fe1e4907"} Dec 06 06:01:19 crc kubenswrapper[4733]: I1206 06:01:19.785962 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-gn769" podStartSLOduration=2.451422237 podStartE2EDuration="14.785946036s" podCreationTimestamp="2025-12-06 06:01:05 +0000 UTC" firstStartedPulling="2025-12-06 06:01:06.642286987 +0000 UTC m=+1050.507498098" lastFinishedPulling="2025-12-06 06:01:18.976810786 +0000 UTC m=+1062.842021897" observedRunningTime="2025-12-06 06:01:19.783111165 +0000 UTC m=+1063.648322276" watchObservedRunningTime="2025-12-06 06:01:19.785946036 +0000 UTC m=+1063.651157147" Dec 06 06:01:20 crc kubenswrapper[4733]: I1206 06:01:20.780417 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"78f90d56-a644-41b1-96e1-444f2e9f33a6","Type":"ContainerStarted","Data":"d8b41a540f8399adb498958e09f5d792f073040f62e19c6e222cec1ffecdcb1d"} Dec 06 06:01:20 crc kubenswrapper[4733]: I1206 06:01:20.780948 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 06 06:01:20 
crc kubenswrapper[4733]: I1206 06:01:20.807450 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.262791997 podStartE2EDuration="10.807433073s" podCreationTimestamp="2025-12-06 06:01:10 +0000 UTC" firstStartedPulling="2025-12-06 06:01:11.733350066 +0000 UTC m=+1055.598561178" lastFinishedPulling="2025-12-06 06:01:20.277991143 +0000 UTC m=+1064.143202254" observedRunningTime="2025-12-06 06:01:20.798845615 +0000 UTC m=+1064.664056726" watchObservedRunningTime="2025-12-06 06:01:20.807433073 +0000 UTC m=+1064.672644184" Dec 06 06:01:25 crc kubenswrapper[4733]: I1206 06:01:25.826571 4733 generic.go:334] "Generic (PLEG): container finished" podID="8505186d-6d93-4df5-8f82-4b7747029d79" containerID="bb806fa724e81e863e82b0d9a73d29bde6697f64a38350cb9df031d384a0183e" exitCode=0 Dec 06 06:01:25 crc kubenswrapper[4733]: I1206 06:01:25.826659 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-gn769" event={"ID":"8505186d-6d93-4df5-8f82-4b7747029d79","Type":"ContainerDied","Data":"bb806fa724e81e863e82b0d9a73d29bde6697f64a38350cb9df031d384a0183e"} Dec 06 06:01:27 crc kubenswrapper[4733]: I1206 06:01:27.136525 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-gn769" Dec 06 06:01:27 crc kubenswrapper[4733]: I1206 06:01:27.187041 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8505186d-6d93-4df5-8f82-4b7747029d79-combined-ca-bundle\") pod \"8505186d-6d93-4df5-8f82-4b7747029d79\" (UID: \"8505186d-6d93-4df5-8f82-4b7747029d79\") " Dec 06 06:01:27 crc kubenswrapper[4733]: I1206 06:01:27.187290 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8505186d-6d93-4df5-8f82-4b7747029d79-config-data\") pod \"8505186d-6d93-4df5-8f82-4b7747029d79\" (UID: \"8505186d-6d93-4df5-8f82-4b7747029d79\") " Dec 06 06:01:27 crc kubenswrapper[4733]: I1206 06:01:27.187456 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2kgd\" (UniqueName: \"kubernetes.io/projected/8505186d-6d93-4df5-8f82-4b7747029d79-kube-api-access-s2kgd\") pod \"8505186d-6d93-4df5-8f82-4b7747029d79\" (UID: \"8505186d-6d93-4df5-8f82-4b7747029d79\") " Dec 06 06:01:27 crc kubenswrapper[4733]: I1206 06:01:27.187698 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8505186d-6d93-4df5-8f82-4b7747029d79-scripts\") pod \"8505186d-6d93-4df5-8f82-4b7747029d79\" (UID: \"8505186d-6d93-4df5-8f82-4b7747029d79\") " Dec 06 06:01:27 crc kubenswrapper[4733]: I1206 06:01:27.195076 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8505186d-6d93-4df5-8f82-4b7747029d79-kube-api-access-s2kgd" (OuterVolumeSpecName: "kube-api-access-s2kgd") pod "8505186d-6d93-4df5-8f82-4b7747029d79" (UID: "8505186d-6d93-4df5-8f82-4b7747029d79"). InnerVolumeSpecName "kube-api-access-s2kgd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:01:27 crc kubenswrapper[4733]: I1206 06:01:27.195629 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8505186d-6d93-4df5-8f82-4b7747029d79-scripts" (OuterVolumeSpecName: "scripts") pod "8505186d-6d93-4df5-8f82-4b7747029d79" (UID: "8505186d-6d93-4df5-8f82-4b7747029d79"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:01:27 crc kubenswrapper[4733]: I1206 06:01:27.215702 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8505186d-6d93-4df5-8f82-4b7747029d79-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8505186d-6d93-4df5-8f82-4b7747029d79" (UID: "8505186d-6d93-4df5-8f82-4b7747029d79"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:01:27 crc kubenswrapper[4733]: I1206 06:01:27.226164 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8505186d-6d93-4df5-8f82-4b7747029d79-config-data" (OuterVolumeSpecName: "config-data") pod "8505186d-6d93-4df5-8f82-4b7747029d79" (UID: "8505186d-6d93-4df5-8f82-4b7747029d79"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:01:27 crc kubenswrapper[4733]: I1206 06:01:27.290576 4733 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8505186d-6d93-4df5-8f82-4b7747029d79-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:27 crc kubenswrapper[4733]: I1206 06:01:27.290610 4733 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8505186d-6d93-4df5-8f82-4b7747029d79-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:27 crc kubenswrapper[4733]: I1206 06:01:27.290623 4733 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8505186d-6d93-4df5-8f82-4b7747029d79-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:27 crc kubenswrapper[4733]: I1206 06:01:27.290635 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2kgd\" (UniqueName: \"kubernetes.io/projected/8505186d-6d93-4df5-8f82-4b7747029d79-kube-api-access-s2kgd\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:27 crc kubenswrapper[4733]: I1206 06:01:27.848456 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-gn769" event={"ID":"8505186d-6d93-4df5-8f82-4b7747029d79","Type":"ContainerDied","Data":"199940e2a26f3715198cac89139fcd4beef52543c22af1a8038f3a06928b9e03"} Dec 06 06:01:27 crc kubenswrapper[4733]: I1206 06:01:27.848523 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="199940e2a26f3715198cac89139fcd4beef52543c22af1a8038f3a06928b9e03" Dec 06 06:01:27 crc kubenswrapper[4733]: I1206 06:01:27.848555 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-gn769" Dec 06 06:01:27 crc kubenswrapper[4733]: I1206 06:01:27.944826 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 06 06:01:27 crc kubenswrapper[4733]: E1206 06:01:27.945429 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8505186d-6d93-4df5-8f82-4b7747029d79" containerName="nova-cell0-conductor-db-sync" Dec 06 06:01:27 crc kubenswrapper[4733]: I1206 06:01:27.945460 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="8505186d-6d93-4df5-8f82-4b7747029d79" containerName="nova-cell0-conductor-db-sync" Dec 06 06:01:27 crc kubenswrapper[4733]: I1206 06:01:27.945741 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="8505186d-6d93-4df5-8f82-4b7747029d79" containerName="nova-cell0-conductor-db-sync" Dec 06 06:01:27 crc kubenswrapper[4733]: I1206 06:01:27.946604 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 06 06:01:27 crc kubenswrapper[4733]: I1206 06:01:27.949330 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-vdgq5" Dec 06 06:01:27 crc kubenswrapper[4733]: I1206 06:01:27.949580 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 06 06:01:27 crc kubenswrapper[4733]: I1206 06:01:27.963093 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 06 06:01:28 crc kubenswrapper[4733]: I1206 06:01:28.003778 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0f33a48-03f3-4580-8cfb-e6cc7d720ba4-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"c0f33a48-03f3-4580-8cfb-e6cc7d720ba4\") " pod="openstack/nova-cell0-conductor-0" Dec 06 06:01:28 crc kubenswrapper[4733]: 
I1206 06:01:28.004038 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2gwz\" (UniqueName: \"kubernetes.io/projected/c0f33a48-03f3-4580-8cfb-e6cc7d720ba4-kube-api-access-x2gwz\") pod \"nova-cell0-conductor-0\" (UID: \"c0f33a48-03f3-4580-8cfb-e6cc7d720ba4\") " pod="openstack/nova-cell0-conductor-0" Dec 06 06:01:28 crc kubenswrapper[4733]: I1206 06:01:28.004605 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0f33a48-03f3-4580-8cfb-e6cc7d720ba4-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"c0f33a48-03f3-4580-8cfb-e6cc7d720ba4\") " pod="openstack/nova-cell0-conductor-0" Dec 06 06:01:28 crc kubenswrapper[4733]: I1206 06:01:28.105740 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2gwz\" (UniqueName: \"kubernetes.io/projected/c0f33a48-03f3-4580-8cfb-e6cc7d720ba4-kube-api-access-x2gwz\") pod \"nova-cell0-conductor-0\" (UID: \"c0f33a48-03f3-4580-8cfb-e6cc7d720ba4\") " pod="openstack/nova-cell0-conductor-0" Dec 06 06:01:28 crc kubenswrapper[4733]: I1206 06:01:28.105907 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0f33a48-03f3-4580-8cfb-e6cc7d720ba4-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"c0f33a48-03f3-4580-8cfb-e6cc7d720ba4\") " pod="openstack/nova-cell0-conductor-0" Dec 06 06:01:28 crc kubenswrapper[4733]: I1206 06:01:28.105964 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0f33a48-03f3-4580-8cfb-e6cc7d720ba4-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"c0f33a48-03f3-4580-8cfb-e6cc7d720ba4\") " pod="openstack/nova-cell0-conductor-0" Dec 06 06:01:28 crc kubenswrapper[4733]: I1206 06:01:28.112577 4733 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0f33a48-03f3-4580-8cfb-e6cc7d720ba4-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"c0f33a48-03f3-4580-8cfb-e6cc7d720ba4\") " pod="openstack/nova-cell0-conductor-0"
Dec 06 06:01:28 crc kubenswrapper[4733]: I1206 06:01:28.112610 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0f33a48-03f3-4580-8cfb-e6cc7d720ba4-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"c0f33a48-03f3-4580-8cfb-e6cc7d720ba4\") " pod="openstack/nova-cell0-conductor-0"
Dec 06 06:01:28 crc kubenswrapper[4733]: I1206 06:01:28.119519 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2gwz\" (UniqueName: \"kubernetes.io/projected/c0f33a48-03f3-4580-8cfb-e6cc7d720ba4-kube-api-access-x2gwz\") pod \"nova-cell0-conductor-0\" (UID: \"c0f33a48-03f3-4580-8cfb-e6cc7d720ba4\") " pod="openstack/nova-cell0-conductor-0"
Dec 06 06:01:28 crc kubenswrapper[4733]: I1206 06:01:28.264087 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Dec 06 06:01:28 crc kubenswrapper[4733]: I1206 06:01:28.668899 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Dec 06 06:01:28 crc kubenswrapper[4733]: W1206 06:01:28.670477 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc0f33a48_03f3_4580_8cfb_e6cc7d720ba4.slice/crio-7a24e8700b0a3bde663b2223c8545da7c65e794eff8c4f276960ec586f4bd4bb WatchSource:0}: Error finding container 7a24e8700b0a3bde663b2223c8545da7c65e794eff8c4f276960ec586f4bd4bb: Status 404 returned error can't find the container with id 7a24e8700b0a3bde663b2223c8545da7c65e794eff8c4f276960ec586f4bd4bb
Dec 06 06:01:28 crc kubenswrapper[4733]: I1206 06:01:28.859264 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"c0f33a48-03f3-4580-8cfb-e6cc7d720ba4","Type":"ContainerStarted","Data":"71d7c26992f17f62b1ec775ee4f1dfac3ea8a11aa66c55984f5307a32325ce31"}
Dec 06 06:01:28 crc kubenswrapper[4733]: I1206 06:01:28.859323 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"c0f33a48-03f3-4580-8cfb-e6cc7d720ba4","Type":"ContainerStarted","Data":"7a24e8700b0a3bde663b2223c8545da7c65e794eff8c4f276960ec586f4bd4bb"}
Dec 06 06:01:28 crc kubenswrapper[4733]: I1206 06:01:28.859433 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Dec 06 06:01:28 crc kubenswrapper[4733]: I1206 06:01:28.875226 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=1.875211883 podStartE2EDuration="1.875211883s" podCreationTimestamp="2025-12-06 06:01:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:01:28.86921653 +0000 UTC m=+1072.734427641" watchObservedRunningTime="2025-12-06 06:01:28.875211883 +0000 UTC m=+1072.740422994"
Dec 06 06:01:33 crc kubenswrapper[4733]: I1206 06:01:33.284934 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0"
Dec 06 06:01:33 crc kubenswrapper[4733]: I1206 06:01:33.760152 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-t7j5b"]
Dec 06 06:01:33 crc kubenswrapper[4733]: I1206 06:01:33.761256 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-t7j5b"
Dec 06 06:01:33 crc kubenswrapper[4733]: I1206 06:01:33.764462 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data"
Dec 06 06:01:33 crc kubenswrapper[4733]: I1206 06:01:33.765591 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts"
Dec 06 06:01:33 crc kubenswrapper[4733]: I1206 06:01:33.770364 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-t7j5b"]
Dec 06 06:01:33 crc kubenswrapper[4733]: I1206 06:01:33.804504 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4w7b\" (UniqueName: \"kubernetes.io/projected/f3ff2c7d-6450-448f-8824-36d2b8ea0710-kube-api-access-v4w7b\") pod \"nova-cell0-cell-mapping-t7j5b\" (UID: \"f3ff2c7d-6450-448f-8824-36d2b8ea0710\") " pod="openstack/nova-cell0-cell-mapping-t7j5b"
Dec 06 06:01:33 crc kubenswrapper[4733]: I1206 06:01:33.804597 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3ff2c7d-6450-448f-8824-36d2b8ea0710-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-t7j5b\" (UID: \"f3ff2c7d-6450-448f-8824-36d2b8ea0710\") " pod="openstack/nova-cell0-cell-mapping-t7j5b"
Dec 06 06:01:33 crc kubenswrapper[4733]: I1206 06:01:33.804659 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3ff2c7d-6450-448f-8824-36d2b8ea0710-scripts\") pod \"nova-cell0-cell-mapping-t7j5b\" (UID: \"f3ff2c7d-6450-448f-8824-36d2b8ea0710\") " pod="openstack/nova-cell0-cell-mapping-t7j5b"
Dec 06 06:01:33 crc kubenswrapper[4733]: I1206 06:01:33.804860 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3ff2c7d-6450-448f-8824-36d2b8ea0710-config-data\") pod \"nova-cell0-cell-mapping-t7j5b\" (UID: \"f3ff2c7d-6450-448f-8824-36d2b8ea0710\") " pod="openstack/nova-cell0-cell-mapping-t7j5b"
Dec 06 06:01:33 crc kubenswrapper[4733]: I1206 06:01:33.859694 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Dec 06 06:01:33 crc kubenswrapper[4733]: I1206 06:01:33.861314 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 06 06:01:33 crc kubenswrapper[4733]: I1206 06:01:33.863324 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Dec 06 06:01:33 crc kubenswrapper[4733]: I1206 06:01:33.870719 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 06 06:01:33 crc kubenswrapper[4733]: I1206 06:01:33.906618 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4w7b\" (UniqueName: \"kubernetes.io/projected/f3ff2c7d-6450-448f-8824-36d2b8ea0710-kube-api-access-v4w7b\") pod \"nova-cell0-cell-mapping-t7j5b\" (UID: \"f3ff2c7d-6450-448f-8824-36d2b8ea0710\") " pod="openstack/nova-cell0-cell-mapping-t7j5b"
Dec 06 06:01:33 crc kubenswrapper[4733]: I1206 06:01:33.906685 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3ff2c7d-6450-448f-8824-36d2b8ea0710-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-t7j5b\" (UID: \"f3ff2c7d-6450-448f-8824-36d2b8ea0710\") " pod="openstack/nova-cell0-cell-mapping-t7j5b"
Dec 06 06:01:33 crc kubenswrapper[4733]: I1206 06:01:33.906722 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e9b28fd-e646-4bb8-8d79-bba2898da8f3-config-data\") pod \"nova-api-0\" (UID: \"2e9b28fd-e646-4bb8-8d79-bba2898da8f3\") " pod="openstack/nova-api-0"
Dec 06 06:01:33 crc kubenswrapper[4733]: I1206 06:01:33.906749 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3ff2c7d-6450-448f-8824-36d2b8ea0710-scripts\") pod \"nova-cell0-cell-mapping-t7j5b\" (UID: \"f3ff2c7d-6450-448f-8824-36d2b8ea0710\") " pod="openstack/nova-cell0-cell-mapping-t7j5b"
Dec 06 06:01:33 crc kubenswrapper[4733]: I1206 06:01:33.906772 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e9b28fd-e646-4bb8-8d79-bba2898da8f3-logs\") pod \"nova-api-0\" (UID: \"2e9b28fd-e646-4bb8-8d79-bba2898da8f3\") " pod="openstack/nova-api-0"
Dec 06 06:01:33 crc kubenswrapper[4733]: I1206 06:01:33.906795 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsd7m\" (UniqueName: \"kubernetes.io/projected/2e9b28fd-e646-4bb8-8d79-bba2898da8f3-kube-api-access-wsd7m\") pod \"nova-api-0\" (UID: \"2e9b28fd-e646-4bb8-8d79-bba2898da8f3\") " pod="openstack/nova-api-0"
Dec 06 06:01:33 crc kubenswrapper[4733]: I1206 06:01:33.906859 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3ff2c7d-6450-448f-8824-36d2b8ea0710-config-data\") pod \"nova-cell0-cell-mapping-t7j5b\" (UID: \"f3ff2c7d-6450-448f-8824-36d2b8ea0710\") " pod="openstack/nova-cell0-cell-mapping-t7j5b"
Dec 06 06:01:33 crc kubenswrapper[4733]: I1206 06:01:33.906967 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e9b28fd-e646-4bb8-8d79-bba2898da8f3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2e9b28fd-e646-4bb8-8d79-bba2898da8f3\") " pod="openstack/nova-api-0"
Dec 06 06:01:33 crc kubenswrapper[4733]: I1206 06:01:33.913488 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3ff2c7d-6450-448f-8824-36d2b8ea0710-scripts\") pod \"nova-cell0-cell-mapping-t7j5b\" (UID: \"f3ff2c7d-6450-448f-8824-36d2b8ea0710\") " pod="openstack/nova-cell0-cell-mapping-t7j5b"
Dec 06 06:01:33 crc kubenswrapper[4733]: I1206 06:01:33.915358 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3ff2c7d-6450-448f-8824-36d2b8ea0710-config-data\") pod \"nova-cell0-cell-mapping-t7j5b\" (UID: \"f3ff2c7d-6450-448f-8824-36d2b8ea0710\") " pod="openstack/nova-cell0-cell-mapping-t7j5b"
Dec 06 06:01:33 crc kubenswrapper[4733]: I1206 06:01:33.929029 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3ff2c7d-6450-448f-8824-36d2b8ea0710-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-t7j5b\" (UID: \"f3ff2c7d-6450-448f-8824-36d2b8ea0710\") " pod="openstack/nova-cell0-cell-mapping-t7j5b"
Dec 06 06:01:33 crc kubenswrapper[4733]: I1206 06:01:33.938953 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4w7b\" (UniqueName: \"kubernetes.io/projected/f3ff2c7d-6450-448f-8824-36d2b8ea0710-kube-api-access-v4w7b\") pod \"nova-cell0-cell-mapping-t7j5b\" (UID: \"f3ff2c7d-6450-448f-8824-36d2b8ea0710\") " pod="openstack/nova-cell0-cell-mapping-t7j5b"
Dec 06 06:01:33 crc kubenswrapper[4733]: I1206 06:01:33.946065 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 06 06:01:33 crc kubenswrapper[4733]: I1206 06:01:33.947497 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Dec 06 06:01:33 crc kubenswrapper[4733]: I1206 06:01:33.949242 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Dec 06 06:01:33 crc kubenswrapper[4733]: I1206 06:01:33.958923 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Dec 06 06:01:33 crc kubenswrapper[4733]: I1206 06:01:33.960102 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 06 06:01:33 crc kubenswrapper[4733]: I1206 06:01:33.963275 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Dec 06 06:01:33 crc kubenswrapper[4733]: I1206 06:01:33.966885 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:33.999966 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.009391 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5k4f5\" (UniqueName: \"kubernetes.io/projected/9d609f49-867e-48f0-a336-45e16b4b718d-kube-api-access-5k4f5\") pod \"nova-cell1-novncproxy-0\" (UID: \"9d609f49-867e-48f0-a336-45e16b4b718d\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.009458 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e9b28fd-e646-4bb8-8d79-bba2898da8f3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2e9b28fd-e646-4bb8-8d79-bba2898da8f3\") " pod="openstack/nova-api-0"
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.009493 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5nkw\" (UniqueName: \"kubernetes.io/projected/63b655fb-451c-4bc2-ad3e-665b3fdd61b1-kube-api-access-k5nkw\") pod \"nova-scheduler-0\" (UID: \"63b655fb-451c-4bc2-ad3e-665b3fdd61b1\") " pod="openstack/nova-scheduler-0"
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.009586 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63b655fb-451c-4bc2-ad3e-665b3fdd61b1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"63b655fb-451c-4bc2-ad3e-665b3fdd61b1\") " pod="openstack/nova-scheduler-0"
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.009777 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e9b28fd-e646-4bb8-8d79-bba2898da8f3-config-data\") pod \"nova-api-0\" (UID: \"2e9b28fd-e646-4bb8-8d79-bba2898da8f3\") " pod="openstack/nova-api-0"
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.009867 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e9b28fd-e646-4bb8-8d79-bba2898da8f3-logs\") pod \"nova-api-0\" (UID: \"2e9b28fd-e646-4bb8-8d79-bba2898da8f3\") " pod="openstack/nova-api-0"
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.009885 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsd7m\" (UniqueName: \"kubernetes.io/projected/2e9b28fd-e646-4bb8-8d79-bba2898da8f3-kube-api-access-wsd7m\") pod \"nova-api-0\" (UID: \"2e9b28fd-e646-4bb8-8d79-bba2898da8f3\") " pod="openstack/nova-api-0"
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.009935 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d609f49-867e-48f0-a336-45e16b4b718d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"9d609f49-867e-48f0-a336-45e16b4b718d\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.009956 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63b655fb-451c-4bc2-ad3e-665b3fdd61b1-config-data\") pod \"nova-scheduler-0\" (UID: \"63b655fb-451c-4bc2-ad3e-665b3fdd61b1\") " pod="openstack/nova-scheduler-0"
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.010007 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d609f49-867e-48f0-a336-45e16b4b718d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"9d609f49-867e-48f0-a336-45e16b4b718d\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.012461 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e9b28fd-e646-4bb8-8d79-bba2898da8f3-logs\") pod \"nova-api-0\" (UID: \"2e9b28fd-e646-4bb8-8d79-bba2898da8f3\") " pod="openstack/nova-api-0"
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.018887 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e9b28fd-e646-4bb8-8d79-bba2898da8f3-config-data\") pod \"nova-api-0\" (UID: \"2e9b28fd-e646-4bb8-8d79-bba2898da8f3\") " pod="openstack/nova-api-0"
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.018951 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e9b28fd-e646-4bb8-8d79-bba2898da8f3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2e9b28fd-e646-4bb8-8d79-bba2898da8f3\") " pod="openstack/nova-api-0"
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.041287 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.042844 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.046753 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.046862 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsd7m\" (UniqueName: \"kubernetes.io/projected/2e9b28fd-e646-4bb8-8d79-bba2898da8f3-kube-api-access-wsd7m\") pod \"nova-api-0\" (UID: \"2e9b28fd-e646-4bb8-8d79-bba2898da8f3\") " pod="openstack/nova-api-0"
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.066312 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.082722 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-t7j5b"
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.111904 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5k4f5\" (UniqueName: \"kubernetes.io/projected/9d609f49-867e-48f0-a336-45e16b4b718d-kube-api-access-5k4f5\") pod \"nova-cell1-novncproxy-0\" (UID: \"9d609f49-867e-48f0-a336-45e16b4b718d\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.112254 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5nkw\" (UniqueName: \"kubernetes.io/projected/63b655fb-451c-4bc2-ad3e-665b3fdd61b1-kube-api-access-k5nkw\") pod \"nova-scheduler-0\" (UID: \"63b655fb-451c-4bc2-ad3e-665b3fdd61b1\") " pod="openstack/nova-scheduler-0"
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.112299 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cda4590-6b0b-4213-9f39-21056b6bc142-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1cda4590-6b0b-4213-9f39-21056b6bc142\") " pod="openstack/nova-metadata-0"
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.112517 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63b655fb-451c-4bc2-ad3e-665b3fdd61b1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"63b655fb-451c-4bc2-ad3e-665b3fdd61b1\") " pod="openstack/nova-scheduler-0"
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.112570 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1cda4590-6b0b-4213-9f39-21056b6bc142-logs\") pod \"nova-metadata-0\" (UID: \"1cda4590-6b0b-4213-9f39-21056b6bc142\") " pod="openstack/nova-metadata-0"
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.112599 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9vk2\" (UniqueName: \"kubernetes.io/projected/1cda4590-6b0b-4213-9f39-21056b6bc142-kube-api-access-t9vk2\") pod \"nova-metadata-0\" (UID: \"1cda4590-6b0b-4213-9f39-21056b6bc142\") " pod="openstack/nova-metadata-0"
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.112690 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d609f49-867e-48f0-a336-45e16b4b718d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"9d609f49-867e-48f0-a336-45e16b4b718d\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.112708 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63b655fb-451c-4bc2-ad3e-665b3fdd61b1-config-data\") pod \"nova-scheduler-0\" (UID: \"63b655fb-451c-4bc2-ad3e-665b3fdd61b1\") " pod="openstack/nova-scheduler-0"
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.112748 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d609f49-867e-48f0-a336-45e16b4b718d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"9d609f49-867e-48f0-a336-45e16b4b718d\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.112776 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cda4590-6b0b-4213-9f39-21056b6bc142-config-data\") pod \"nova-metadata-0\" (UID: \"1cda4590-6b0b-4213-9f39-21056b6bc142\") " pod="openstack/nova-metadata-0"
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.118463 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63b655fb-451c-4bc2-ad3e-665b3fdd61b1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"63b655fb-451c-4bc2-ad3e-665b3fdd61b1\") " pod="openstack/nova-scheduler-0"
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.129851 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d609f49-867e-48f0-a336-45e16b4b718d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"9d609f49-867e-48f0-a336-45e16b4b718d\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.135205 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63b655fb-451c-4bc2-ad3e-665b3fdd61b1-config-data\") pod \"nova-scheduler-0\" (UID: \"63b655fb-451c-4bc2-ad3e-665b3fdd61b1\") " pod="openstack/nova-scheduler-0"
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.137316 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d609f49-867e-48f0-a336-45e16b4b718d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"9d609f49-867e-48f0-a336-45e16b4b718d\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.140976 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5nkw\" (UniqueName: \"kubernetes.io/projected/63b655fb-451c-4bc2-ad3e-665b3fdd61b1-kube-api-access-k5nkw\") pod \"nova-scheduler-0\" (UID: \"63b655fb-451c-4bc2-ad3e-665b3fdd61b1\") " pod="openstack/nova-scheduler-0"
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.142032 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5k4f5\" (UniqueName: \"kubernetes.io/projected/9d609f49-867e-48f0-a336-45e16b4b718d-kube-api-access-5k4f5\") pod \"nova-cell1-novncproxy-0\" (UID: \"9d609f49-867e-48f0-a336-45e16b4b718d\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.185515 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.188545 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75c5fc6955-c8bh2"]
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.190067 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c5fc6955-c8bh2"
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.214889 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cda4590-6b0b-4213-9f39-21056b6bc142-config-data\") pod \"nova-metadata-0\" (UID: \"1cda4590-6b0b-4213-9f39-21056b6bc142\") " pod="openstack/nova-metadata-0"
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.215007 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cda4590-6b0b-4213-9f39-21056b6bc142-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1cda4590-6b0b-4213-9f39-21056b6bc142\") " pod="openstack/nova-metadata-0"
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.215056 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1cda4590-6b0b-4213-9f39-21056b6bc142-logs\") pod \"nova-metadata-0\" (UID: \"1cda4590-6b0b-4213-9f39-21056b6bc142\") " pod="openstack/nova-metadata-0"
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.215082 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9vk2\" (UniqueName: \"kubernetes.io/projected/1cda4590-6b0b-4213-9f39-21056b6bc142-kube-api-access-t9vk2\") pod \"nova-metadata-0\" (UID: \"1cda4590-6b0b-4213-9f39-21056b6bc142\") " pod="openstack/nova-metadata-0"
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.218604 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1cda4590-6b0b-4213-9f39-21056b6bc142-logs\") pod \"nova-metadata-0\" (UID: \"1cda4590-6b0b-4213-9f39-21056b6bc142\") " pod="openstack/nova-metadata-0"
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.218682 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cda4590-6b0b-4213-9f39-21056b6bc142-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1cda4590-6b0b-4213-9f39-21056b6bc142\") " pod="openstack/nova-metadata-0"
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.220172 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cda4590-6b0b-4213-9f39-21056b6bc142-config-data\") pod \"nova-metadata-0\" (UID: \"1cda4590-6b0b-4213-9f39-21056b6bc142\") " pod="openstack/nova-metadata-0"
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.226779 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75c5fc6955-c8bh2"]
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.245603 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9vk2\" (UniqueName: \"kubernetes.io/projected/1cda4590-6b0b-4213-9f39-21056b6bc142-kube-api-access-t9vk2\") pod \"nova-metadata-0\" (UID: \"1cda4590-6b0b-4213-9f39-21056b6bc142\") " pod="openstack/nova-metadata-0"
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.301422 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.312718 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.317905 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e-ovsdbserver-sb\") pod \"dnsmasq-dns-75c5fc6955-c8bh2\" (UID: \"aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e\") " pod="openstack/dnsmasq-dns-75c5fc6955-c8bh2"
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.317995 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e-dns-svc\") pod \"dnsmasq-dns-75c5fc6955-c8bh2\" (UID: \"aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e\") " pod="openstack/dnsmasq-dns-75c5fc6955-c8bh2"
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.318020 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e-ovsdbserver-nb\") pod \"dnsmasq-dns-75c5fc6955-c8bh2\" (UID: \"aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e\") " pod="openstack/dnsmasq-dns-75c5fc6955-c8bh2"
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.318079 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e-config\") pod \"dnsmasq-dns-75c5fc6955-c8bh2\" (UID: \"aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e\") " pod="openstack/dnsmasq-dns-75c5fc6955-c8bh2"
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.318237 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e-dns-swift-storage-0\") pod \"dnsmasq-dns-75c5fc6955-c8bh2\" (UID: \"aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e\") " pod="openstack/dnsmasq-dns-75c5fc6955-c8bh2"
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.318337 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zx7gg\" (UniqueName: \"kubernetes.io/projected/aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e-kube-api-access-zx7gg\") pod \"dnsmasq-dns-75c5fc6955-c8bh2\" (UID: \"aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e\") " pod="openstack/dnsmasq-dns-75c5fc6955-c8bh2"
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.423018 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e-ovsdbserver-sb\") pod \"dnsmasq-dns-75c5fc6955-c8bh2\" (UID: \"aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e\") " pod="openstack/dnsmasq-dns-75c5fc6955-c8bh2"
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.423100 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e-dns-svc\") pod \"dnsmasq-dns-75c5fc6955-c8bh2\" (UID: \"aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e\") " pod="openstack/dnsmasq-dns-75c5fc6955-c8bh2"
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.423119 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e-ovsdbserver-nb\") pod \"dnsmasq-dns-75c5fc6955-c8bh2\" (UID: \"aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e\") " pod="openstack/dnsmasq-dns-75c5fc6955-c8bh2"
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.423175 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e-config\") pod \"dnsmasq-dns-75c5fc6955-c8bh2\" (UID: \"aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e\") " pod="openstack/dnsmasq-dns-75c5fc6955-c8bh2"
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.423208 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e-dns-swift-storage-0\") pod \"dnsmasq-dns-75c5fc6955-c8bh2\" (UID: \"aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e\") " pod="openstack/dnsmasq-dns-75c5fc6955-c8bh2"
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.423235 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zx7gg\" (UniqueName: \"kubernetes.io/projected/aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e-kube-api-access-zx7gg\") pod \"dnsmasq-dns-75c5fc6955-c8bh2\" (UID: \"aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e\") " pod="openstack/dnsmasq-dns-75c5fc6955-c8bh2"
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.425293 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e-ovsdbserver-nb\") pod \"dnsmasq-dns-75c5fc6955-c8bh2\" (UID: \"aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e\") " pod="openstack/dnsmasq-dns-75c5fc6955-c8bh2"
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.425847 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e-ovsdbserver-sb\") pod \"dnsmasq-dns-75c5fc6955-c8bh2\" (UID: \"aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e\") " pod="openstack/dnsmasq-dns-75c5fc6955-c8bh2"
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.426392 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e-dns-svc\") pod \"dnsmasq-dns-75c5fc6955-c8bh2\" (UID: \"aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e\") " pod="openstack/dnsmasq-dns-75c5fc6955-c8bh2"
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.427880 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e-config\") pod \"dnsmasq-dns-75c5fc6955-c8bh2\" (UID: \"aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e\") " pod="openstack/dnsmasq-dns-75c5fc6955-c8bh2"
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.428692 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e-dns-swift-storage-0\") pod \"dnsmasq-dns-75c5fc6955-c8bh2\" (UID: \"aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e\") " pod="openstack/dnsmasq-dns-75c5fc6955-c8bh2"
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.447010 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zx7gg\" (UniqueName: \"kubernetes.io/projected/aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e-kube-api-access-zx7gg\") pod \"dnsmasq-dns-75c5fc6955-c8bh2\" (UID: \"aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e\") " pod="openstack/dnsmasq-dns-75c5fc6955-c8bh2"
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.447621 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.529561 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c5fc6955-c8bh2"
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.627625 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-t7j5b"]
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.676066 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.766257 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 06 06:01:34 crc kubenswrapper[4733]: W1206 06:01:34.778750 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d609f49_867e_48f0_a336_45e16b4b718d.slice/crio-982ea0352f7b5e106fcaa7a93c89a17fdb634bb321bc88dc1b337d582af7350e WatchSource:0}: Error finding container 982ea0352f7b5e106fcaa7a93c89a17fdb634bb321bc88dc1b337d582af7350e: Status 404 returned error can't find the container with id 982ea0352f7b5e106fcaa7a93c89a17fdb634bb321bc88dc1b337d582af7350e
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.844987 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.929537 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2e9b28fd-e646-4bb8-8d79-bba2898da8f3","Type":"ContainerStarted","Data":"e6789aa6a29bc0261d65e19f399d6c742edbb65a5c86e15537c213c9fe38cf79"}
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.932074 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-t7j5b" event={"ID":"f3ff2c7d-6450-448f-8824-36d2b8ea0710","Type":"ContainerStarted","Data":"b4d70349bed9c7c1e08cebcf962e114e20b1616ced83cf951b92d4a8bcedfe26"}
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.932105 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-t7j5b" event={"ID":"f3ff2c7d-6450-448f-8824-36d2b8ea0710","Type":"ContainerStarted","Data":"e6183eec2a83513545f0f7bc05333be026cebb3665f640d6d3324733416a3a46"}
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.933507 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"9d609f49-867e-48f0-a336-45e16b4b718d","Type":"ContainerStarted","Data":"982ea0352f7b5e106fcaa7a93c89a17fdb634bb321bc88dc1b337d582af7350e"}
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.934954 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"63b655fb-451c-4bc2-ad3e-665b3fdd61b1","Type":"ContainerStarted","Data":"63cc48146e37529870245ff7bfb8f44da94de15a753e0508a4f020046f96e130"}
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.951689 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Dec 06 06:01:34 crc kubenswrapper[4733]: W1206 06:01:34.952004 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1cda4590_6b0b_4213_9f39_21056b6bc142.slice/crio-3cc45cee44eee47162ae63fb52f1dfa826d0221363f8c2b6002ea491267840dc WatchSource:0}: Error finding container 3cc45cee44eee47162ae63fb52f1dfa826d0221363f8c2b6002ea491267840dc: Status 404 returned error can't find the container with id 3cc45cee44eee47162ae63fb52f1dfa826d0221363f8c2b6002ea491267840dc
Dec 06 06:01:34 crc kubenswrapper[4733]: I1206 06:01:34.953182 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-t7j5b" podStartSLOduration=1.9531641309999999 podStartE2EDuration="1.953164131s" podCreationTimestamp="2025-12-06 06:01:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:01:34.947900362 +0000 UTC m=+1078.813111474"
watchObservedRunningTime="2025-12-06 06:01:34.953164131 +0000 UTC m=+1078.818375242" Dec 06 06:01:35 crc kubenswrapper[4733]: W1206 06:01:35.023419 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa5c1a3a_d4d2_4f6b_b424_56c23bd5994e.slice/crio-1820b01b75b8129525fd0c4e69fdc6537b8752f726296f061e0826c187d27bb2 WatchSource:0}: Error finding container 1820b01b75b8129525fd0c4e69fdc6537b8752f726296f061e0826c187d27bb2: Status 404 returned error can't find the container with id 1820b01b75b8129525fd0c4e69fdc6537b8752f726296f061e0826c187d27bb2 Dec 06 06:01:35 crc kubenswrapper[4733]: I1206 06:01:35.024336 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75c5fc6955-c8bh2"] Dec 06 06:01:35 crc kubenswrapper[4733]: I1206 06:01:35.279627 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-22n6m"] Dec 06 06:01:35 crc kubenswrapper[4733]: I1206 06:01:35.281418 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-22n6m" Dec 06 06:01:35 crc kubenswrapper[4733]: I1206 06:01:35.283144 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 06 06:01:35 crc kubenswrapper[4733]: I1206 06:01:35.283553 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 06 06:01:35 crc kubenswrapper[4733]: I1206 06:01:35.288547 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-22n6m"] Dec 06 06:01:35 crc kubenswrapper[4733]: I1206 06:01:35.344742 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vs75\" (UniqueName: \"kubernetes.io/projected/3b31ce7f-2712-4b95-bc6a-c52f3e104e12-kube-api-access-6vs75\") pod \"nova-cell1-conductor-db-sync-22n6m\" (UID: \"3b31ce7f-2712-4b95-bc6a-c52f3e104e12\") " pod="openstack/nova-cell1-conductor-db-sync-22n6m" Dec 06 06:01:35 crc kubenswrapper[4733]: I1206 06:01:35.344830 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b31ce7f-2712-4b95-bc6a-c52f3e104e12-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-22n6m\" (UID: \"3b31ce7f-2712-4b95-bc6a-c52f3e104e12\") " pod="openstack/nova-cell1-conductor-db-sync-22n6m" Dec 06 06:01:35 crc kubenswrapper[4733]: I1206 06:01:35.344879 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b31ce7f-2712-4b95-bc6a-c52f3e104e12-config-data\") pod \"nova-cell1-conductor-db-sync-22n6m\" (UID: \"3b31ce7f-2712-4b95-bc6a-c52f3e104e12\") " pod="openstack/nova-cell1-conductor-db-sync-22n6m" Dec 06 06:01:35 crc kubenswrapper[4733]: I1206 06:01:35.344946 4733 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b31ce7f-2712-4b95-bc6a-c52f3e104e12-scripts\") pod \"nova-cell1-conductor-db-sync-22n6m\" (UID: \"3b31ce7f-2712-4b95-bc6a-c52f3e104e12\") " pod="openstack/nova-cell1-conductor-db-sync-22n6m" Dec 06 06:01:35 crc kubenswrapper[4733]: I1206 06:01:35.447129 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b31ce7f-2712-4b95-bc6a-c52f3e104e12-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-22n6m\" (UID: \"3b31ce7f-2712-4b95-bc6a-c52f3e104e12\") " pod="openstack/nova-cell1-conductor-db-sync-22n6m" Dec 06 06:01:35 crc kubenswrapper[4733]: I1206 06:01:35.447201 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b31ce7f-2712-4b95-bc6a-c52f3e104e12-config-data\") pod \"nova-cell1-conductor-db-sync-22n6m\" (UID: \"3b31ce7f-2712-4b95-bc6a-c52f3e104e12\") " pod="openstack/nova-cell1-conductor-db-sync-22n6m" Dec 06 06:01:35 crc kubenswrapper[4733]: I1206 06:01:35.447239 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b31ce7f-2712-4b95-bc6a-c52f3e104e12-scripts\") pod \"nova-cell1-conductor-db-sync-22n6m\" (UID: \"3b31ce7f-2712-4b95-bc6a-c52f3e104e12\") " pod="openstack/nova-cell1-conductor-db-sync-22n6m" Dec 06 06:01:35 crc kubenswrapper[4733]: I1206 06:01:35.447686 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vs75\" (UniqueName: \"kubernetes.io/projected/3b31ce7f-2712-4b95-bc6a-c52f3e104e12-kube-api-access-6vs75\") pod \"nova-cell1-conductor-db-sync-22n6m\" (UID: \"3b31ce7f-2712-4b95-bc6a-c52f3e104e12\") " pod="openstack/nova-cell1-conductor-db-sync-22n6m" Dec 06 06:01:35 crc kubenswrapper[4733]: I1206 06:01:35.458346 4733 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b31ce7f-2712-4b95-bc6a-c52f3e104e12-scripts\") pod \"nova-cell1-conductor-db-sync-22n6m\" (UID: \"3b31ce7f-2712-4b95-bc6a-c52f3e104e12\") " pod="openstack/nova-cell1-conductor-db-sync-22n6m" Dec 06 06:01:35 crc kubenswrapper[4733]: I1206 06:01:35.458407 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b31ce7f-2712-4b95-bc6a-c52f3e104e12-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-22n6m\" (UID: \"3b31ce7f-2712-4b95-bc6a-c52f3e104e12\") " pod="openstack/nova-cell1-conductor-db-sync-22n6m" Dec 06 06:01:35 crc kubenswrapper[4733]: I1206 06:01:35.458420 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b31ce7f-2712-4b95-bc6a-c52f3e104e12-config-data\") pod \"nova-cell1-conductor-db-sync-22n6m\" (UID: \"3b31ce7f-2712-4b95-bc6a-c52f3e104e12\") " pod="openstack/nova-cell1-conductor-db-sync-22n6m" Dec 06 06:01:35 crc kubenswrapper[4733]: I1206 06:01:35.462777 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vs75\" (UniqueName: \"kubernetes.io/projected/3b31ce7f-2712-4b95-bc6a-c52f3e104e12-kube-api-access-6vs75\") pod \"nova-cell1-conductor-db-sync-22n6m\" (UID: \"3b31ce7f-2712-4b95-bc6a-c52f3e104e12\") " pod="openstack/nova-cell1-conductor-db-sync-22n6m" Dec 06 06:01:35 crc kubenswrapper[4733]: I1206 06:01:35.614769 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-22n6m" Dec 06 06:01:35 crc kubenswrapper[4733]: I1206 06:01:35.966100 4733 generic.go:334] "Generic (PLEG): container finished" podID="aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e" containerID="64639f5ac94ae897cc1135dcffe67c37939895f79eb9d858f79a3c9cf681d3d1" exitCode=0 Dec 06 06:01:35 crc kubenswrapper[4733]: I1206 06:01:35.966468 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c5fc6955-c8bh2" event={"ID":"aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e","Type":"ContainerDied","Data":"64639f5ac94ae897cc1135dcffe67c37939895f79eb9d858f79a3c9cf681d3d1"} Dec 06 06:01:35 crc kubenswrapper[4733]: I1206 06:01:35.966498 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c5fc6955-c8bh2" event={"ID":"aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e","Type":"ContainerStarted","Data":"1820b01b75b8129525fd0c4e69fdc6537b8752f726296f061e0826c187d27bb2"} Dec 06 06:01:35 crc kubenswrapper[4733]: I1206 06:01:35.973500 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1cda4590-6b0b-4213-9f39-21056b6bc142","Type":"ContainerStarted","Data":"3cc45cee44eee47162ae63fb52f1dfa826d0221363f8c2b6002ea491267840dc"} Dec 06 06:01:36 crc kubenswrapper[4733]: I1206 06:01:36.050017 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-22n6m"] Dec 06 06:01:36 crc kubenswrapper[4733]: I1206 06:01:36.985878 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-22n6m" event={"ID":"3b31ce7f-2712-4b95-bc6a-c52f3e104e12","Type":"ContainerStarted","Data":"3bff34d0c82b79f2d616285fd496a585025729765a2518a41b386d095c548d8f"} Dec 06 06:01:36 crc kubenswrapper[4733]: I1206 06:01:36.986177 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-22n6m" 
event={"ID":"3b31ce7f-2712-4b95-bc6a-c52f3e104e12","Type":"ContainerStarted","Data":"52cc87cea0cc2f57247297371569a857fcbbbfca0885f0703a14b18be471e7f7"} Dec 06 06:01:36 crc kubenswrapper[4733]: I1206 06:01:36.991190 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c5fc6955-c8bh2" event={"ID":"aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e","Type":"ContainerStarted","Data":"3a8f31727368ff95f4c7f0e09ce51fc68208753673fd5827a21a80ae627e3f0e"} Dec 06 06:01:37 crc kubenswrapper[4733]: I1206 06:01:37.004147 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-22n6m" podStartSLOduration=2.004135855 podStartE2EDuration="2.004135855s" podCreationTimestamp="2025-12-06 06:01:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:01:37.000124782 +0000 UTC m=+1080.865335903" watchObservedRunningTime="2025-12-06 06:01:37.004135855 +0000 UTC m=+1080.869346966" Dec 06 06:01:37 crc kubenswrapper[4733]: I1206 06:01:37.041735 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75c5fc6955-c8bh2" podStartSLOduration=3.041707928 podStartE2EDuration="3.041707928s" podCreationTimestamp="2025-12-06 06:01:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:01:37.038782095 +0000 UTC m=+1080.903993206" watchObservedRunningTime="2025-12-06 06:01:37.041707928 +0000 UTC m=+1080.906919038" Dec 06 06:01:37 crc kubenswrapper[4733]: I1206 06:01:37.885866 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 06:01:37 crc kubenswrapper[4733]: I1206 06:01:37.931625 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 06 06:01:37 crc kubenswrapper[4733]: I1206 06:01:37.999581 4733 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-75c5fc6955-c8bh2" Dec 06 06:01:39 crc kubenswrapper[4733]: I1206 06:01:39.009095 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2e9b28fd-e646-4bb8-8d79-bba2898da8f3","Type":"ContainerStarted","Data":"94a7841c8c854ae4427ab0aa14dec9b0ea3e40d19d03f19a16b81634f5ab0469"} Dec 06 06:01:39 crc kubenswrapper[4733]: I1206 06:01:39.009392 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2e9b28fd-e646-4bb8-8d79-bba2898da8f3","Type":"ContainerStarted","Data":"6979f3b1a7ab626a50824624495c7b4b4a35da56408b735e9bfccb56613550b2"} Dec 06 06:01:39 crc kubenswrapper[4733]: I1206 06:01:39.012062 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1cda4590-6b0b-4213-9f39-21056b6bc142","Type":"ContainerStarted","Data":"81ea0683ade4b2e2c4e1656c1839db55ec1510d4e6cb6763b0a7d4a5c4b9e8de"} Dec 06 06:01:39 crc kubenswrapper[4733]: I1206 06:01:39.012086 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1cda4590-6b0b-4213-9f39-21056b6bc142","Type":"ContainerStarted","Data":"76b5f90388ad3646a4e5078c0ed28f3a0517b9e32876dc2fba036607cd28a316"} Dec 06 06:01:39 crc kubenswrapper[4733]: I1206 06:01:39.012186 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="1cda4590-6b0b-4213-9f39-21056b6bc142" containerName="nova-metadata-log" containerID="cri-o://76b5f90388ad3646a4e5078c0ed28f3a0517b9e32876dc2fba036607cd28a316" gracePeriod=30 Dec 06 06:01:39 crc kubenswrapper[4733]: I1206 06:01:39.012527 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="1cda4590-6b0b-4213-9f39-21056b6bc142" containerName="nova-metadata-metadata" 
containerID="cri-o://81ea0683ade4b2e2c4e1656c1839db55ec1510d4e6cb6763b0a7d4a5c4b9e8de" gracePeriod=30 Dec 06 06:01:39 crc kubenswrapper[4733]: I1206 06:01:39.017842 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"9d609f49-867e-48f0-a336-45e16b4b718d","Type":"ContainerStarted","Data":"a80e1b03e664c6c6fda8bf9814be8a6603c8079d650941d4003d0ed4b15f4b6e"} Dec 06 06:01:39 crc kubenswrapper[4733]: I1206 06:01:39.017954 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="9d609f49-867e-48f0-a336-45e16b4b718d" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://a80e1b03e664c6c6fda8bf9814be8a6603c8079d650941d4003d0ed4b15f4b6e" gracePeriod=30 Dec 06 06:01:39 crc kubenswrapper[4733]: I1206 06:01:39.022151 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"63b655fb-451c-4bc2-ad3e-665b3fdd61b1","Type":"ContainerStarted","Data":"e1eb879f8b70f5df5eef8ac58df75d7b74d3a05f0dcc5808c0d429e574cfdabf"} Dec 06 06:01:39 crc kubenswrapper[4733]: I1206 06:01:39.043077 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.376800568 podStartE2EDuration="6.043064491s" podCreationTimestamp="2025-12-06 06:01:33 +0000 UTC" firstStartedPulling="2025-12-06 06:01:34.696997913 +0000 UTC m=+1078.562209024" lastFinishedPulling="2025-12-06 06:01:38.363261835 +0000 UTC m=+1082.228472947" observedRunningTime="2025-12-06 06:01:39.026900526 +0000 UTC m=+1082.892111637" watchObservedRunningTime="2025-12-06 06:01:39.043064491 +0000 UTC m=+1082.908275602" Dec 06 06:01:39 crc kubenswrapper[4733]: I1206 06:01:39.059014 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.6489220310000001 podStartE2EDuration="5.058995168s" podCreationTimestamp="2025-12-06 06:01:34 +0000 UTC" 
firstStartedPulling="2025-12-06 06:01:34.954725748 +0000 UTC m=+1078.819936859" lastFinishedPulling="2025-12-06 06:01:38.364798885 +0000 UTC m=+1082.230009996" observedRunningTime="2025-12-06 06:01:39.042869335 +0000 UTC m=+1082.908080445" watchObservedRunningTime="2025-12-06 06:01:39.058995168 +0000 UTC m=+1082.924206279" Dec 06 06:01:39 crc kubenswrapper[4733]: I1206 06:01:39.064882 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.488102808 podStartE2EDuration="6.064872801s" podCreationTimestamp="2025-12-06 06:01:33 +0000 UTC" firstStartedPulling="2025-12-06 06:01:34.786631214 +0000 UTC m=+1078.651842325" lastFinishedPulling="2025-12-06 06:01:38.363401207 +0000 UTC m=+1082.228612318" observedRunningTime="2025-12-06 06:01:39.0574856 +0000 UTC m=+1082.922696711" watchObservedRunningTime="2025-12-06 06:01:39.064872801 +0000 UTC m=+1082.930083912" Dec 06 06:01:39 crc kubenswrapper[4733]: I1206 06:01:39.302088 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 06 06:01:39 crc kubenswrapper[4733]: I1206 06:01:39.313202 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 06 06:01:39 crc kubenswrapper[4733]: I1206 06:01:39.447860 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 06 06:01:39 crc kubenswrapper[4733]: I1206 06:01:39.447917 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 06 06:01:40 crc kubenswrapper[4733]: I1206 06:01:40.033975 4733 generic.go:334] "Generic (PLEG): container finished" podID="3b31ce7f-2712-4b95-bc6a-c52f3e104e12" containerID="3bff34d0c82b79f2d616285fd496a585025729765a2518a41b386d095c548d8f" exitCode=0 Dec 06 06:01:40 crc kubenswrapper[4733]: I1206 06:01:40.034066 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-conductor-db-sync-22n6m" event={"ID":"3b31ce7f-2712-4b95-bc6a-c52f3e104e12","Type":"ContainerDied","Data":"3bff34d0c82b79f2d616285fd496a585025729765a2518a41b386d095c548d8f"} Dec 06 06:01:40 crc kubenswrapper[4733]: I1206 06:01:40.036987 4733 generic.go:334] "Generic (PLEG): container finished" podID="1cda4590-6b0b-4213-9f39-21056b6bc142" containerID="76b5f90388ad3646a4e5078c0ed28f3a0517b9e32876dc2fba036607cd28a316" exitCode=143 Dec 06 06:01:40 crc kubenswrapper[4733]: I1206 06:01:40.037099 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1cda4590-6b0b-4213-9f39-21056b6bc142","Type":"ContainerDied","Data":"76b5f90388ad3646a4e5078c0ed28f3a0517b9e32876dc2fba036607cd28a316"} Dec 06 06:01:40 crc kubenswrapper[4733]: I1206 06:01:40.055257 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.543651226 podStartE2EDuration="7.05524031s" podCreationTimestamp="2025-12-06 06:01:33 +0000 UTC" firstStartedPulling="2025-12-06 06:01:34.850071721 +0000 UTC m=+1078.715282832" lastFinishedPulling="2025-12-06 06:01:38.361660805 +0000 UTC m=+1082.226871916" observedRunningTime="2025-12-06 06:01:39.084550996 +0000 UTC m=+1082.949762106" watchObservedRunningTime="2025-12-06 06:01:40.05524031 +0000 UTC m=+1083.920451421" Dec 06 06:01:41 crc kubenswrapper[4733]: I1206 06:01:41.317507 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 06 06:01:41 crc kubenswrapper[4733]: I1206 06:01:41.342396 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-22n6m" Dec 06 06:01:41 crc kubenswrapper[4733]: I1206 06:01:41.402647 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vs75\" (UniqueName: \"kubernetes.io/projected/3b31ce7f-2712-4b95-bc6a-c52f3e104e12-kube-api-access-6vs75\") pod \"3b31ce7f-2712-4b95-bc6a-c52f3e104e12\" (UID: \"3b31ce7f-2712-4b95-bc6a-c52f3e104e12\") " Dec 06 06:01:41 crc kubenswrapper[4733]: I1206 06:01:41.402979 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b31ce7f-2712-4b95-bc6a-c52f3e104e12-config-data\") pod \"3b31ce7f-2712-4b95-bc6a-c52f3e104e12\" (UID: \"3b31ce7f-2712-4b95-bc6a-c52f3e104e12\") " Dec 06 06:01:41 crc kubenswrapper[4733]: I1206 06:01:41.403022 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b31ce7f-2712-4b95-bc6a-c52f3e104e12-scripts\") pod \"3b31ce7f-2712-4b95-bc6a-c52f3e104e12\" (UID: \"3b31ce7f-2712-4b95-bc6a-c52f3e104e12\") " Dec 06 06:01:41 crc kubenswrapper[4733]: I1206 06:01:41.403066 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b31ce7f-2712-4b95-bc6a-c52f3e104e12-combined-ca-bundle\") pod \"3b31ce7f-2712-4b95-bc6a-c52f3e104e12\" (UID: \"3b31ce7f-2712-4b95-bc6a-c52f3e104e12\") " Dec 06 06:01:41 crc kubenswrapper[4733]: I1206 06:01:41.409688 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b31ce7f-2712-4b95-bc6a-c52f3e104e12-scripts" (OuterVolumeSpecName: "scripts") pod "3b31ce7f-2712-4b95-bc6a-c52f3e104e12" (UID: "3b31ce7f-2712-4b95-bc6a-c52f3e104e12"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:01:41 crc kubenswrapper[4733]: I1206 06:01:41.410519 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b31ce7f-2712-4b95-bc6a-c52f3e104e12-kube-api-access-6vs75" (OuterVolumeSpecName: "kube-api-access-6vs75") pod "3b31ce7f-2712-4b95-bc6a-c52f3e104e12" (UID: "3b31ce7f-2712-4b95-bc6a-c52f3e104e12"). InnerVolumeSpecName "kube-api-access-6vs75". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:01:41 crc kubenswrapper[4733]: I1206 06:01:41.427733 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b31ce7f-2712-4b95-bc6a-c52f3e104e12-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3b31ce7f-2712-4b95-bc6a-c52f3e104e12" (UID: "3b31ce7f-2712-4b95-bc6a-c52f3e104e12"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:01:41 crc kubenswrapper[4733]: I1206 06:01:41.428122 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b31ce7f-2712-4b95-bc6a-c52f3e104e12-config-data" (OuterVolumeSpecName: "config-data") pod "3b31ce7f-2712-4b95-bc6a-c52f3e104e12" (UID: "3b31ce7f-2712-4b95-bc6a-c52f3e104e12"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:01:41 crc kubenswrapper[4733]: I1206 06:01:41.505795 4733 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b31ce7f-2712-4b95-bc6a-c52f3e104e12-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:41 crc kubenswrapper[4733]: I1206 06:01:41.505830 4733 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b31ce7f-2712-4b95-bc6a-c52f3e104e12-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:41 crc kubenswrapper[4733]: I1206 06:01:41.505841 4733 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b31ce7f-2712-4b95-bc6a-c52f3e104e12-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:41 crc kubenswrapper[4733]: I1206 06:01:41.505872 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vs75\" (UniqueName: \"kubernetes.io/projected/3b31ce7f-2712-4b95-bc6a-c52f3e104e12-kube-api-access-6vs75\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:42 crc kubenswrapper[4733]: I1206 06:01:42.054294 4733 generic.go:334] "Generic (PLEG): container finished" podID="f3ff2c7d-6450-448f-8824-36d2b8ea0710" containerID="b4d70349bed9c7c1e08cebcf962e114e20b1616ced83cf951b92d4a8bcedfe26" exitCode=0 Dec 06 06:01:42 crc kubenswrapper[4733]: I1206 06:01:42.054401 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-t7j5b" event={"ID":"f3ff2c7d-6450-448f-8824-36d2b8ea0710","Type":"ContainerDied","Data":"b4d70349bed9c7c1e08cebcf962e114e20b1616ced83cf951b92d4a8bcedfe26"} Dec 06 06:01:42 crc kubenswrapper[4733]: I1206 06:01:42.056727 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-22n6m" 
event={"ID":"3b31ce7f-2712-4b95-bc6a-c52f3e104e12","Type":"ContainerDied","Data":"52cc87cea0cc2f57247297371569a857fcbbbfca0885f0703a14b18be471e7f7"} Dec 06 06:01:42 crc kubenswrapper[4733]: I1206 06:01:42.056767 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52cc87cea0cc2f57247297371569a857fcbbbfca0885f0703a14b18be471e7f7" Dec 06 06:01:42 crc kubenswrapper[4733]: I1206 06:01:42.056811 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-22n6m" Dec 06 06:01:42 crc kubenswrapper[4733]: I1206 06:01:42.129558 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 06 06:01:42 crc kubenswrapper[4733]: E1206 06:01:42.130804 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b31ce7f-2712-4b95-bc6a-c52f3e104e12" containerName="nova-cell1-conductor-db-sync" Dec 06 06:01:42 crc kubenswrapper[4733]: I1206 06:01:42.130826 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b31ce7f-2712-4b95-bc6a-c52f3e104e12" containerName="nova-cell1-conductor-db-sync" Dec 06 06:01:42 crc kubenswrapper[4733]: I1206 06:01:42.131068 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b31ce7f-2712-4b95-bc6a-c52f3e104e12" containerName="nova-cell1-conductor-db-sync" Dec 06 06:01:42 crc kubenswrapper[4733]: I1206 06:01:42.131743 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 06 06:01:42 crc kubenswrapper[4733]: I1206 06:01:42.133289 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 06 06:01:42 crc kubenswrapper[4733]: I1206 06:01:42.137003 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 06 06:01:42 crc kubenswrapper[4733]: I1206 06:01:42.228839 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eeac3d18-b33c-41ec-b72d-4300358e4a96-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"eeac3d18-b33c-41ec-b72d-4300358e4a96\") " pod="openstack/nova-cell1-conductor-0" Dec 06 06:01:42 crc kubenswrapper[4733]: I1206 06:01:42.229155 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eeac3d18-b33c-41ec-b72d-4300358e4a96-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"eeac3d18-b33c-41ec-b72d-4300358e4a96\") " pod="openstack/nova-cell1-conductor-0" Dec 06 06:01:42 crc kubenswrapper[4733]: I1206 06:01:42.229544 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jl8nr\" (UniqueName: \"kubernetes.io/projected/eeac3d18-b33c-41ec-b72d-4300358e4a96-kube-api-access-jl8nr\") pod \"nova-cell1-conductor-0\" (UID: \"eeac3d18-b33c-41ec-b72d-4300358e4a96\") " pod="openstack/nova-cell1-conductor-0" Dec 06 06:01:42 crc kubenswrapper[4733]: I1206 06:01:42.331366 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eeac3d18-b33c-41ec-b72d-4300358e4a96-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"eeac3d18-b33c-41ec-b72d-4300358e4a96\") " pod="openstack/nova-cell1-conductor-0" Dec 06 06:01:42 crc 
kubenswrapper[4733]: I1206 06:01:42.331922 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eeac3d18-b33c-41ec-b72d-4300358e4a96-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"eeac3d18-b33c-41ec-b72d-4300358e4a96\") " pod="openstack/nova-cell1-conductor-0" Dec 06 06:01:42 crc kubenswrapper[4733]: I1206 06:01:42.332061 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jl8nr\" (UniqueName: \"kubernetes.io/projected/eeac3d18-b33c-41ec-b72d-4300358e4a96-kube-api-access-jl8nr\") pod \"nova-cell1-conductor-0\" (UID: \"eeac3d18-b33c-41ec-b72d-4300358e4a96\") " pod="openstack/nova-cell1-conductor-0" Dec 06 06:01:42 crc kubenswrapper[4733]: I1206 06:01:42.336356 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eeac3d18-b33c-41ec-b72d-4300358e4a96-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"eeac3d18-b33c-41ec-b72d-4300358e4a96\") " pod="openstack/nova-cell1-conductor-0" Dec 06 06:01:42 crc kubenswrapper[4733]: I1206 06:01:42.337459 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eeac3d18-b33c-41ec-b72d-4300358e4a96-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"eeac3d18-b33c-41ec-b72d-4300358e4a96\") " pod="openstack/nova-cell1-conductor-0" Dec 06 06:01:42 crc kubenswrapper[4733]: I1206 06:01:42.347878 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jl8nr\" (UniqueName: \"kubernetes.io/projected/eeac3d18-b33c-41ec-b72d-4300358e4a96-kube-api-access-jl8nr\") pod \"nova-cell1-conductor-0\" (UID: \"eeac3d18-b33c-41ec-b72d-4300358e4a96\") " pod="openstack/nova-cell1-conductor-0" Dec 06 06:01:42 crc kubenswrapper[4733]: I1206 06:01:42.449969 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 06 06:01:42 crc kubenswrapper[4733]: I1206 06:01:42.897062 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 06 06:01:43 crc kubenswrapper[4733]: I1206 06:01:43.070792 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"eeac3d18-b33c-41ec-b72d-4300358e4a96","Type":"ContainerStarted","Data":"6b872a42ef09c68c2667d59028e039d7eca494a9f21d838a0a5a83285fd9ad72"} Dec 06 06:01:43 crc kubenswrapper[4733]: I1206 06:01:43.071542 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"eeac3d18-b33c-41ec-b72d-4300358e4a96","Type":"ContainerStarted","Data":"f8b14631e731107462b96f7313e928c36e65798e6ec0a0ea429326bc39c37ff9"} Dec 06 06:01:43 crc kubenswrapper[4733]: I1206 06:01:43.090386 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=1.090359676 podStartE2EDuration="1.090359676s" podCreationTimestamp="2025-12-06 06:01:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:01:43.0884049 +0000 UTC m=+1086.953616011" watchObservedRunningTime="2025-12-06 06:01:43.090359676 +0000 UTC m=+1086.955570787" Dec 06 06:01:43 crc kubenswrapper[4733]: I1206 06:01:43.302672 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-t7j5b" Dec 06 06:01:43 crc kubenswrapper[4733]: I1206 06:01:43.354287 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4w7b\" (UniqueName: \"kubernetes.io/projected/f3ff2c7d-6450-448f-8824-36d2b8ea0710-kube-api-access-v4w7b\") pod \"f3ff2c7d-6450-448f-8824-36d2b8ea0710\" (UID: \"f3ff2c7d-6450-448f-8824-36d2b8ea0710\") " Dec 06 06:01:43 crc kubenswrapper[4733]: I1206 06:01:43.354532 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3ff2c7d-6450-448f-8824-36d2b8ea0710-combined-ca-bundle\") pod \"f3ff2c7d-6450-448f-8824-36d2b8ea0710\" (UID: \"f3ff2c7d-6450-448f-8824-36d2b8ea0710\") " Dec 06 06:01:43 crc kubenswrapper[4733]: I1206 06:01:43.354587 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3ff2c7d-6450-448f-8824-36d2b8ea0710-config-data\") pod \"f3ff2c7d-6450-448f-8824-36d2b8ea0710\" (UID: \"f3ff2c7d-6450-448f-8824-36d2b8ea0710\") " Dec 06 06:01:43 crc kubenswrapper[4733]: I1206 06:01:43.354836 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3ff2c7d-6450-448f-8824-36d2b8ea0710-scripts\") pod \"f3ff2c7d-6450-448f-8824-36d2b8ea0710\" (UID: \"f3ff2c7d-6450-448f-8824-36d2b8ea0710\") " Dec 06 06:01:43 crc kubenswrapper[4733]: I1206 06:01:43.360397 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3ff2c7d-6450-448f-8824-36d2b8ea0710-scripts" (OuterVolumeSpecName: "scripts") pod "f3ff2c7d-6450-448f-8824-36d2b8ea0710" (UID: "f3ff2c7d-6450-448f-8824-36d2b8ea0710"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:01:43 crc kubenswrapper[4733]: I1206 06:01:43.361493 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3ff2c7d-6450-448f-8824-36d2b8ea0710-kube-api-access-v4w7b" (OuterVolumeSpecName: "kube-api-access-v4w7b") pod "f3ff2c7d-6450-448f-8824-36d2b8ea0710" (UID: "f3ff2c7d-6450-448f-8824-36d2b8ea0710"). InnerVolumeSpecName "kube-api-access-v4w7b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:01:43 crc kubenswrapper[4733]: I1206 06:01:43.380123 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3ff2c7d-6450-448f-8824-36d2b8ea0710-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f3ff2c7d-6450-448f-8824-36d2b8ea0710" (UID: "f3ff2c7d-6450-448f-8824-36d2b8ea0710"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:01:43 crc kubenswrapper[4733]: I1206 06:01:43.382694 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3ff2c7d-6450-448f-8824-36d2b8ea0710-config-data" (OuterVolumeSpecName: "config-data") pod "f3ff2c7d-6450-448f-8824-36d2b8ea0710" (UID: "f3ff2c7d-6450-448f-8824-36d2b8ea0710"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:01:43 crc kubenswrapper[4733]: I1206 06:01:43.457959 4733 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3ff2c7d-6450-448f-8824-36d2b8ea0710-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:43 crc kubenswrapper[4733]: I1206 06:01:43.457988 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4w7b\" (UniqueName: \"kubernetes.io/projected/f3ff2c7d-6450-448f-8824-36d2b8ea0710-kube-api-access-v4w7b\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:43 crc kubenswrapper[4733]: I1206 06:01:43.458000 4733 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3ff2c7d-6450-448f-8824-36d2b8ea0710-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:43 crc kubenswrapper[4733]: I1206 06:01:43.458009 4733 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3ff2c7d-6450-448f-8824-36d2b8ea0710-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:44 crc kubenswrapper[4733]: I1206 06:01:44.082360 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-t7j5b" Dec 06 06:01:44 crc kubenswrapper[4733]: I1206 06:01:44.082336 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-t7j5b" event={"ID":"f3ff2c7d-6450-448f-8824-36d2b8ea0710","Type":"ContainerDied","Data":"e6183eec2a83513545f0f7bc05333be026cebb3665f640d6d3324733416a3a46"} Dec 06 06:01:44 crc kubenswrapper[4733]: I1206 06:01:44.082770 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6183eec2a83513545f0f7bc05333be026cebb3665f640d6d3324733416a3a46" Dec 06 06:01:44 crc kubenswrapper[4733]: I1206 06:01:44.082798 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 06 06:01:44 crc kubenswrapper[4733]: I1206 06:01:44.186202 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 06 06:01:44 crc kubenswrapper[4733]: I1206 06:01:44.186247 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 06 06:01:44 crc kubenswrapper[4733]: I1206 06:01:44.190191 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 06 06:01:44 crc kubenswrapper[4733]: I1206 06:01:44.198100 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 06:01:44 crc kubenswrapper[4733]: I1206 06:01:44.198376 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="63b655fb-451c-4bc2-ad3e-665b3fdd61b1" containerName="nova-scheduler-scheduler" containerID="cri-o://e1eb879f8b70f5df5eef8ac58df75d7b74d3a05f0dcc5808c0d429e574cfdabf" gracePeriod=30 Dec 06 06:01:44 crc kubenswrapper[4733]: I1206 06:01:44.532136 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-75c5fc6955-c8bh2" Dec 06 06:01:44 crc kubenswrapper[4733]: I1206 
06:01:44.580974 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f44678f55-nshc4"] Dec 06 06:01:44 crc kubenswrapper[4733]: I1206 06:01:44.581168 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6f44678f55-nshc4" podUID="75df2e76-18b2-4bb7-8069-7636be9b1e46" containerName="dnsmasq-dns" containerID="cri-o://ba86db776b4568a6ae8f6860617f9499056be008b5ecc533c40269b52897a465" gracePeriod=10 Dec 06 06:01:44 crc kubenswrapper[4733]: I1206 06:01:44.685698 4733 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6f44678f55-nshc4" podUID="75df2e76-18b2-4bb7-8069-7636be9b1e46" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.155:5353: connect: connection refused" Dec 06 06:01:45 crc kubenswrapper[4733]: I1206 06:01:45.013649 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f44678f55-nshc4" Dec 06 06:01:45 crc kubenswrapper[4733]: I1206 06:01:45.091414 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75df2e76-18b2-4bb7-8069-7636be9b1e46-config\") pod \"75df2e76-18b2-4bb7-8069-7636be9b1e46\" (UID: \"75df2e76-18b2-4bb7-8069-7636be9b1e46\") " Dec 06 06:01:45 crc kubenswrapper[4733]: I1206 06:01:45.091462 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f44678f55-nshc4" event={"ID":"75df2e76-18b2-4bb7-8069-7636be9b1e46","Type":"ContainerDied","Data":"ba86db776b4568a6ae8f6860617f9499056be008b5ecc533c40269b52897a465"} Dec 06 06:01:45 crc kubenswrapper[4733]: I1206 06:01:45.091472 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f44678f55-nshc4" Dec 06 06:01:45 crc kubenswrapper[4733]: I1206 06:01:45.091420 4733 generic.go:334] "Generic (PLEG): container finished" podID="75df2e76-18b2-4bb7-8069-7636be9b1e46" containerID="ba86db776b4568a6ae8f6860617f9499056be008b5ecc533c40269b52897a465" exitCode=0 Dec 06 06:01:45 crc kubenswrapper[4733]: I1206 06:01:45.091577 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75df2e76-18b2-4bb7-8069-7636be9b1e46-ovsdbserver-sb\") pod \"75df2e76-18b2-4bb7-8069-7636be9b1e46\" (UID: \"75df2e76-18b2-4bb7-8069-7636be9b1e46\") " Dec 06 06:01:45 crc kubenswrapper[4733]: I1206 06:01:45.091584 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f44678f55-nshc4" event={"ID":"75df2e76-18b2-4bb7-8069-7636be9b1e46","Type":"ContainerDied","Data":"840cec72a42f5c9037cb98096adba8d7f53e23bdcb9d7c51bb67ae9433aea7de"} Dec 06 06:01:45 crc kubenswrapper[4733]: I1206 06:01:45.091642 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75df2e76-18b2-4bb7-8069-7636be9b1e46-dns-svc\") pod \"75df2e76-18b2-4bb7-8069-7636be9b1e46\" (UID: \"75df2e76-18b2-4bb7-8069-7636be9b1e46\") " Dec 06 06:01:45 crc kubenswrapper[4733]: I1206 06:01:45.091733 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/75df2e76-18b2-4bb7-8069-7636be9b1e46-dns-swift-storage-0\") pod \"75df2e76-18b2-4bb7-8069-7636be9b1e46\" (UID: \"75df2e76-18b2-4bb7-8069-7636be9b1e46\") " Dec 06 06:01:45 crc kubenswrapper[4733]: I1206 06:01:45.091756 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64qfl\" (UniqueName: \"kubernetes.io/projected/75df2e76-18b2-4bb7-8069-7636be9b1e46-kube-api-access-64qfl\") pod 
\"75df2e76-18b2-4bb7-8069-7636be9b1e46\" (UID: \"75df2e76-18b2-4bb7-8069-7636be9b1e46\") " Dec 06 06:01:45 crc kubenswrapper[4733]: I1206 06:01:45.091875 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75df2e76-18b2-4bb7-8069-7636be9b1e46-ovsdbserver-nb\") pod \"75df2e76-18b2-4bb7-8069-7636be9b1e46\" (UID: \"75df2e76-18b2-4bb7-8069-7636be9b1e46\") " Dec 06 06:01:45 crc kubenswrapper[4733]: I1206 06:01:45.091507 4733 scope.go:117] "RemoveContainer" containerID="ba86db776b4568a6ae8f6860617f9499056be008b5ecc533c40269b52897a465" Dec 06 06:01:45 crc kubenswrapper[4733]: I1206 06:01:45.092494 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2e9b28fd-e646-4bb8-8d79-bba2898da8f3" containerName="nova-api-log" containerID="cri-o://6979f3b1a7ab626a50824624495c7b4b4a35da56408b735e9bfccb56613550b2" gracePeriod=30 Dec 06 06:01:45 crc kubenswrapper[4733]: I1206 06:01:45.092514 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2e9b28fd-e646-4bb8-8d79-bba2898da8f3" containerName="nova-api-api" containerID="cri-o://94a7841c8c854ae4427ab0aa14dec9b0ea3e40d19d03f19a16b81634f5ab0469" gracePeriod=30 Dec 06 06:01:45 crc kubenswrapper[4733]: I1206 06:01:45.097174 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75df2e76-18b2-4bb7-8069-7636be9b1e46-kube-api-access-64qfl" (OuterVolumeSpecName: "kube-api-access-64qfl") pod "75df2e76-18b2-4bb7-8069-7636be9b1e46" (UID: "75df2e76-18b2-4bb7-8069-7636be9b1e46"). InnerVolumeSpecName "kube-api-access-64qfl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:01:45 crc kubenswrapper[4733]: I1206 06:01:45.097182 4733 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2e9b28fd-e646-4bb8-8d79-bba2898da8f3" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.181:8774/\": EOF" Dec 06 06:01:45 crc kubenswrapper[4733]: I1206 06:01:45.097180 4733 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2e9b28fd-e646-4bb8-8d79-bba2898da8f3" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.181:8774/\": EOF" Dec 06 06:01:45 crc kubenswrapper[4733]: I1206 06:01:45.155116 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75df2e76-18b2-4bb7-8069-7636be9b1e46-config" (OuterVolumeSpecName: "config") pod "75df2e76-18b2-4bb7-8069-7636be9b1e46" (UID: "75df2e76-18b2-4bb7-8069-7636be9b1e46"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:01:45 crc kubenswrapper[4733]: I1206 06:01:45.155129 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75df2e76-18b2-4bb7-8069-7636be9b1e46-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "75df2e76-18b2-4bb7-8069-7636be9b1e46" (UID: "75df2e76-18b2-4bb7-8069-7636be9b1e46"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:01:45 crc kubenswrapper[4733]: I1206 06:01:45.160758 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75df2e76-18b2-4bb7-8069-7636be9b1e46-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "75df2e76-18b2-4bb7-8069-7636be9b1e46" (UID: "75df2e76-18b2-4bb7-8069-7636be9b1e46"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:01:45 crc kubenswrapper[4733]: I1206 06:01:45.161293 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75df2e76-18b2-4bb7-8069-7636be9b1e46-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "75df2e76-18b2-4bb7-8069-7636be9b1e46" (UID: "75df2e76-18b2-4bb7-8069-7636be9b1e46"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:01:45 crc kubenswrapper[4733]: I1206 06:01:45.181400 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75df2e76-18b2-4bb7-8069-7636be9b1e46-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "75df2e76-18b2-4bb7-8069-7636be9b1e46" (UID: "75df2e76-18b2-4bb7-8069-7636be9b1e46"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:01:45 crc kubenswrapper[4733]: I1206 06:01:45.196529 4733 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75df2e76-18b2-4bb7-8069-7636be9b1e46-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:45 crc kubenswrapper[4733]: I1206 06:01:45.196558 4733 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75df2e76-18b2-4bb7-8069-7636be9b1e46-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:45 crc kubenswrapper[4733]: I1206 06:01:45.196571 4733 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/75df2e76-18b2-4bb7-8069-7636be9b1e46-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:45 crc kubenswrapper[4733]: I1206 06:01:45.196581 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64qfl\" (UniqueName: 
\"kubernetes.io/projected/75df2e76-18b2-4bb7-8069-7636be9b1e46-kube-api-access-64qfl\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:45 crc kubenswrapper[4733]: I1206 06:01:45.196595 4733 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75df2e76-18b2-4bb7-8069-7636be9b1e46-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:45 crc kubenswrapper[4733]: I1206 06:01:45.196605 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75df2e76-18b2-4bb7-8069-7636be9b1e46-config\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:45 crc kubenswrapper[4733]: I1206 06:01:45.206839 4733 scope.go:117] "RemoveContainer" containerID="f7c70437ae525c406ed3aa166edd47a17c592a9ea7810ce18bb85c0aefdb8697" Dec 06 06:01:45 crc kubenswrapper[4733]: I1206 06:01:45.238032 4733 scope.go:117] "RemoveContainer" containerID="ba86db776b4568a6ae8f6860617f9499056be008b5ecc533c40269b52897a465" Dec 06 06:01:45 crc kubenswrapper[4733]: E1206 06:01:45.238742 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba86db776b4568a6ae8f6860617f9499056be008b5ecc533c40269b52897a465\": container with ID starting with ba86db776b4568a6ae8f6860617f9499056be008b5ecc533c40269b52897a465 not found: ID does not exist" containerID="ba86db776b4568a6ae8f6860617f9499056be008b5ecc533c40269b52897a465" Dec 06 06:01:45 crc kubenswrapper[4733]: I1206 06:01:45.238783 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba86db776b4568a6ae8f6860617f9499056be008b5ecc533c40269b52897a465"} err="failed to get container status \"ba86db776b4568a6ae8f6860617f9499056be008b5ecc533c40269b52897a465\": rpc error: code = NotFound desc = could not find container \"ba86db776b4568a6ae8f6860617f9499056be008b5ecc533c40269b52897a465\": container with ID starting with 
ba86db776b4568a6ae8f6860617f9499056be008b5ecc533c40269b52897a465 not found: ID does not exist" Dec 06 06:01:45 crc kubenswrapper[4733]: I1206 06:01:45.238810 4733 scope.go:117] "RemoveContainer" containerID="f7c70437ae525c406ed3aa166edd47a17c592a9ea7810ce18bb85c0aefdb8697" Dec 06 06:01:45 crc kubenswrapper[4733]: E1206 06:01:45.239415 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7c70437ae525c406ed3aa166edd47a17c592a9ea7810ce18bb85c0aefdb8697\": container with ID starting with f7c70437ae525c406ed3aa166edd47a17c592a9ea7810ce18bb85c0aefdb8697 not found: ID does not exist" containerID="f7c70437ae525c406ed3aa166edd47a17c592a9ea7810ce18bb85c0aefdb8697" Dec 06 06:01:45 crc kubenswrapper[4733]: I1206 06:01:45.239472 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7c70437ae525c406ed3aa166edd47a17c592a9ea7810ce18bb85c0aefdb8697"} err="failed to get container status \"f7c70437ae525c406ed3aa166edd47a17c592a9ea7810ce18bb85c0aefdb8697\": rpc error: code = NotFound desc = could not find container \"f7c70437ae525c406ed3aa166edd47a17c592a9ea7810ce18bb85c0aefdb8697\": container with ID starting with f7c70437ae525c406ed3aa166edd47a17c592a9ea7810ce18bb85c0aefdb8697 not found: ID does not exist" Dec 06 06:01:45 crc kubenswrapper[4733]: I1206 06:01:45.431257 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f44678f55-nshc4"] Dec 06 06:01:45 crc kubenswrapper[4733]: I1206 06:01:45.439121 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6f44678f55-nshc4"] Dec 06 06:01:45 crc kubenswrapper[4733]: I1206 06:01:45.509177 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 06 06:01:45 crc kubenswrapper[4733]: I1206 06:01:45.509349 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" 
podUID="c00912c1-c2a3-44a7-a71e-e1e123680351" containerName="kube-state-metrics" containerID="cri-o://ee46518bb06c1465e25ddbb74b813525900c3989fbe6edbdf6f8ab0f4104f3de" gracePeriod=30 Dec 06 06:01:45 crc kubenswrapper[4733]: I1206 06:01:45.934096 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.017334 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2c9c\" (UniqueName: \"kubernetes.io/projected/c00912c1-c2a3-44a7-a71e-e1e123680351-kube-api-access-c2c9c\") pod \"c00912c1-c2a3-44a7-a71e-e1e123680351\" (UID: \"c00912c1-c2a3-44a7-a71e-e1e123680351\") " Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.022909 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c00912c1-c2a3-44a7-a71e-e1e123680351-kube-api-access-c2c9c" (OuterVolumeSpecName: "kube-api-access-c2c9c") pod "c00912c1-c2a3-44a7-a71e-e1e123680351" (UID: "c00912c1-c2a3-44a7-a71e-e1e123680351"). InnerVolumeSpecName "kube-api-access-c2c9c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.055666 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.108619 4733 generic.go:334] "Generic (PLEG): container finished" podID="c00912c1-c2a3-44a7-a71e-e1e123680351" containerID="ee46518bb06c1465e25ddbb74b813525900c3989fbe6edbdf6f8ab0f4104f3de" exitCode=2 Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.108677 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.108735 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c00912c1-c2a3-44a7-a71e-e1e123680351","Type":"ContainerDied","Data":"ee46518bb06c1465e25ddbb74b813525900c3989fbe6edbdf6f8ab0f4104f3de"} Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.108770 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c00912c1-c2a3-44a7-a71e-e1e123680351","Type":"ContainerDied","Data":"ab2e1643e64ed15266e7eb48172bb1e5544d017fd7343fd850584cd5c17e4ef8"} Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.108788 4733 scope.go:117] "RemoveContainer" containerID="ee46518bb06c1465e25ddbb74b813525900c3989fbe6edbdf6f8ab0f4104f3de" Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.112920 4733 generic.go:334] "Generic (PLEG): container finished" podID="63b655fb-451c-4bc2-ad3e-665b3fdd61b1" containerID="e1eb879f8b70f5df5eef8ac58df75d7b74d3a05f0dcc5808c0d429e574cfdabf" exitCode=0 Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.112975 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.113049 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"63b655fb-451c-4bc2-ad3e-665b3fdd61b1","Type":"ContainerDied","Data":"e1eb879f8b70f5df5eef8ac58df75d7b74d3a05f0dcc5808c0d429e574cfdabf"} Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.113112 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"63b655fb-451c-4bc2-ad3e-665b3fdd61b1","Type":"ContainerDied","Data":"63cc48146e37529870245ff7bfb8f44da94de15a753e0508a4f020046f96e130"} Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.115646 4733 generic.go:334] "Generic (PLEG): container finished" podID="2e9b28fd-e646-4bb8-8d79-bba2898da8f3" containerID="6979f3b1a7ab626a50824624495c7b4b4a35da56408b735e9bfccb56613550b2" exitCode=143 Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.115695 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2e9b28fd-e646-4bb8-8d79-bba2898da8f3","Type":"ContainerDied","Data":"6979f3b1a7ab626a50824624495c7b4b4a35da56408b735e9bfccb56613550b2"} Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.122226 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63b655fb-451c-4bc2-ad3e-665b3fdd61b1-combined-ca-bundle\") pod \"63b655fb-451c-4bc2-ad3e-665b3fdd61b1\" (UID: \"63b655fb-451c-4bc2-ad3e-665b3fdd61b1\") " Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.122272 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5nkw\" (UniqueName: \"kubernetes.io/projected/63b655fb-451c-4bc2-ad3e-665b3fdd61b1-kube-api-access-k5nkw\") pod \"63b655fb-451c-4bc2-ad3e-665b3fdd61b1\" (UID: \"63b655fb-451c-4bc2-ad3e-665b3fdd61b1\") " Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.122386 
4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63b655fb-451c-4bc2-ad3e-665b3fdd61b1-config-data\") pod \"63b655fb-451c-4bc2-ad3e-665b3fdd61b1\" (UID: \"63b655fb-451c-4bc2-ad3e-665b3fdd61b1\") " Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.123076 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2c9c\" (UniqueName: \"kubernetes.io/projected/c00912c1-c2a3-44a7-a71e-e1e123680351-kube-api-access-c2c9c\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.138632 4733 scope.go:117] "RemoveContainer" containerID="ee46518bb06c1465e25ddbb74b813525900c3989fbe6edbdf6f8ab0f4104f3de" Dec 06 06:01:46 crc kubenswrapper[4733]: E1206 06:01:46.139358 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee46518bb06c1465e25ddbb74b813525900c3989fbe6edbdf6f8ab0f4104f3de\": container with ID starting with ee46518bb06c1465e25ddbb74b813525900c3989fbe6edbdf6f8ab0f4104f3de not found: ID does not exist" containerID="ee46518bb06c1465e25ddbb74b813525900c3989fbe6edbdf6f8ab0f4104f3de" Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.139390 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee46518bb06c1465e25ddbb74b813525900c3989fbe6edbdf6f8ab0f4104f3de"} err="failed to get container status \"ee46518bb06c1465e25ddbb74b813525900c3989fbe6edbdf6f8ab0f4104f3de\": rpc error: code = NotFound desc = could not find container \"ee46518bb06c1465e25ddbb74b813525900c3989fbe6edbdf6f8ab0f4104f3de\": container with ID starting with ee46518bb06c1465e25ddbb74b813525900c3989fbe6edbdf6f8ab0f4104f3de not found: ID does not exist" Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.139407 4733 scope.go:117] "RemoveContainer" containerID="e1eb879f8b70f5df5eef8ac58df75d7b74d3a05f0dcc5808c0d429e574cfdabf" Dec 06 06:01:46 crc 
kubenswrapper[4733]: I1206 06:01:46.140979 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63b655fb-451c-4bc2-ad3e-665b3fdd61b1-kube-api-access-k5nkw" (OuterVolumeSpecName: "kube-api-access-k5nkw") pod "63b655fb-451c-4bc2-ad3e-665b3fdd61b1" (UID: "63b655fb-451c-4bc2-ad3e-665b3fdd61b1"). InnerVolumeSpecName "kube-api-access-k5nkw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.146741 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.168991 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63b655fb-451c-4bc2-ad3e-665b3fdd61b1-config-data" (OuterVolumeSpecName: "config-data") pod "63b655fb-451c-4bc2-ad3e-665b3fdd61b1" (UID: "63b655fb-451c-4bc2-ad3e-665b3fdd61b1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.170398 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.172352 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63b655fb-451c-4bc2-ad3e-665b3fdd61b1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "63b655fb-451c-4bc2-ad3e-665b3fdd61b1" (UID: "63b655fb-451c-4bc2-ad3e-665b3fdd61b1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.185988 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 06 06:01:46 crc kubenswrapper[4733]: E1206 06:01:46.186569 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63b655fb-451c-4bc2-ad3e-665b3fdd61b1" containerName="nova-scheduler-scheduler" Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.186592 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="63b655fb-451c-4bc2-ad3e-665b3fdd61b1" containerName="nova-scheduler-scheduler" Dec 06 06:01:46 crc kubenswrapper[4733]: E1206 06:01:46.186647 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c00912c1-c2a3-44a7-a71e-e1e123680351" containerName="kube-state-metrics" Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.186654 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="c00912c1-c2a3-44a7-a71e-e1e123680351" containerName="kube-state-metrics" Dec 06 06:01:46 crc kubenswrapper[4733]: E1206 06:01:46.186665 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3ff2c7d-6450-448f-8824-36d2b8ea0710" containerName="nova-manage" Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.186674 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3ff2c7d-6450-448f-8824-36d2b8ea0710" containerName="nova-manage" Dec 06 06:01:46 crc kubenswrapper[4733]: E1206 06:01:46.186686 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75df2e76-18b2-4bb7-8069-7636be9b1e46" containerName="dnsmasq-dns" Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.186693 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="75df2e76-18b2-4bb7-8069-7636be9b1e46" containerName="dnsmasq-dns" Dec 06 06:01:46 crc kubenswrapper[4733]: E1206 06:01:46.186703 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75df2e76-18b2-4bb7-8069-7636be9b1e46" containerName="init" Dec 
06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.186708 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="75df2e76-18b2-4bb7-8069-7636be9b1e46" containerName="init" Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.186918 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="c00912c1-c2a3-44a7-a71e-e1e123680351" containerName="kube-state-metrics" Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.186936 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="75df2e76-18b2-4bb7-8069-7636be9b1e46" containerName="dnsmasq-dns" Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.186952 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3ff2c7d-6450-448f-8824-36d2b8ea0710" containerName="nova-manage" Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.186961 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="63b655fb-451c-4bc2-ad3e-665b3fdd61b1" containerName="nova-scheduler-scheduler" Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.187728 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.191754 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.192583 4733 scope.go:117] "RemoveContainer" containerID="e1eb879f8b70f5df5eef8ac58df75d7b74d3a05f0dcc5808c0d429e574cfdabf" Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.192641 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.192746 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Dec 06 06:01:46 crc kubenswrapper[4733]: E1206 06:01:46.197023 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1eb879f8b70f5df5eef8ac58df75d7b74d3a05f0dcc5808c0d429e574cfdabf\": container with ID starting with e1eb879f8b70f5df5eef8ac58df75d7b74d3a05f0dcc5808c0d429e574cfdabf not found: ID does not exist" containerID="e1eb879f8b70f5df5eef8ac58df75d7b74d3a05f0dcc5808c0d429e574cfdabf" Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.197063 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1eb879f8b70f5df5eef8ac58df75d7b74d3a05f0dcc5808c0d429e574cfdabf"} err="failed to get container status \"e1eb879f8b70f5df5eef8ac58df75d7b74d3a05f0dcc5808c0d429e574cfdabf\": rpc error: code = NotFound desc = could not find container \"e1eb879f8b70f5df5eef8ac58df75d7b74d3a05f0dcc5808c0d429e574cfdabf\": container with ID starting with e1eb879f8b70f5df5eef8ac58df75d7b74d3a05f0dcc5808c0d429e574cfdabf not found: ID does not exist" Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.225570 4733 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/63b655fb-451c-4bc2-ad3e-665b3fdd61b1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.225602 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5nkw\" (UniqueName: \"kubernetes.io/projected/63b655fb-451c-4bc2-ad3e-665b3fdd61b1-kube-api-access-k5nkw\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.225614 4733 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63b655fb-451c-4bc2-ad3e-665b3fdd61b1-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.327267 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02729323-8acf-44d3-8eec-3194d7531769-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"02729323-8acf-44d3-8eec-3194d7531769\") " pod="openstack/kube-state-metrics-0" Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.327533 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/02729323-8acf-44d3-8eec-3194d7531769-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"02729323-8acf-44d3-8eec-3194d7531769\") " pod="openstack/kube-state-metrics-0" Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.327634 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/02729323-8acf-44d3-8eec-3194d7531769-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"02729323-8acf-44d3-8eec-3194d7531769\") " pod="openstack/kube-state-metrics-0" Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.327719 4733 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2kcw\" (UniqueName: \"kubernetes.io/projected/02729323-8acf-44d3-8eec-3194d7531769-kube-api-access-s2kcw\") pod \"kube-state-metrics-0\" (UID: \"02729323-8acf-44d3-8eec-3194d7531769\") " pod="openstack/kube-state-metrics-0" Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.431812 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/02729323-8acf-44d3-8eec-3194d7531769-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"02729323-8acf-44d3-8eec-3194d7531769\") " pod="openstack/kube-state-metrics-0" Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.431919 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/02729323-8acf-44d3-8eec-3194d7531769-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"02729323-8acf-44d3-8eec-3194d7531769\") " pod="openstack/kube-state-metrics-0" Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.431989 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kcw\" (UniqueName: \"kubernetes.io/projected/02729323-8acf-44d3-8eec-3194d7531769-kube-api-access-s2kcw\") pod \"kube-state-metrics-0\" (UID: \"02729323-8acf-44d3-8eec-3194d7531769\") " pod="openstack/kube-state-metrics-0" Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.432180 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02729323-8acf-44d3-8eec-3194d7531769-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"02729323-8acf-44d3-8eec-3194d7531769\") " pod="openstack/kube-state-metrics-0" Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.452328 4733 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/02729323-8acf-44d3-8eec-3194d7531769-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"02729323-8acf-44d3-8eec-3194d7531769\") " pod="openstack/kube-state-metrics-0" Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.452350 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02729323-8acf-44d3-8eec-3194d7531769-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"02729323-8acf-44d3-8eec-3194d7531769\") " pod="openstack/kube-state-metrics-0" Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.453144 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/02729323-8acf-44d3-8eec-3194d7531769-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"02729323-8acf-44d3-8eec-3194d7531769\") " pod="openstack/kube-state-metrics-0" Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.455982 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kcw\" (UniqueName: \"kubernetes.io/projected/02729323-8acf-44d3-8eec-3194d7531769-kube-api-access-s2kcw\") pod \"kube-state-metrics-0\" (UID: \"02729323-8acf-44d3-8eec-3194d7531769\") " pod="openstack/kube-state-metrics-0" Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.470923 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.496238 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75df2e76-18b2-4bb7-8069-7636be9b1e46" path="/var/lib/kubelet/pods/75df2e76-18b2-4bb7-8069-7636be9b1e46/volumes" Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.496991 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c00912c1-c2a3-44a7-a71e-e1e123680351" 
path="/var/lib/kubelet/pods/c00912c1-c2a3-44a7-a71e-e1e123680351/volumes" Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.500141 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.511158 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.512739 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.514138 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.515904 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.531578 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.639354 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-469j8\" (UniqueName: \"kubernetes.io/projected/4298b279-055f-4e06-812f-358be8714f9e-kube-api-access-469j8\") pod \"nova-scheduler-0\" (UID: \"4298b279-055f-4e06-812f-358be8714f9e\") " pod="openstack/nova-scheduler-0" Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.639574 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4298b279-055f-4e06-812f-358be8714f9e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4298b279-055f-4e06-812f-358be8714f9e\") " pod="openstack/nova-scheduler-0" Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.639681 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/4298b279-055f-4e06-812f-358be8714f9e-config-data\") pod \"nova-scheduler-0\" (UID: \"4298b279-055f-4e06-812f-358be8714f9e\") " pod="openstack/nova-scheduler-0" Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.741899 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4298b279-055f-4e06-812f-358be8714f9e-config-data\") pod \"nova-scheduler-0\" (UID: \"4298b279-055f-4e06-812f-358be8714f9e\") " pod="openstack/nova-scheduler-0" Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.742087 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-469j8\" (UniqueName: \"kubernetes.io/projected/4298b279-055f-4e06-812f-358be8714f9e-kube-api-access-469j8\") pod \"nova-scheduler-0\" (UID: \"4298b279-055f-4e06-812f-358be8714f9e\") " pod="openstack/nova-scheduler-0" Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.742274 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4298b279-055f-4e06-812f-358be8714f9e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4298b279-055f-4e06-812f-358be8714f9e\") " pod="openstack/nova-scheduler-0" Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.747533 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4298b279-055f-4e06-812f-358be8714f9e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4298b279-055f-4e06-812f-358be8714f9e\") " pod="openstack/nova-scheduler-0" Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.757782 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4298b279-055f-4e06-812f-358be8714f9e-config-data\") pod \"nova-scheduler-0\" (UID: \"4298b279-055f-4e06-812f-358be8714f9e\") " 
pod="openstack/nova-scheduler-0" Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.762713 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-469j8\" (UniqueName: \"kubernetes.io/projected/4298b279-055f-4e06-812f-358be8714f9e-kube-api-access-469j8\") pod \"nova-scheduler-0\" (UID: \"4298b279-055f-4e06-812f-358be8714f9e\") " pod="openstack/nova-scheduler-0" Dec 06 06:01:46 crc kubenswrapper[4733]: I1206 06:01:46.835414 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 06 06:01:47 crc kubenswrapper[4733]: I1206 06:01:47.140848 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 06 06:01:47 crc kubenswrapper[4733]: I1206 06:01:47.246367 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 06:01:47 crc kubenswrapper[4733]: W1206 06:01:47.250179 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4298b279_055f_4e06_812f_358be8714f9e.slice/crio-785c1727de17f5195f1a37c97e36ac04d60e405666263028014d5a0762bcd9a3 WatchSource:0}: Error finding container 785c1727de17f5195f1a37c97e36ac04d60e405666263028014d5a0762bcd9a3: Status 404 returned error can't find the container with id 785c1727de17f5195f1a37c97e36ac04d60e405666263028014d5a0762bcd9a3 Dec 06 06:01:48 crc kubenswrapper[4733]: I1206 06:01:48.027739 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 06:01:48 crc kubenswrapper[4733]: I1206 06:01:48.028251 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="78f90d56-a644-41b1-96e1-444f2e9f33a6" containerName="ceilometer-central-agent" containerID="cri-o://471ad16305ff7dd11d6ff8a1300f9ebb587bbd5a2add173f27c850452b80fbca" gracePeriod=30 Dec 06 06:01:48 crc kubenswrapper[4733]: I1206 06:01:48.028379 4733 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="78f90d56-a644-41b1-96e1-444f2e9f33a6" containerName="proxy-httpd" containerID="cri-o://d8b41a540f8399adb498958e09f5d792f073040f62e19c6e222cec1ffecdcb1d" gracePeriod=30 Dec 06 06:01:48 crc kubenswrapper[4733]: I1206 06:01:48.028526 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="78f90d56-a644-41b1-96e1-444f2e9f33a6" containerName="ceilometer-notification-agent" containerID="cri-o://6442d8f4872c0128cf51b414602a7ab9a38b80ac9f07449e578729732a454d60" gracePeriod=30 Dec 06 06:01:48 crc kubenswrapper[4733]: I1206 06:01:48.028600 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="78f90d56-a644-41b1-96e1-444f2e9f33a6" containerName="sg-core" containerID="cri-o://028d82c407eb17260f64bccee9790bc25eb726d59b9dfe842c91fc41fe1e4907" gracePeriod=30 Dec 06 06:01:48 crc kubenswrapper[4733]: I1206 06:01:48.152698 4733 generic.go:334] "Generic (PLEG): container finished" podID="78f90d56-a644-41b1-96e1-444f2e9f33a6" containerID="028d82c407eb17260f64bccee9790bc25eb726d59b9dfe842c91fc41fe1e4907" exitCode=2 Dec 06 06:01:48 crc kubenswrapper[4733]: I1206 06:01:48.152766 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"78f90d56-a644-41b1-96e1-444f2e9f33a6","Type":"ContainerDied","Data":"028d82c407eb17260f64bccee9790bc25eb726d59b9dfe842c91fc41fe1e4907"} Dec 06 06:01:48 crc kubenswrapper[4733]: I1206 06:01:48.154218 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4298b279-055f-4e06-812f-358be8714f9e","Type":"ContainerStarted","Data":"c6e211e752eeba04d1b16667e1fdb1f6acbceed07f668433a36a4e7c98d23c94"} Dec 06 06:01:48 crc kubenswrapper[4733]: I1206 06:01:48.154249 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"4298b279-055f-4e06-812f-358be8714f9e","Type":"ContainerStarted","Data":"785c1727de17f5195f1a37c97e36ac04d60e405666263028014d5a0762bcd9a3"} Dec 06 06:01:48 crc kubenswrapper[4733]: I1206 06:01:48.160901 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"02729323-8acf-44d3-8eec-3194d7531769","Type":"ContainerStarted","Data":"02ad3c4d7d24155ec56104357b1e455d33465e571f059b2e0dd12d8328e24e32"} Dec 06 06:01:48 crc kubenswrapper[4733]: I1206 06:01:48.160945 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"02729323-8acf-44d3-8eec-3194d7531769","Type":"ContainerStarted","Data":"630fd790c9741cea8d58d3c61dfa0d54af800dea8296071afe14fca8c276cedb"} Dec 06 06:01:48 crc kubenswrapper[4733]: I1206 06:01:48.161897 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 06 06:01:48 crc kubenswrapper[4733]: I1206 06:01:48.176662 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.176649178 podStartE2EDuration="2.176649178s" podCreationTimestamp="2025-12-06 06:01:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:01:48.171700292 +0000 UTC m=+1092.036911404" watchObservedRunningTime="2025-12-06 06:01:48.176649178 +0000 UTC m=+1092.041860290" Dec 06 06:01:48 crc kubenswrapper[4733]: I1206 06:01:48.182992 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.924805219 podStartE2EDuration="2.182982017s" podCreationTimestamp="2025-12-06 06:01:46 +0000 UTC" firstStartedPulling="2025-12-06 06:01:47.147644545 +0000 UTC m=+1091.012855657" lastFinishedPulling="2025-12-06 06:01:47.405821344 +0000 UTC m=+1091.271032455" observedRunningTime="2025-12-06 06:01:48.181487737 
+0000 UTC m=+1092.046698848" watchObservedRunningTime="2025-12-06 06:01:48.182982017 +0000 UTC m=+1092.048193129" Dec 06 06:01:48 crc kubenswrapper[4733]: I1206 06:01:48.504796 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63b655fb-451c-4bc2-ad3e-665b3fdd61b1" path="/var/lib/kubelet/pods/63b655fb-451c-4bc2-ad3e-665b3fdd61b1/volumes" Dec 06 06:01:49 crc kubenswrapper[4733]: I1206 06:01:49.173857 4733 generic.go:334] "Generic (PLEG): container finished" podID="78f90d56-a644-41b1-96e1-444f2e9f33a6" containerID="d8b41a540f8399adb498958e09f5d792f073040f62e19c6e222cec1ffecdcb1d" exitCode=0 Dec 06 06:01:49 crc kubenswrapper[4733]: I1206 06:01:49.174199 4733 generic.go:334] "Generic (PLEG): container finished" podID="78f90d56-a644-41b1-96e1-444f2e9f33a6" containerID="471ad16305ff7dd11d6ff8a1300f9ebb587bbd5a2add173f27c850452b80fbca" exitCode=0 Dec 06 06:01:49 crc kubenswrapper[4733]: I1206 06:01:49.173929 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"78f90d56-a644-41b1-96e1-444f2e9f33a6","Type":"ContainerDied","Data":"d8b41a540f8399adb498958e09f5d792f073040f62e19c6e222cec1ffecdcb1d"} Dec 06 06:01:49 crc kubenswrapper[4733]: I1206 06:01:49.174381 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"78f90d56-a644-41b1-96e1-444f2e9f33a6","Type":"ContainerDied","Data":"471ad16305ff7dd11d6ff8a1300f9ebb587bbd5a2add173f27c850452b80fbca"} Dec 06 06:01:50 crc kubenswrapper[4733]: I1206 06:01:50.941149 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 06 06:01:51 crc kubenswrapper[4733]: I1206 06:01:51.034645 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e9b28fd-e646-4bb8-8d79-bba2898da8f3-combined-ca-bundle\") pod \"2e9b28fd-e646-4bb8-8d79-bba2898da8f3\" (UID: \"2e9b28fd-e646-4bb8-8d79-bba2898da8f3\") " Dec 06 06:01:51 crc kubenswrapper[4733]: I1206 06:01:51.034751 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wsd7m\" (UniqueName: \"kubernetes.io/projected/2e9b28fd-e646-4bb8-8d79-bba2898da8f3-kube-api-access-wsd7m\") pod \"2e9b28fd-e646-4bb8-8d79-bba2898da8f3\" (UID: \"2e9b28fd-e646-4bb8-8d79-bba2898da8f3\") " Dec 06 06:01:51 crc kubenswrapper[4733]: I1206 06:01:51.035064 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e9b28fd-e646-4bb8-8d79-bba2898da8f3-config-data\") pod \"2e9b28fd-e646-4bb8-8d79-bba2898da8f3\" (UID: \"2e9b28fd-e646-4bb8-8d79-bba2898da8f3\") " Dec 06 06:01:51 crc kubenswrapper[4733]: I1206 06:01:51.035187 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e9b28fd-e646-4bb8-8d79-bba2898da8f3-logs\") pod \"2e9b28fd-e646-4bb8-8d79-bba2898da8f3\" (UID: \"2e9b28fd-e646-4bb8-8d79-bba2898da8f3\") " Dec 06 06:01:51 crc kubenswrapper[4733]: I1206 06:01:51.035704 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e9b28fd-e646-4bb8-8d79-bba2898da8f3-logs" (OuterVolumeSpecName: "logs") pod "2e9b28fd-e646-4bb8-8d79-bba2898da8f3" (UID: "2e9b28fd-e646-4bb8-8d79-bba2898da8f3"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:01:51 crc kubenswrapper[4733]: I1206 06:01:51.036040 4733 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e9b28fd-e646-4bb8-8d79-bba2898da8f3-logs\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:51 crc kubenswrapper[4733]: I1206 06:01:51.041928 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e9b28fd-e646-4bb8-8d79-bba2898da8f3-kube-api-access-wsd7m" (OuterVolumeSpecName: "kube-api-access-wsd7m") pod "2e9b28fd-e646-4bb8-8d79-bba2898da8f3" (UID: "2e9b28fd-e646-4bb8-8d79-bba2898da8f3"). InnerVolumeSpecName "kube-api-access-wsd7m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:01:51 crc kubenswrapper[4733]: I1206 06:01:51.061522 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e9b28fd-e646-4bb8-8d79-bba2898da8f3-config-data" (OuterVolumeSpecName: "config-data") pod "2e9b28fd-e646-4bb8-8d79-bba2898da8f3" (UID: "2e9b28fd-e646-4bb8-8d79-bba2898da8f3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:01:51 crc kubenswrapper[4733]: I1206 06:01:51.061541 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e9b28fd-e646-4bb8-8d79-bba2898da8f3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2e9b28fd-e646-4bb8-8d79-bba2898da8f3" (UID: "2e9b28fd-e646-4bb8-8d79-bba2898da8f3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:01:51 crc kubenswrapper[4733]: I1206 06:01:51.138753 4733 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e9b28fd-e646-4bb8-8d79-bba2898da8f3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:51 crc kubenswrapper[4733]: I1206 06:01:51.138789 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wsd7m\" (UniqueName: \"kubernetes.io/projected/2e9b28fd-e646-4bb8-8d79-bba2898da8f3-kube-api-access-wsd7m\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:51 crc kubenswrapper[4733]: I1206 06:01:51.138806 4733 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e9b28fd-e646-4bb8-8d79-bba2898da8f3-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:51 crc kubenswrapper[4733]: I1206 06:01:51.197191 4733 generic.go:334] "Generic (PLEG): container finished" podID="2e9b28fd-e646-4bb8-8d79-bba2898da8f3" containerID="94a7841c8c854ae4427ab0aa14dec9b0ea3e40d19d03f19a16b81634f5ab0469" exitCode=0 Dec 06 06:01:51 crc kubenswrapper[4733]: I1206 06:01:51.197253 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2e9b28fd-e646-4bb8-8d79-bba2898da8f3","Type":"ContainerDied","Data":"94a7841c8c854ae4427ab0aa14dec9b0ea3e40d19d03f19a16b81634f5ab0469"} Dec 06 06:01:51 crc kubenswrapper[4733]: I1206 06:01:51.197340 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2e9b28fd-e646-4bb8-8d79-bba2898da8f3","Type":"ContainerDied","Data":"e6789aa6a29bc0261d65e19f399d6c742edbb65a5c86e15537c213c9fe38cf79"} Dec 06 06:01:51 crc kubenswrapper[4733]: I1206 06:01:51.197388 4733 scope.go:117] "RemoveContainer" containerID="94a7841c8c854ae4427ab0aa14dec9b0ea3e40d19d03f19a16b81634f5ab0469" Dec 06 06:01:51 crc kubenswrapper[4733]: I1206 06:01:51.197564 4733 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openstack/nova-api-0" Dec 06 06:01:51 crc kubenswrapper[4733]: I1206 06:01:51.224625 4733 scope.go:117] "RemoveContainer" containerID="6979f3b1a7ab626a50824624495c7b4b4a35da56408b735e9bfccb56613550b2" Dec 06 06:01:51 crc kubenswrapper[4733]: I1206 06:01:51.245589 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 06 06:01:51 crc kubenswrapper[4733]: I1206 06:01:51.259048 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 06 06:01:51 crc kubenswrapper[4733]: I1206 06:01:51.262503 4733 scope.go:117] "RemoveContainer" containerID="94a7841c8c854ae4427ab0aa14dec9b0ea3e40d19d03f19a16b81634f5ab0469" Dec 06 06:01:51 crc kubenswrapper[4733]: E1206 06:01:51.265060 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94a7841c8c854ae4427ab0aa14dec9b0ea3e40d19d03f19a16b81634f5ab0469\": container with ID starting with 94a7841c8c854ae4427ab0aa14dec9b0ea3e40d19d03f19a16b81634f5ab0469 not found: ID does not exist" containerID="94a7841c8c854ae4427ab0aa14dec9b0ea3e40d19d03f19a16b81634f5ab0469" Dec 06 06:01:51 crc kubenswrapper[4733]: I1206 06:01:51.265101 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94a7841c8c854ae4427ab0aa14dec9b0ea3e40d19d03f19a16b81634f5ab0469"} err="failed to get container status \"94a7841c8c854ae4427ab0aa14dec9b0ea3e40d19d03f19a16b81634f5ab0469\": rpc error: code = NotFound desc = could not find container \"94a7841c8c854ae4427ab0aa14dec9b0ea3e40d19d03f19a16b81634f5ab0469\": container with ID starting with 94a7841c8c854ae4427ab0aa14dec9b0ea3e40d19d03f19a16b81634f5ab0469 not found: ID does not exist" Dec 06 06:01:51 crc kubenswrapper[4733]: I1206 06:01:51.265128 4733 scope.go:117] "RemoveContainer" containerID="6979f3b1a7ab626a50824624495c7b4b4a35da56408b735e9bfccb56613550b2" Dec 06 06:01:51 crc kubenswrapper[4733]: E1206 
06:01:51.265855 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6979f3b1a7ab626a50824624495c7b4b4a35da56408b735e9bfccb56613550b2\": container with ID starting with 6979f3b1a7ab626a50824624495c7b4b4a35da56408b735e9bfccb56613550b2 not found: ID does not exist" containerID="6979f3b1a7ab626a50824624495c7b4b4a35da56408b735e9bfccb56613550b2" Dec 06 06:01:51 crc kubenswrapper[4733]: I1206 06:01:51.265893 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6979f3b1a7ab626a50824624495c7b4b4a35da56408b735e9bfccb56613550b2"} err="failed to get container status \"6979f3b1a7ab626a50824624495c7b4b4a35da56408b735e9bfccb56613550b2\": rpc error: code = NotFound desc = could not find container \"6979f3b1a7ab626a50824624495c7b4b4a35da56408b735e9bfccb56613550b2\": container with ID starting with 6979f3b1a7ab626a50824624495c7b4b4a35da56408b735e9bfccb56613550b2 not found: ID does not exist" Dec 06 06:01:51 crc kubenswrapper[4733]: I1206 06:01:51.269166 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 06 06:01:51 crc kubenswrapper[4733]: E1206 06:01:51.269630 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e9b28fd-e646-4bb8-8d79-bba2898da8f3" containerName="nova-api-log" Dec 06 06:01:51 crc kubenswrapper[4733]: I1206 06:01:51.269651 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e9b28fd-e646-4bb8-8d79-bba2898da8f3" containerName="nova-api-log" Dec 06 06:01:51 crc kubenswrapper[4733]: E1206 06:01:51.269706 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e9b28fd-e646-4bb8-8d79-bba2898da8f3" containerName="nova-api-api" Dec 06 06:01:51 crc kubenswrapper[4733]: I1206 06:01:51.269713 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e9b28fd-e646-4bb8-8d79-bba2898da8f3" containerName="nova-api-api" Dec 06 06:01:51 crc kubenswrapper[4733]: I1206 06:01:51.269902 
4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e9b28fd-e646-4bb8-8d79-bba2898da8f3" containerName="nova-api-api" Dec 06 06:01:51 crc kubenswrapper[4733]: I1206 06:01:51.269922 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e9b28fd-e646-4bb8-8d79-bba2898da8f3" containerName="nova-api-log" Dec 06 06:01:51 crc kubenswrapper[4733]: I1206 06:01:51.270898 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 06 06:01:51 crc kubenswrapper[4733]: I1206 06:01:51.272909 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 06 06:01:51 crc kubenswrapper[4733]: I1206 06:01:51.288312 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 06 06:01:51 crc kubenswrapper[4733]: I1206 06:01:51.344469 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3036f17-599f-40b1-8a0f-37f64940d172-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e3036f17-599f-40b1-8a0f-37f64940d172\") " pod="openstack/nova-api-0" Dec 06 06:01:51 crc kubenswrapper[4733]: I1206 06:01:51.344569 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppq5h\" (UniqueName: \"kubernetes.io/projected/e3036f17-599f-40b1-8a0f-37f64940d172-kube-api-access-ppq5h\") pod \"nova-api-0\" (UID: \"e3036f17-599f-40b1-8a0f-37f64940d172\") " pod="openstack/nova-api-0" Dec 06 06:01:51 crc kubenswrapper[4733]: I1206 06:01:51.344761 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3036f17-599f-40b1-8a0f-37f64940d172-logs\") pod \"nova-api-0\" (UID: \"e3036f17-599f-40b1-8a0f-37f64940d172\") " pod="openstack/nova-api-0" Dec 06 06:01:51 crc kubenswrapper[4733]: I1206 06:01:51.344825 4733 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3036f17-599f-40b1-8a0f-37f64940d172-config-data\") pod \"nova-api-0\" (UID: \"e3036f17-599f-40b1-8a0f-37f64940d172\") " pod="openstack/nova-api-0" Dec 06 06:01:51 crc kubenswrapper[4733]: I1206 06:01:51.446725 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3036f17-599f-40b1-8a0f-37f64940d172-logs\") pod \"nova-api-0\" (UID: \"e3036f17-599f-40b1-8a0f-37f64940d172\") " pod="openstack/nova-api-0" Dec 06 06:01:51 crc kubenswrapper[4733]: I1206 06:01:51.446796 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3036f17-599f-40b1-8a0f-37f64940d172-config-data\") pod \"nova-api-0\" (UID: \"e3036f17-599f-40b1-8a0f-37f64940d172\") " pod="openstack/nova-api-0" Dec 06 06:01:51 crc kubenswrapper[4733]: I1206 06:01:51.446976 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3036f17-599f-40b1-8a0f-37f64940d172-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e3036f17-599f-40b1-8a0f-37f64940d172\") " pod="openstack/nova-api-0" Dec 06 06:01:51 crc kubenswrapper[4733]: I1206 06:01:51.447028 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppq5h\" (UniqueName: \"kubernetes.io/projected/e3036f17-599f-40b1-8a0f-37f64940d172-kube-api-access-ppq5h\") pod \"nova-api-0\" (UID: \"e3036f17-599f-40b1-8a0f-37f64940d172\") " pod="openstack/nova-api-0" Dec 06 06:01:51 crc kubenswrapper[4733]: I1206 06:01:51.447324 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3036f17-599f-40b1-8a0f-37f64940d172-logs\") pod \"nova-api-0\" (UID: \"e3036f17-599f-40b1-8a0f-37f64940d172\") " 
pod="openstack/nova-api-0" Dec 06 06:01:51 crc kubenswrapper[4733]: I1206 06:01:51.451015 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3036f17-599f-40b1-8a0f-37f64940d172-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e3036f17-599f-40b1-8a0f-37f64940d172\") " pod="openstack/nova-api-0" Dec 06 06:01:51 crc kubenswrapper[4733]: I1206 06:01:51.451483 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3036f17-599f-40b1-8a0f-37f64940d172-config-data\") pod \"nova-api-0\" (UID: \"e3036f17-599f-40b1-8a0f-37f64940d172\") " pod="openstack/nova-api-0" Dec 06 06:01:51 crc kubenswrapper[4733]: I1206 06:01:51.461740 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppq5h\" (UniqueName: \"kubernetes.io/projected/e3036f17-599f-40b1-8a0f-37f64940d172-kube-api-access-ppq5h\") pod \"nova-api-0\" (UID: \"e3036f17-599f-40b1-8a0f-37f64940d172\") " pod="openstack/nova-api-0" Dec 06 06:01:51 crc kubenswrapper[4733]: I1206 06:01:51.587228 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 06 06:01:51 crc kubenswrapper[4733]: I1206 06:01:51.835806 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 06 06:01:52 crc kubenswrapper[4733]: I1206 06:01:52.033164 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 06 06:01:52 crc kubenswrapper[4733]: W1206 06:01:52.049181 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3036f17_599f_40b1_8a0f_37f64940d172.slice/crio-c921aee6346bc6588602d7f52d7aaefa708ceffd95855e196d838e6c4b29fa9b WatchSource:0}: Error finding container c921aee6346bc6588602d7f52d7aaefa708ceffd95855e196d838e6c4b29fa9b: Status 404 returned error can't find the container with id c921aee6346bc6588602d7f52d7aaefa708ceffd95855e196d838e6c4b29fa9b Dec 06 06:01:52 crc kubenswrapper[4733]: I1206 06:01:52.216635 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e3036f17-599f-40b1-8a0f-37f64940d172","Type":"ContainerStarted","Data":"c921aee6346bc6588602d7f52d7aaefa708ceffd95855e196d838e6c4b29fa9b"} Dec 06 06:01:52 crc kubenswrapper[4733]: I1206 06:01:52.474290 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 06 06:01:52 crc kubenswrapper[4733]: I1206 06:01:52.497387 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e9b28fd-e646-4bb8-8d79-bba2898da8f3" path="/var/lib/kubelet/pods/2e9b28fd-e646-4bb8-8d79-bba2898da8f3/volumes" Dec 06 06:01:53 crc kubenswrapper[4733]: I1206 06:01:53.228918 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e3036f17-599f-40b1-8a0f-37f64940d172","Type":"ContainerStarted","Data":"37d4db8125873aa5fa0631566f4213623bcb883557f21a243160f3541b4cdcd7"} Dec 06 06:01:53 crc kubenswrapper[4733]: I1206 06:01:53.229228 4733 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e3036f17-599f-40b1-8a0f-37f64940d172","Type":"ContainerStarted","Data":"feeec468cc16fed5e4772c1d70ebefaf58331507c7497319ab59211d4cacb648"} Dec 06 06:01:53 crc kubenswrapper[4733]: I1206 06:01:53.251094 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.25107946 podStartE2EDuration="2.25107946s" podCreationTimestamp="2025-12-06 06:01:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:01:53.244579197 +0000 UTC m=+1097.109790308" watchObservedRunningTime="2025-12-06 06:01:53.25107946 +0000 UTC m=+1097.116290572" Dec 06 06:01:54 crc kubenswrapper[4733]: I1206 06:01:54.783053 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 06:01:54 crc kubenswrapper[4733]: I1206 06:01:54.812771 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78f90d56-a644-41b1-96e1-444f2e9f33a6-combined-ca-bundle\") pod \"78f90d56-a644-41b1-96e1-444f2e9f33a6\" (UID: \"78f90d56-a644-41b1-96e1-444f2e9f33a6\") " Dec 06 06:01:54 crc kubenswrapper[4733]: I1206 06:01:54.812827 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/78f90d56-a644-41b1-96e1-444f2e9f33a6-sg-core-conf-yaml\") pod \"78f90d56-a644-41b1-96e1-444f2e9f33a6\" (UID: \"78f90d56-a644-41b1-96e1-444f2e9f33a6\") " Dec 06 06:01:54 crc kubenswrapper[4733]: I1206 06:01:54.812876 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78f90d56-a644-41b1-96e1-444f2e9f33a6-run-httpd\") pod \"78f90d56-a644-41b1-96e1-444f2e9f33a6\" (UID: 
\"78f90d56-a644-41b1-96e1-444f2e9f33a6\") " Dec 06 06:01:54 crc kubenswrapper[4733]: I1206 06:01:54.812897 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxzk2\" (UniqueName: \"kubernetes.io/projected/78f90d56-a644-41b1-96e1-444f2e9f33a6-kube-api-access-kxzk2\") pod \"78f90d56-a644-41b1-96e1-444f2e9f33a6\" (UID: \"78f90d56-a644-41b1-96e1-444f2e9f33a6\") " Dec 06 06:01:54 crc kubenswrapper[4733]: I1206 06:01:54.813122 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78f90d56-a644-41b1-96e1-444f2e9f33a6-scripts\") pod \"78f90d56-a644-41b1-96e1-444f2e9f33a6\" (UID: \"78f90d56-a644-41b1-96e1-444f2e9f33a6\") " Dec 06 06:01:54 crc kubenswrapper[4733]: I1206 06:01:54.813171 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78f90d56-a644-41b1-96e1-444f2e9f33a6-config-data\") pod \"78f90d56-a644-41b1-96e1-444f2e9f33a6\" (UID: \"78f90d56-a644-41b1-96e1-444f2e9f33a6\") " Dec 06 06:01:54 crc kubenswrapper[4733]: I1206 06:01:54.813224 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78f90d56-a644-41b1-96e1-444f2e9f33a6-log-httpd\") pod \"78f90d56-a644-41b1-96e1-444f2e9f33a6\" (UID: \"78f90d56-a644-41b1-96e1-444f2e9f33a6\") " Dec 06 06:01:54 crc kubenswrapper[4733]: I1206 06:01:54.813243 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78f90d56-a644-41b1-96e1-444f2e9f33a6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "78f90d56-a644-41b1-96e1-444f2e9f33a6" (UID: "78f90d56-a644-41b1-96e1-444f2e9f33a6"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:01:54 crc kubenswrapper[4733]: I1206 06:01:54.813575 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78f90d56-a644-41b1-96e1-444f2e9f33a6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "78f90d56-a644-41b1-96e1-444f2e9f33a6" (UID: "78f90d56-a644-41b1-96e1-444f2e9f33a6"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:01:54 crc kubenswrapper[4733]: I1206 06:01:54.814064 4733 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78f90d56-a644-41b1-96e1-444f2e9f33a6-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:54 crc kubenswrapper[4733]: I1206 06:01:54.814082 4733 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78f90d56-a644-41b1-96e1-444f2e9f33a6-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:54 crc kubenswrapper[4733]: I1206 06:01:54.818392 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78f90d56-a644-41b1-96e1-444f2e9f33a6-scripts" (OuterVolumeSpecName: "scripts") pod "78f90d56-a644-41b1-96e1-444f2e9f33a6" (UID: "78f90d56-a644-41b1-96e1-444f2e9f33a6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:01:54 crc kubenswrapper[4733]: I1206 06:01:54.818949 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78f90d56-a644-41b1-96e1-444f2e9f33a6-kube-api-access-kxzk2" (OuterVolumeSpecName: "kube-api-access-kxzk2") pod "78f90d56-a644-41b1-96e1-444f2e9f33a6" (UID: "78f90d56-a644-41b1-96e1-444f2e9f33a6"). InnerVolumeSpecName "kube-api-access-kxzk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:01:54 crc kubenswrapper[4733]: I1206 06:01:54.839007 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78f90d56-a644-41b1-96e1-444f2e9f33a6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "78f90d56-a644-41b1-96e1-444f2e9f33a6" (UID: "78f90d56-a644-41b1-96e1-444f2e9f33a6"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:01:54 crc kubenswrapper[4733]: I1206 06:01:54.889473 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78f90d56-a644-41b1-96e1-444f2e9f33a6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "78f90d56-a644-41b1-96e1-444f2e9f33a6" (UID: "78f90d56-a644-41b1-96e1-444f2e9f33a6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:01:54 crc kubenswrapper[4733]: I1206 06:01:54.897810 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78f90d56-a644-41b1-96e1-444f2e9f33a6-config-data" (OuterVolumeSpecName: "config-data") pod "78f90d56-a644-41b1-96e1-444f2e9f33a6" (UID: "78f90d56-a644-41b1-96e1-444f2e9f33a6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:01:54 crc kubenswrapper[4733]: I1206 06:01:54.915733 4733 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78f90d56-a644-41b1-96e1-444f2e9f33a6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:54 crc kubenswrapper[4733]: I1206 06:01:54.915766 4733 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/78f90d56-a644-41b1-96e1-444f2e9f33a6-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:54 crc kubenswrapper[4733]: I1206 06:01:54.915778 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxzk2\" (UniqueName: \"kubernetes.io/projected/78f90d56-a644-41b1-96e1-444f2e9f33a6-kube-api-access-kxzk2\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:54 crc kubenswrapper[4733]: I1206 06:01:54.915790 4733 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78f90d56-a644-41b1-96e1-444f2e9f33a6-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:54 crc kubenswrapper[4733]: I1206 06:01:54.915799 4733 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78f90d56-a644-41b1-96e1-444f2e9f33a6-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 06:01:55 crc kubenswrapper[4733]: I1206 06:01:55.253366 4733 generic.go:334] "Generic (PLEG): container finished" podID="78f90d56-a644-41b1-96e1-444f2e9f33a6" containerID="6442d8f4872c0128cf51b414602a7ab9a38b80ac9f07449e578729732a454d60" exitCode=0 Dec 06 06:01:55 crc kubenswrapper[4733]: I1206 06:01:55.253466 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"78f90d56-a644-41b1-96e1-444f2e9f33a6","Type":"ContainerDied","Data":"6442d8f4872c0128cf51b414602a7ab9a38b80ac9f07449e578729732a454d60"} Dec 06 06:01:55 crc kubenswrapper[4733]: I1206 
06:01:55.253546 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 06:01:55 crc kubenswrapper[4733]: I1206 06:01:55.253587 4733 scope.go:117] "RemoveContainer" containerID="d8b41a540f8399adb498958e09f5d792f073040f62e19c6e222cec1ffecdcb1d" Dec 06 06:01:55 crc kubenswrapper[4733]: I1206 06:01:55.253567 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"78f90d56-a644-41b1-96e1-444f2e9f33a6","Type":"ContainerDied","Data":"e65647ca881ec1c92d6b3cb1e97151d57dda6174a94070876c1a17a3701750b6"} Dec 06 06:01:55 crc kubenswrapper[4733]: I1206 06:01:55.293544 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 06:01:55 crc kubenswrapper[4733]: I1206 06:01:55.311620 4733 scope.go:117] "RemoveContainer" containerID="028d82c407eb17260f64bccee9790bc25eb726d59b9dfe842c91fc41fe1e4907" Dec 06 06:01:55 crc kubenswrapper[4733]: I1206 06:01:55.335007 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 06 06:01:55 crc kubenswrapper[4733]: I1206 06:01:55.342583 4733 scope.go:117] "RemoveContainer" containerID="6442d8f4872c0128cf51b414602a7ab9a38b80ac9f07449e578729732a454d60" Dec 06 06:01:55 crc kubenswrapper[4733]: I1206 06:01:55.347575 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 06 06:01:55 crc kubenswrapper[4733]: E1206 06:01:55.348041 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78f90d56-a644-41b1-96e1-444f2e9f33a6" containerName="ceilometer-central-agent" Dec 06 06:01:55 crc kubenswrapper[4733]: I1206 06:01:55.348060 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="78f90d56-a644-41b1-96e1-444f2e9f33a6" containerName="ceilometer-central-agent" Dec 06 06:01:55 crc kubenswrapper[4733]: E1206 06:01:55.348083 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78f90d56-a644-41b1-96e1-444f2e9f33a6" 
containerName="sg-core" Dec 06 06:01:55 crc kubenswrapper[4733]: I1206 06:01:55.348090 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="78f90d56-a644-41b1-96e1-444f2e9f33a6" containerName="sg-core" Dec 06 06:01:55 crc kubenswrapper[4733]: E1206 06:01:55.348100 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78f90d56-a644-41b1-96e1-444f2e9f33a6" containerName="proxy-httpd" Dec 06 06:01:55 crc kubenswrapper[4733]: I1206 06:01:55.348106 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="78f90d56-a644-41b1-96e1-444f2e9f33a6" containerName="proxy-httpd" Dec 06 06:01:55 crc kubenswrapper[4733]: E1206 06:01:55.348170 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78f90d56-a644-41b1-96e1-444f2e9f33a6" containerName="ceilometer-notification-agent" Dec 06 06:01:55 crc kubenswrapper[4733]: I1206 06:01:55.348178 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="78f90d56-a644-41b1-96e1-444f2e9f33a6" containerName="ceilometer-notification-agent" Dec 06 06:01:55 crc kubenswrapper[4733]: I1206 06:01:55.348455 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="78f90d56-a644-41b1-96e1-444f2e9f33a6" containerName="proxy-httpd" Dec 06 06:01:55 crc kubenswrapper[4733]: I1206 06:01:55.348478 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="78f90d56-a644-41b1-96e1-444f2e9f33a6" containerName="ceilometer-notification-agent" Dec 06 06:01:55 crc kubenswrapper[4733]: I1206 06:01:55.348495 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="78f90d56-a644-41b1-96e1-444f2e9f33a6" containerName="ceilometer-central-agent" Dec 06 06:01:55 crc kubenswrapper[4733]: I1206 06:01:55.348598 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="78f90d56-a644-41b1-96e1-444f2e9f33a6" containerName="sg-core" Dec 06 06:01:55 crc kubenswrapper[4733]: I1206 06:01:55.350773 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 06:01:55 crc kubenswrapper[4733]: I1206 06:01:55.352827 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 06 06:01:55 crc kubenswrapper[4733]: I1206 06:01:55.353212 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 06 06:01:55 crc kubenswrapper[4733]: I1206 06:01:55.353928 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 06 06:01:55 crc kubenswrapper[4733]: I1206 06:01:55.362829 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 06:01:55 crc kubenswrapper[4733]: I1206 06:01:55.364118 4733 scope.go:117] "RemoveContainer" containerID="471ad16305ff7dd11d6ff8a1300f9ebb587bbd5a2add173f27c850452b80fbca" Dec 06 06:01:55 crc kubenswrapper[4733]: I1206 06:01:55.385921 4733 scope.go:117] "RemoveContainer" containerID="d8b41a540f8399adb498958e09f5d792f073040f62e19c6e222cec1ffecdcb1d" Dec 06 06:01:55 crc kubenswrapper[4733]: E1206 06:01:55.386327 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8b41a540f8399adb498958e09f5d792f073040f62e19c6e222cec1ffecdcb1d\": container with ID starting with d8b41a540f8399adb498958e09f5d792f073040f62e19c6e222cec1ffecdcb1d not found: ID does not exist" containerID="d8b41a540f8399adb498958e09f5d792f073040f62e19c6e222cec1ffecdcb1d" Dec 06 06:01:55 crc kubenswrapper[4733]: I1206 06:01:55.386414 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8b41a540f8399adb498958e09f5d792f073040f62e19c6e222cec1ffecdcb1d"} err="failed to get container status \"d8b41a540f8399adb498958e09f5d792f073040f62e19c6e222cec1ffecdcb1d\": rpc error: code = NotFound desc = could not find container \"d8b41a540f8399adb498958e09f5d792f073040f62e19c6e222cec1ffecdcb1d\": 
container with ID starting with d8b41a540f8399adb498958e09f5d792f073040f62e19c6e222cec1ffecdcb1d not found: ID does not exist" Dec 06 06:01:55 crc kubenswrapper[4733]: I1206 06:01:55.386442 4733 scope.go:117] "RemoveContainer" containerID="028d82c407eb17260f64bccee9790bc25eb726d59b9dfe842c91fc41fe1e4907" Dec 06 06:01:55 crc kubenswrapper[4733]: E1206 06:01:55.386697 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"028d82c407eb17260f64bccee9790bc25eb726d59b9dfe842c91fc41fe1e4907\": container with ID starting with 028d82c407eb17260f64bccee9790bc25eb726d59b9dfe842c91fc41fe1e4907 not found: ID does not exist" containerID="028d82c407eb17260f64bccee9790bc25eb726d59b9dfe842c91fc41fe1e4907" Dec 06 06:01:55 crc kubenswrapper[4733]: I1206 06:01:55.386727 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"028d82c407eb17260f64bccee9790bc25eb726d59b9dfe842c91fc41fe1e4907"} err="failed to get container status \"028d82c407eb17260f64bccee9790bc25eb726d59b9dfe842c91fc41fe1e4907\": rpc error: code = NotFound desc = could not find container \"028d82c407eb17260f64bccee9790bc25eb726d59b9dfe842c91fc41fe1e4907\": container with ID starting with 028d82c407eb17260f64bccee9790bc25eb726d59b9dfe842c91fc41fe1e4907 not found: ID does not exist" Dec 06 06:01:55 crc kubenswrapper[4733]: I1206 06:01:55.386742 4733 scope.go:117] "RemoveContainer" containerID="6442d8f4872c0128cf51b414602a7ab9a38b80ac9f07449e578729732a454d60" Dec 06 06:01:55 crc kubenswrapper[4733]: E1206 06:01:55.386954 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6442d8f4872c0128cf51b414602a7ab9a38b80ac9f07449e578729732a454d60\": container with ID starting with 6442d8f4872c0128cf51b414602a7ab9a38b80ac9f07449e578729732a454d60 not found: ID does not exist" 
containerID="6442d8f4872c0128cf51b414602a7ab9a38b80ac9f07449e578729732a454d60" Dec 06 06:01:55 crc kubenswrapper[4733]: I1206 06:01:55.386975 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6442d8f4872c0128cf51b414602a7ab9a38b80ac9f07449e578729732a454d60"} err="failed to get container status \"6442d8f4872c0128cf51b414602a7ab9a38b80ac9f07449e578729732a454d60\": rpc error: code = NotFound desc = could not find container \"6442d8f4872c0128cf51b414602a7ab9a38b80ac9f07449e578729732a454d60\": container with ID starting with 6442d8f4872c0128cf51b414602a7ab9a38b80ac9f07449e578729732a454d60 not found: ID does not exist" Dec 06 06:01:55 crc kubenswrapper[4733]: I1206 06:01:55.386988 4733 scope.go:117] "RemoveContainer" containerID="471ad16305ff7dd11d6ff8a1300f9ebb587bbd5a2add173f27c850452b80fbca" Dec 06 06:01:55 crc kubenswrapper[4733]: E1206 06:01:55.387191 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"471ad16305ff7dd11d6ff8a1300f9ebb587bbd5a2add173f27c850452b80fbca\": container with ID starting with 471ad16305ff7dd11d6ff8a1300f9ebb587bbd5a2add173f27c850452b80fbca not found: ID does not exist" containerID="471ad16305ff7dd11d6ff8a1300f9ebb587bbd5a2add173f27c850452b80fbca" Dec 06 06:01:55 crc kubenswrapper[4733]: I1206 06:01:55.387209 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"471ad16305ff7dd11d6ff8a1300f9ebb587bbd5a2add173f27c850452b80fbca"} err="failed to get container status \"471ad16305ff7dd11d6ff8a1300f9ebb587bbd5a2add173f27c850452b80fbca\": rpc error: code = NotFound desc = could not find container \"471ad16305ff7dd11d6ff8a1300f9ebb587bbd5a2add173f27c850452b80fbca\": container with ID starting with 471ad16305ff7dd11d6ff8a1300f9ebb587bbd5a2add173f27c850452b80fbca not found: ID does not exist" Dec 06 06:01:55 crc kubenswrapper[4733]: I1206 06:01:55.542215 4733 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0114cbad-3939-4254-98c5-34a0b36b5ff1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0114cbad-3939-4254-98c5-34a0b36b5ff1\") " pod="openstack/ceilometer-0" Dec 06 06:01:55 crc kubenswrapper[4733]: I1206 06:01:55.542390 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0114cbad-3939-4254-98c5-34a0b36b5ff1-config-data\") pod \"ceilometer-0\" (UID: \"0114cbad-3939-4254-98c5-34a0b36b5ff1\") " pod="openstack/ceilometer-0" Dec 06 06:01:55 crc kubenswrapper[4733]: I1206 06:01:55.542428 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0114cbad-3939-4254-98c5-34a0b36b5ff1-log-httpd\") pod \"ceilometer-0\" (UID: \"0114cbad-3939-4254-98c5-34a0b36b5ff1\") " pod="openstack/ceilometer-0" Dec 06 06:01:55 crc kubenswrapper[4733]: I1206 06:01:55.543037 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0114cbad-3939-4254-98c5-34a0b36b5ff1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0114cbad-3939-4254-98c5-34a0b36b5ff1\") " pod="openstack/ceilometer-0" Dec 06 06:01:55 crc kubenswrapper[4733]: I1206 06:01:55.543144 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0114cbad-3939-4254-98c5-34a0b36b5ff1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0114cbad-3939-4254-98c5-34a0b36b5ff1\") " pod="openstack/ceilometer-0" Dec 06 06:01:55 crc kubenswrapper[4733]: I1206 06:01:55.543190 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/0114cbad-3939-4254-98c5-34a0b36b5ff1-run-httpd\") pod \"ceilometer-0\" (UID: \"0114cbad-3939-4254-98c5-34a0b36b5ff1\") " pod="openstack/ceilometer-0" Dec 06 06:01:55 crc kubenswrapper[4733]: I1206 06:01:55.543239 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4bx9\" (UniqueName: \"kubernetes.io/projected/0114cbad-3939-4254-98c5-34a0b36b5ff1-kube-api-access-t4bx9\") pod \"ceilometer-0\" (UID: \"0114cbad-3939-4254-98c5-34a0b36b5ff1\") " pod="openstack/ceilometer-0" Dec 06 06:01:55 crc kubenswrapper[4733]: I1206 06:01:55.543336 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0114cbad-3939-4254-98c5-34a0b36b5ff1-scripts\") pod \"ceilometer-0\" (UID: \"0114cbad-3939-4254-98c5-34a0b36b5ff1\") " pod="openstack/ceilometer-0" Dec 06 06:01:55 crc kubenswrapper[4733]: I1206 06:01:55.645739 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0114cbad-3939-4254-98c5-34a0b36b5ff1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0114cbad-3939-4254-98c5-34a0b36b5ff1\") " pod="openstack/ceilometer-0" Dec 06 06:01:55 crc kubenswrapper[4733]: I1206 06:01:55.645817 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0114cbad-3939-4254-98c5-34a0b36b5ff1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0114cbad-3939-4254-98c5-34a0b36b5ff1\") " pod="openstack/ceilometer-0" Dec 06 06:01:55 crc kubenswrapper[4733]: I1206 06:01:55.645865 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0114cbad-3939-4254-98c5-34a0b36b5ff1-run-httpd\") pod \"ceilometer-0\" (UID: \"0114cbad-3939-4254-98c5-34a0b36b5ff1\") " pod="openstack/ceilometer-0" Dec 
06 06:01:55 crc kubenswrapper[4733]: I1206 06:01:55.645902 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4bx9\" (UniqueName: \"kubernetes.io/projected/0114cbad-3939-4254-98c5-34a0b36b5ff1-kube-api-access-t4bx9\") pod \"ceilometer-0\" (UID: \"0114cbad-3939-4254-98c5-34a0b36b5ff1\") " pod="openstack/ceilometer-0" Dec 06 06:01:55 crc kubenswrapper[4733]: I1206 06:01:55.645927 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0114cbad-3939-4254-98c5-34a0b36b5ff1-scripts\") pod \"ceilometer-0\" (UID: \"0114cbad-3939-4254-98c5-34a0b36b5ff1\") " pod="openstack/ceilometer-0" Dec 06 06:01:55 crc kubenswrapper[4733]: I1206 06:01:55.645986 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0114cbad-3939-4254-98c5-34a0b36b5ff1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0114cbad-3939-4254-98c5-34a0b36b5ff1\") " pod="openstack/ceilometer-0" Dec 06 06:01:55 crc kubenswrapper[4733]: I1206 06:01:55.646052 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0114cbad-3939-4254-98c5-34a0b36b5ff1-config-data\") pod \"ceilometer-0\" (UID: \"0114cbad-3939-4254-98c5-34a0b36b5ff1\") " pod="openstack/ceilometer-0" Dec 06 06:01:55 crc kubenswrapper[4733]: I1206 06:01:55.646098 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0114cbad-3939-4254-98c5-34a0b36b5ff1-log-httpd\") pod \"ceilometer-0\" (UID: \"0114cbad-3939-4254-98c5-34a0b36b5ff1\") " pod="openstack/ceilometer-0" Dec 06 06:01:55 crc kubenswrapper[4733]: I1206 06:01:55.646756 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/0114cbad-3939-4254-98c5-34a0b36b5ff1-run-httpd\") pod \"ceilometer-0\" (UID: \"0114cbad-3939-4254-98c5-34a0b36b5ff1\") " pod="openstack/ceilometer-0" Dec 06 06:01:55 crc kubenswrapper[4733]: I1206 06:01:55.647677 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0114cbad-3939-4254-98c5-34a0b36b5ff1-log-httpd\") pod \"ceilometer-0\" (UID: \"0114cbad-3939-4254-98c5-34a0b36b5ff1\") " pod="openstack/ceilometer-0" Dec 06 06:01:55 crc kubenswrapper[4733]: I1206 06:01:55.653140 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0114cbad-3939-4254-98c5-34a0b36b5ff1-config-data\") pod \"ceilometer-0\" (UID: \"0114cbad-3939-4254-98c5-34a0b36b5ff1\") " pod="openstack/ceilometer-0" Dec 06 06:01:55 crc kubenswrapper[4733]: I1206 06:01:55.653336 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0114cbad-3939-4254-98c5-34a0b36b5ff1-scripts\") pod \"ceilometer-0\" (UID: \"0114cbad-3939-4254-98c5-34a0b36b5ff1\") " pod="openstack/ceilometer-0" Dec 06 06:01:55 crc kubenswrapper[4733]: I1206 06:01:55.653618 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0114cbad-3939-4254-98c5-34a0b36b5ff1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0114cbad-3939-4254-98c5-34a0b36b5ff1\") " pod="openstack/ceilometer-0" Dec 06 06:01:55 crc kubenswrapper[4733]: I1206 06:01:55.653688 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0114cbad-3939-4254-98c5-34a0b36b5ff1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0114cbad-3939-4254-98c5-34a0b36b5ff1\") " pod="openstack/ceilometer-0" Dec 06 06:01:55 crc kubenswrapper[4733]: I1206 06:01:55.654278 4733 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0114cbad-3939-4254-98c5-34a0b36b5ff1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0114cbad-3939-4254-98c5-34a0b36b5ff1\") " pod="openstack/ceilometer-0" Dec 06 06:01:55 crc kubenswrapper[4733]: I1206 06:01:55.663364 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4bx9\" (UniqueName: \"kubernetes.io/projected/0114cbad-3939-4254-98c5-34a0b36b5ff1-kube-api-access-t4bx9\") pod \"ceilometer-0\" (UID: \"0114cbad-3939-4254-98c5-34a0b36b5ff1\") " pod="openstack/ceilometer-0" Dec 06 06:01:55 crc kubenswrapper[4733]: I1206 06:01:55.668955 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 06:01:56 crc kubenswrapper[4733]: I1206 06:01:56.110154 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 06:01:56 crc kubenswrapper[4733]: I1206 06:01:56.267543 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0114cbad-3939-4254-98c5-34a0b36b5ff1","Type":"ContainerStarted","Data":"ebe1007c2fb1fcf3ee6de872bffe71a9bf6bd74953e403cc1217648c863edaa2"} Dec 06 06:01:56 crc kubenswrapper[4733]: I1206 06:01:56.495434 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78f90d56-a644-41b1-96e1-444f2e9f33a6" path="/var/lib/kubelet/pods/78f90d56-a644-41b1-96e1-444f2e9f33a6/volumes" Dec 06 06:01:56 crc kubenswrapper[4733]: I1206 06:01:56.522280 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 06 06:01:56 crc kubenswrapper[4733]: I1206 06:01:56.835112 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 06 06:01:56 crc kubenswrapper[4733]: I1206 06:01:56.862002 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" 
Dec 06 06:01:57 crc kubenswrapper[4733]: I1206 06:01:57.285604 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0114cbad-3939-4254-98c5-34a0b36b5ff1","Type":"ContainerStarted","Data":"db992e2e52ac48178f7414bd5f263625ff531c1d4b821d4dad79c3a3750b0d0b"} Dec 06 06:01:57 crc kubenswrapper[4733]: I1206 06:01:57.312488 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 06 06:01:58 crc kubenswrapper[4733]: I1206 06:01:58.300290 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0114cbad-3939-4254-98c5-34a0b36b5ff1","Type":"ContainerStarted","Data":"bd604da8f8465377c82b5717898a958e01e384be783d9c569fb0c1886cc70fd8"} Dec 06 06:01:59 crc kubenswrapper[4733]: I1206 06:01:59.319478 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0114cbad-3939-4254-98c5-34a0b36b5ff1","Type":"ContainerStarted","Data":"0247f849a058d728a5ccefa2f1ef9b17837a2bb7bf303c24e8349ffeac1b7efd"} Dec 06 06:02:00 crc kubenswrapper[4733]: I1206 06:02:00.332138 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0114cbad-3939-4254-98c5-34a0b36b5ff1","Type":"ContainerStarted","Data":"55d78d5a74118ad32535a6abc87392ac1e1db1c2fc8830f1f5a81e1715e4f032"} Dec 06 06:02:00 crc kubenswrapper[4733]: I1206 06:02:00.332852 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 06 06:02:00 crc kubenswrapper[4733]: I1206 06:02:00.364567 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.597289046 podStartE2EDuration="5.364536133s" podCreationTimestamp="2025-12-06 06:01:55 +0000 UTC" firstStartedPulling="2025-12-06 06:01:56.121720704 +0000 UTC m=+1099.986931815" lastFinishedPulling="2025-12-06 06:01:59.888967791 +0000 UTC m=+1103.754178902" 
observedRunningTime="2025-12-06 06:02:00.352938456 +0000 UTC m=+1104.218149566" watchObservedRunningTime="2025-12-06 06:02:00.364536133 +0000 UTC m=+1104.229747245" Dec 06 06:02:01 crc kubenswrapper[4733]: I1206 06:02:01.587632 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 06 06:02:01 crc kubenswrapper[4733]: I1206 06:02:01.587961 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 06 06:02:02 crc kubenswrapper[4733]: I1206 06:02:02.670521 4733 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e3036f17-599f-40b1-8a0f-37f64940d172" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.190:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 06 06:02:02 crc kubenswrapper[4733]: I1206 06:02:02.670709 4733 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e3036f17-599f-40b1-8a0f-37f64940d172" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.190:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.409916 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.412707 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.425386 4733 generic.go:334] "Generic (PLEG): container finished" podID="1cda4590-6b0b-4213-9f39-21056b6bc142" containerID="81ea0683ade4b2e2c4e1656c1839db55ec1510d4e6cb6763b0a7d4a5c4b9e8de" exitCode=137 Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.425427 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1cda4590-6b0b-4213-9f39-21056b6bc142","Type":"ContainerDied","Data":"81ea0683ade4b2e2c4e1656c1839db55ec1510d4e6cb6763b0a7d4a5c4b9e8de"} Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.425468 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.425509 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1cda4590-6b0b-4213-9f39-21056b6bc142","Type":"ContainerDied","Data":"3cc45cee44eee47162ae63fb52f1dfa826d0221363f8c2b6002ea491267840dc"} Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.425535 4733 scope.go:117] "RemoveContainer" containerID="81ea0683ade4b2e2c4e1656c1839db55ec1510d4e6cb6763b0a7d4a5c4b9e8de" Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.429211 4733 generic.go:334] "Generic (PLEG): container finished" podID="9d609f49-867e-48f0-a336-45e16b4b718d" containerID="a80e1b03e664c6c6fda8bf9814be8a6603c8079d650941d4003d0ed4b15f4b6e" exitCode=137 Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.429250 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"9d609f49-867e-48f0-a336-45e16b4b718d","Type":"ContainerDied","Data":"a80e1b03e664c6c6fda8bf9814be8a6603c8079d650941d4003d0ed4b15f4b6e"} Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.429284 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"9d609f49-867e-48f0-a336-45e16b4b718d","Type":"ContainerDied","Data":"982ea0352f7b5e106fcaa7a93c89a17fdb634bb321bc88dc1b337d582af7350e"} Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.429386 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.463480 4733 scope.go:117] "RemoveContainer" containerID="76b5f90388ad3646a4e5078c0ed28f3a0517b9e32876dc2fba036607cd28a316" Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.482919 4733 scope.go:117] "RemoveContainer" containerID="81ea0683ade4b2e2c4e1656c1839db55ec1510d4e6cb6763b0a7d4a5c4b9e8de" Dec 06 06:02:09 crc kubenswrapper[4733]: E1206 06:02:09.483370 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81ea0683ade4b2e2c4e1656c1839db55ec1510d4e6cb6763b0a7d4a5c4b9e8de\": container with ID starting with 81ea0683ade4b2e2c4e1656c1839db55ec1510d4e6cb6763b0a7d4a5c4b9e8de not found: ID does not exist" containerID="81ea0683ade4b2e2c4e1656c1839db55ec1510d4e6cb6763b0a7d4a5c4b9e8de" Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.483417 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81ea0683ade4b2e2c4e1656c1839db55ec1510d4e6cb6763b0a7d4a5c4b9e8de"} err="failed to get container status \"81ea0683ade4b2e2c4e1656c1839db55ec1510d4e6cb6763b0a7d4a5c4b9e8de\": rpc error: code = NotFound desc = could not find container \"81ea0683ade4b2e2c4e1656c1839db55ec1510d4e6cb6763b0a7d4a5c4b9e8de\": container with ID starting with 81ea0683ade4b2e2c4e1656c1839db55ec1510d4e6cb6763b0a7d4a5c4b9e8de not found: ID does not exist" Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.483443 4733 scope.go:117] "RemoveContainer" containerID="76b5f90388ad3646a4e5078c0ed28f3a0517b9e32876dc2fba036607cd28a316" Dec 06 06:02:09 crc kubenswrapper[4733]: E1206 06:02:09.483883 4733 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76b5f90388ad3646a4e5078c0ed28f3a0517b9e32876dc2fba036607cd28a316\": container with ID starting with 76b5f90388ad3646a4e5078c0ed28f3a0517b9e32876dc2fba036607cd28a316 not found: ID does not exist" containerID="76b5f90388ad3646a4e5078c0ed28f3a0517b9e32876dc2fba036607cd28a316" Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.483920 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76b5f90388ad3646a4e5078c0ed28f3a0517b9e32876dc2fba036607cd28a316"} err="failed to get container status \"76b5f90388ad3646a4e5078c0ed28f3a0517b9e32876dc2fba036607cd28a316\": rpc error: code = NotFound desc = could not find container \"76b5f90388ad3646a4e5078c0ed28f3a0517b9e32876dc2fba036607cd28a316\": container with ID starting with 76b5f90388ad3646a4e5078c0ed28f3a0517b9e32876dc2fba036607cd28a316 not found: ID does not exist" Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.483946 4733 scope.go:117] "RemoveContainer" containerID="a80e1b03e664c6c6fda8bf9814be8a6603c8079d650941d4003d0ed4b15f4b6e" Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.501574 4733 scope.go:117] "RemoveContainer" containerID="a80e1b03e664c6c6fda8bf9814be8a6603c8079d650941d4003d0ed4b15f4b6e" Dec 06 06:02:09 crc kubenswrapper[4733]: E1206 06:02:09.501911 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a80e1b03e664c6c6fda8bf9814be8a6603c8079d650941d4003d0ed4b15f4b6e\": container with ID starting with a80e1b03e664c6c6fda8bf9814be8a6603c8079d650941d4003d0ed4b15f4b6e not found: ID does not exist" containerID="a80e1b03e664c6c6fda8bf9814be8a6603c8079d650941d4003d0ed4b15f4b6e" Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.501953 4733 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a80e1b03e664c6c6fda8bf9814be8a6603c8079d650941d4003d0ed4b15f4b6e"} err="failed to get container status \"a80e1b03e664c6c6fda8bf9814be8a6603c8079d650941d4003d0ed4b15f4b6e\": rpc error: code = NotFound desc = could not find container \"a80e1b03e664c6c6fda8bf9814be8a6603c8079d650941d4003d0ed4b15f4b6e\": container with ID starting with a80e1b03e664c6c6fda8bf9814be8a6603c8079d650941d4003d0ed4b15f4b6e not found: ID does not exist" Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.551744 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9vk2\" (UniqueName: \"kubernetes.io/projected/1cda4590-6b0b-4213-9f39-21056b6bc142-kube-api-access-t9vk2\") pod \"1cda4590-6b0b-4213-9f39-21056b6bc142\" (UID: \"1cda4590-6b0b-4213-9f39-21056b6bc142\") " Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.551804 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cda4590-6b0b-4213-9f39-21056b6bc142-combined-ca-bundle\") pod \"1cda4590-6b0b-4213-9f39-21056b6bc142\" (UID: \"1cda4590-6b0b-4213-9f39-21056b6bc142\") " Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.551854 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5k4f5\" (UniqueName: \"kubernetes.io/projected/9d609f49-867e-48f0-a336-45e16b4b718d-kube-api-access-5k4f5\") pod \"9d609f49-867e-48f0-a336-45e16b4b718d\" (UID: \"9d609f49-867e-48f0-a336-45e16b4b718d\") " Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.552043 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cda4590-6b0b-4213-9f39-21056b6bc142-config-data\") pod \"1cda4590-6b0b-4213-9f39-21056b6bc142\" (UID: \"1cda4590-6b0b-4213-9f39-21056b6bc142\") " Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.552166 4733 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1cda4590-6b0b-4213-9f39-21056b6bc142-logs\") pod \"1cda4590-6b0b-4213-9f39-21056b6bc142\" (UID: \"1cda4590-6b0b-4213-9f39-21056b6bc142\") " Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.552218 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d609f49-867e-48f0-a336-45e16b4b718d-combined-ca-bundle\") pod \"9d609f49-867e-48f0-a336-45e16b4b718d\" (UID: \"9d609f49-867e-48f0-a336-45e16b4b718d\") " Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.552293 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d609f49-867e-48f0-a336-45e16b4b718d-config-data\") pod \"9d609f49-867e-48f0-a336-45e16b4b718d\" (UID: \"9d609f49-867e-48f0-a336-45e16b4b718d\") " Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.552952 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1cda4590-6b0b-4213-9f39-21056b6bc142-logs" (OuterVolumeSpecName: "logs") pod "1cda4590-6b0b-4213-9f39-21056b6bc142" (UID: "1cda4590-6b0b-4213-9f39-21056b6bc142"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.559069 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d609f49-867e-48f0-a336-45e16b4b718d-kube-api-access-5k4f5" (OuterVolumeSpecName: "kube-api-access-5k4f5") pod "9d609f49-867e-48f0-a336-45e16b4b718d" (UID: "9d609f49-867e-48f0-a336-45e16b4b718d"). InnerVolumeSpecName "kube-api-access-5k4f5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.559668 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cda4590-6b0b-4213-9f39-21056b6bc142-kube-api-access-t9vk2" (OuterVolumeSpecName: "kube-api-access-t9vk2") pod "1cda4590-6b0b-4213-9f39-21056b6bc142" (UID: "1cda4590-6b0b-4213-9f39-21056b6bc142"). InnerVolumeSpecName "kube-api-access-t9vk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.580545 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cda4590-6b0b-4213-9f39-21056b6bc142-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1cda4590-6b0b-4213-9f39-21056b6bc142" (UID: "1cda4590-6b0b-4213-9f39-21056b6bc142"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.581348 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cda4590-6b0b-4213-9f39-21056b6bc142-config-data" (OuterVolumeSpecName: "config-data") pod "1cda4590-6b0b-4213-9f39-21056b6bc142" (UID: "1cda4590-6b0b-4213-9f39-21056b6bc142"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.581948 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d609f49-867e-48f0-a336-45e16b4b718d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9d609f49-867e-48f0-a336-45e16b4b718d" (UID: "9d609f49-867e-48f0-a336-45e16b4b718d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.581975 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d609f49-867e-48f0-a336-45e16b4b718d-config-data" (OuterVolumeSpecName: "config-data") pod "9d609f49-867e-48f0-a336-45e16b4b718d" (UID: "9d609f49-867e-48f0-a336-45e16b4b718d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.656520 4733 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cda4590-6b0b-4213-9f39-21056b6bc142-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.656549 4733 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1cda4590-6b0b-4213-9f39-21056b6bc142-logs\") on node \"crc\" DevicePath \"\"" Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.656561 4733 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d609f49-867e-48f0-a336-45e16b4b718d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.656574 4733 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d609f49-867e-48f0-a336-45e16b4b718d-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.656586 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9vk2\" (UniqueName: \"kubernetes.io/projected/1cda4590-6b0b-4213-9f39-21056b6bc142-kube-api-access-t9vk2\") on node \"crc\" DevicePath \"\"" Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.656596 4733 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1cda4590-6b0b-4213-9f39-21056b6bc142-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.656606 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5k4f5\" (UniqueName: \"kubernetes.io/projected/9d609f49-867e-48f0-a336-45e16b4b718d-kube-api-access-5k4f5\") on node \"crc\" DevicePath \"\"" Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.758169 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.763088 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.768980 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.772671 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.777886 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 06 06:02:09 crc kubenswrapper[4733]: E1206 06:02:09.778239 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cda4590-6b0b-4213-9f39-21056b6bc142" containerName="nova-metadata-log" Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.778261 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cda4590-6b0b-4213-9f39-21056b6bc142" containerName="nova-metadata-log" Dec 06 06:02:09 crc kubenswrapper[4733]: E1206 06:02:09.778280 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d609f49-867e-48f0-a336-45e16b4b718d" containerName="nova-cell1-novncproxy-novncproxy" Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.778286 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d609f49-867e-48f0-a336-45e16b4b718d" containerName="nova-cell1-novncproxy-novncproxy" 
Dec 06 06:02:09 crc kubenswrapper[4733]: E1206 06:02:09.778333 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cda4590-6b0b-4213-9f39-21056b6bc142" containerName="nova-metadata-metadata" Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.778346 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cda4590-6b0b-4213-9f39-21056b6bc142" containerName="nova-metadata-metadata" Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.778515 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cda4590-6b0b-4213-9f39-21056b6bc142" containerName="nova-metadata-metadata" Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.778539 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d609f49-867e-48f0-a336-45e16b4b718d" containerName="nova-cell1-novncproxy-novncproxy" Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.778548 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cda4590-6b0b-4213-9f39-21056b6bc142" containerName="nova-metadata-log" Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.779420 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.780970 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.783693 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.794472 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.806547 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.807777 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.808984 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.815887 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.816073 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.839365 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.862649 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2zgl\" (UniqueName: \"kubernetes.io/projected/13c8ded0-d473-485a-99c5-3cebe4c806af-kube-api-access-x2zgl\") pod \"nova-metadata-0\" (UID: \"13c8ded0-d473-485a-99c5-3cebe4c806af\") " pod="openstack/nova-metadata-0" Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.862747 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/13c8ded0-d473-485a-99c5-3cebe4c806af-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"13c8ded0-d473-485a-99c5-3cebe4c806af\") " pod="openstack/nova-metadata-0" Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.862790 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13c8ded0-d473-485a-99c5-3cebe4c806af-logs\") pod \"nova-metadata-0\" (UID: \"13c8ded0-d473-485a-99c5-3cebe4c806af\") " pod="openstack/nova-metadata-0" Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.862892 4733 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13c8ded0-d473-485a-99c5-3cebe4c806af-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"13c8ded0-d473-485a-99c5-3cebe4c806af\") " pod="openstack/nova-metadata-0" Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.862965 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13c8ded0-d473-485a-99c5-3cebe4c806af-config-data\") pod \"nova-metadata-0\" (UID: \"13c8ded0-d473-485a-99c5-3cebe4c806af\") " pod="openstack/nova-metadata-0" Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.964671 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76922258-485d-4796-9f72-528ec9ec5b24-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"76922258-485d-4796-9f72-528ec9ec5b24\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.964748 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/13c8ded0-d473-485a-99c5-3cebe4c806af-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"13c8ded0-d473-485a-99c5-3cebe4c806af\") " pod="openstack/nova-metadata-0" Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.964791 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13c8ded0-d473-485a-99c5-3cebe4c806af-logs\") pod \"nova-metadata-0\" (UID: \"13c8ded0-d473-485a-99c5-3cebe4c806af\") " pod="openstack/nova-metadata-0" Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.964870 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/13c8ded0-d473-485a-99c5-3cebe4c806af-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"13c8ded0-d473-485a-99c5-3cebe4c806af\") " pod="openstack/nova-metadata-0" Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.964902 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76922258-485d-4796-9f72-528ec9ec5b24-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"76922258-485d-4796-9f72-528ec9ec5b24\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.964954 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/76922258-485d-4796-9f72-528ec9ec5b24-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"76922258-485d-4796-9f72-528ec9ec5b24\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.965145 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13c8ded0-d473-485a-99c5-3cebe4c806af-config-data\") pod \"nova-metadata-0\" (UID: \"13c8ded0-d473-485a-99c5-3cebe4c806af\") " pod="openstack/nova-metadata-0" Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.965324 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2zgl\" (UniqueName: \"kubernetes.io/projected/13c8ded0-d473-485a-99c5-3cebe4c806af-kube-api-access-x2zgl\") pod \"nova-metadata-0\" (UID: \"13c8ded0-d473-485a-99c5-3cebe4c806af\") " pod="openstack/nova-metadata-0" Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.965373 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13c8ded0-d473-485a-99c5-3cebe4c806af-logs\") pod \"nova-metadata-0\" (UID: 
\"13c8ded0-d473-485a-99c5-3cebe4c806af\") " pod="openstack/nova-metadata-0" Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.965375 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbngt\" (UniqueName: \"kubernetes.io/projected/76922258-485d-4796-9f72-528ec9ec5b24-kube-api-access-jbngt\") pod \"nova-cell1-novncproxy-0\" (UID: \"76922258-485d-4796-9f72-528ec9ec5b24\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.965463 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/76922258-485d-4796-9f72-528ec9ec5b24-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"76922258-485d-4796-9f72-528ec9ec5b24\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.969844 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13c8ded0-d473-485a-99c5-3cebe4c806af-config-data\") pod \"nova-metadata-0\" (UID: \"13c8ded0-d473-485a-99c5-3cebe4c806af\") " pod="openstack/nova-metadata-0" Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.969883 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/13c8ded0-d473-485a-99c5-3cebe4c806af-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"13c8ded0-d473-485a-99c5-3cebe4c806af\") " pod="openstack/nova-metadata-0" Dec 06 06:02:09 crc kubenswrapper[4733]: I1206 06:02:09.970886 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13c8ded0-d473-485a-99c5-3cebe4c806af-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"13c8ded0-d473-485a-99c5-3cebe4c806af\") " pod="openstack/nova-metadata-0" Dec 06 06:02:09 crc 
kubenswrapper[4733]: I1206 06:02:09.979611 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2zgl\" (UniqueName: \"kubernetes.io/projected/13c8ded0-d473-485a-99c5-3cebe4c806af-kube-api-access-x2zgl\") pod \"nova-metadata-0\" (UID: \"13c8ded0-d473-485a-99c5-3cebe4c806af\") " pod="openstack/nova-metadata-0" Dec 06 06:02:10 crc kubenswrapper[4733]: I1206 06:02:10.067506 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76922258-485d-4796-9f72-528ec9ec5b24-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"76922258-485d-4796-9f72-528ec9ec5b24\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 06:02:10 crc kubenswrapper[4733]: I1206 06:02:10.067576 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/76922258-485d-4796-9f72-528ec9ec5b24-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"76922258-485d-4796-9f72-528ec9ec5b24\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 06:02:10 crc kubenswrapper[4733]: I1206 06:02:10.067662 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbngt\" (UniqueName: \"kubernetes.io/projected/76922258-485d-4796-9f72-528ec9ec5b24-kube-api-access-jbngt\") pod \"nova-cell1-novncproxy-0\" (UID: \"76922258-485d-4796-9f72-528ec9ec5b24\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 06:02:10 crc kubenswrapper[4733]: I1206 06:02:10.067687 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/76922258-485d-4796-9f72-528ec9ec5b24-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"76922258-485d-4796-9f72-528ec9ec5b24\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 06:02:10 crc kubenswrapper[4733]: I1206 06:02:10.067720 4733 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76922258-485d-4796-9f72-528ec9ec5b24-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"76922258-485d-4796-9f72-528ec9ec5b24\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 06:02:10 crc kubenswrapper[4733]: I1206 06:02:10.071332 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76922258-485d-4796-9f72-528ec9ec5b24-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"76922258-485d-4796-9f72-528ec9ec5b24\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 06:02:10 crc kubenswrapper[4733]: I1206 06:02:10.071469 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/76922258-485d-4796-9f72-528ec9ec5b24-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"76922258-485d-4796-9f72-528ec9ec5b24\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 06:02:10 crc kubenswrapper[4733]: I1206 06:02:10.071690 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/76922258-485d-4796-9f72-528ec9ec5b24-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"76922258-485d-4796-9f72-528ec9ec5b24\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 06:02:10 crc kubenswrapper[4733]: I1206 06:02:10.072642 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76922258-485d-4796-9f72-528ec9ec5b24-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"76922258-485d-4796-9f72-528ec9ec5b24\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 06:02:10 crc kubenswrapper[4733]: I1206 06:02:10.082188 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbngt\" (UniqueName: 
\"kubernetes.io/projected/76922258-485d-4796-9f72-528ec9ec5b24-kube-api-access-jbngt\") pod \"nova-cell1-novncproxy-0\" (UID: \"76922258-485d-4796-9f72-528ec9ec5b24\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 06:02:10 crc kubenswrapper[4733]: I1206 06:02:10.099351 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 06:02:10 crc kubenswrapper[4733]: I1206 06:02:10.120960 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 06 06:02:10 crc kubenswrapper[4733]: I1206 06:02:10.496657 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cda4590-6b0b-4213-9f39-21056b6bc142" path="/var/lib/kubelet/pods/1cda4590-6b0b-4213-9f39-21056b6bc142/volumes" Dec 06 06:02:10 crc kubenswrapper[4733]: I1206 06:02:10.497909 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d609f49-867e-48f0-a336-45e16b4b718d" path="/var/lib/kubelet/pods/9d609f49-867e-48f0-a336-45e16b4b718d/volumes" Dec 06 06:02:10 crc kubenswrapper[4733]: I1206 06:02:10.513487 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 06:02:10 crc kubenswrapper[4733]: W1206 06:02:10.514842 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13c8ded0_d473_485a_99c5_3cebe4c806af.slice/crio-ff128d04b5ae285032893282db2db7971d49809fb95bddf4525a19cb8061d652 WatchSource:0}: Error finding container ff128d04b5ae285032893282db2db7971d49809fb95bddf4525a19cb8061d652: Status 404 returned error can't find the container with id ff128d04b5ae285032893282db2db7971d49809fb95bddf4525a19cb8061d652 Dec 06 06:02:10 crc kubenswrapper[4733]: I1206 06:02:10.564517 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 06 06:02:11 crc kubenswrapper[4733]: I1206 06:02:11.466471 4733 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"76922258-485d-4796-9f72-528ec9ec5b24","Type":"ContainerStarted","Data":"bdaec22394b8b47d484943a2e2a9cb4649d0ba14ed8c954e7c4714cc2fcc9c41"} Dec 06 06:02:11 crc kubenswrapper[4733]: I1206 06:02:11.466936 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"76922258-485d-4796-9f72-528ec9ec5b24","Type":"ContainerStarted","Data":"bd139ccc7bbfa2e4d466596acbfd32583950bb5d16aa1df113f2004798812d60"} Dec 06 06:02:11 crc kubenswrapper[4733]: I1206 06:02:11.469210 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"13c8ded0-d473-485a-99c5-3cebe4c806af","Type":"ContainerStarted","Data":"392946d6f3be3a8d63944354f322f376dd98ae90d3b18278d2f3c0684b6224af"} Dec 06 06:02:11 crc kubenswrapper[4733]: I1206 06:02:11.469297 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"13c8ded0-d473-485a-99c5-3cebe4c806af","Type":"ContainerStarted","Data":"df4842c62f90119ae4d5f9884880a3912b02dbe0f6b8676f4558e03cd60e73e2"} Dec 06 06:02:11 crc kubenswrapper[4733]: I1206 06:02:11.469370 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"13c8ded0-d473-485a-99c5-3cebe4c806af","Type":"ContainerStarted","Data":"ff128d04b5ae285032893282db2db7971d49809fb95bddf4525a19cb8061d652"} Dec 06 06:02:11 crc kubenswrapper[4733]: I1206 06:02:11.482580 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.48256033 podStartE2EDuration="2.48256033s" podCreationTimestamp="2025-12-06 06:02:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:02:11.479741388 +0000 UTC m=+1115.344952499" watchObservedRunningTime="2025-12-06 06:02:11.48256033 +0000 UTC 
m=+1115.347771441" Dec 06 06:02:11 crc kubenswrapper[4733]: I1206 06:02:11.593270 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 06 06:02:11 crc kubenswrapper[4733]: I1206 06:02:11.593942 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 06 06:02:11 crc kubenswrapper[4733]: I1206 06:02:11.594283 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 06 06:02:11 crc kubenswrapper[4733]: I1206 06:02:11.594339 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 06 06:02:11 crc kubenswrapper[4733]: I1206 06:02:11.597977 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 06 06:02:11 crc kubenswrapper[4733]: I1206 06:02:11.605577 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 06 06:02:11 crc kubenswrapper[4733]: I1206 06:02:11.618799 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.618781191 podStartE2EDuration="2.618781191s" podCreationTimestamp="2025-12-06 06:02:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:02:11.507804879 +0000 UTC m=+1115.373015991" watchObservedRunningTime="2025-12-06 06:02:11.618781191 +0000 UTC m=+1115.483992302" Dec 06 06:02:11 crc kubenswrapper[4733]: I1206 06:02:11.761320 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6cb95df969-pdwnz"] Dec 06 06:02:11 crc kubenswrapper[4733]: I1206 06:02:11.763012 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6cb95df969-pdwnz" Dec 06 06:02:11 crc kubenswrapper[4733]: I1206 06:02:11.777937 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cb95df969-pdwnz"] Dec 06 06:02:11 crc kubenswrapper[4733]: I1206 06:02:11.911999 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f21ac21-5975-4205-9564-c2cfe24bd9ea-dns-svc\") pod \"dnsmasq-dns-6cb95df969-pdwnz\" (UID: \"2f21ac21-5975-4205-9564-c2cfe24bd9ea\") " pod="openstack/dnsmasq-dns-6cb95df969-pdwnz" Dec 06 06:02:11 crc kubenswrapper[4733]: I1206 06:02:11.912050 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f21ac21-5975-4205-9564-c2cfe24bd9ea-config\") pod \"dnsmasq-dns-6cb95df969-pdwnz\" (UID: \"2f21ac21-5975-4205-9564-c2cfe24bd9ea\") " pod="openstack/dnsmasq-dns-6cb95df969-pdwnz" Dec 06 06:02:11 crc kubenswrapper[4733]: I1206 06:02:11.912093 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f21ac21-5975-4205-9564-c2cfe24bd9ea-ovsdbserver-sb\") pod \"dnsmasq-dns-6cb95df969-pdwnz\" (UID: \"2f21ac21-5975-4205-9564-c2cfe24bd9ea\") " pod="openstack/dnsmasq-dns-6cb95df969-pdwnz" Dec 06 06:02:11 crc kubenswrapper[4733]: I1206 06:02:11.912182 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2f21ac21-5975-4205-9564-c2cfe24bd9ea-dns-swift-storage-0\") pod \"dnsmasq-dns-6cb95df969-pdwnz\" (UID: \"2f21ac21-5975-4205-9564-c2cfe24bd9ea\") " pod="openstack/dnsmasq-dns-6cb95df969-pdwnz" Dec 06 06:02:11 crc kubenswrapper[4733]: I1206 06:02:11.912206 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f21ac21-5975-4205-9564-c2cfe24bd9ea-ovsdbserver-nb\") pod \"dnsmasq-dns-6cb95df969-pdwnz\" (UID: \"2f21ac21-5975-4205-9564-c2cfe24bd9ea\") " pod="openstack/dnsmasq-dns-6cb95df969-pdwnz" Dec 06 06:02:11 crc kubenswrapper[4733]: I1206 06:02:11.912238 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8697p\" (UniqueName: \"kubernetes.io/projected/2f21ac21-5975-4205-9564-c2cfe24bd9ea-kube-api-access-8697p\") pod \"dnsmasq-dns-6cb95df969-pdwnz\" (UID: \"2f21ac21-5975-4205-9564-c2cfe24bd9ea\") " pod="openstack/dnsmasq-dns-6cb95df969-pdwnz" Dec 06 06:02:12 crc kubenswrapper[4733]: I1206 06:02:12.014005 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f21ac21-5975-4205-9564-c2cfe24bd9ea-ovsdbserver-nb\") pod \"dnsmasq-dns-6cb95df969-pdwnz\" (UID: \"2f21ac21-5975-4205-9564-c2cfe24bd9ea\") " pod="openstack/dnsmasq-dns-6cb95df969-pdwnz" Dec 06 06:02:12 crc kubenswrapper[4733]: I1206 06:02:12.014054 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2f21ac21-5975-4205-9564-c2cfe24bd9ea-dns-swift-storage-0\") pod \"dnsmasq-dns-6cb95df969-pdwnz\" (UID: \"2f21ac21-5975-4205-9564-c2cfe24bd9ea\") " pod="openstack/dnsmasq-dns-6cb95df969-pdwnz" Dec 06 06:02:12 crc kubenswrapper[4733]: I1206 06:02:12.014094 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8697p\" (UniqueName: \"kubernetes.io/projected/2f21ac21-5975-4205-9564-c2cfe24bd9ea-kube-api-access-8697p\") pod \"dnsmasq-dns-6cb95df969-pdwnz\" (UID: \"2f21ac21-5975-4205-9564-c2cfe24bd9ea\") " pod="openstack/dnsmasq-dns-6cb95df969-pdwnz" Dec 06 06:02:12 crc kubenswrapper[4733]: I1206 06:02:12.014212 4733 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f21ac21-5975-4205-9564-c2cfe24bd9ea-dns-svc\") pod \"dnsmasq-dns-6cb95df969-pdwnz\" (UID: \"2f21ac21-5975-4205-9564-c2cfe24bd9ea\") " pod="openstack/dnsmasq-dns-6cb95df969-pdwnz" Dec 06 06:02:12 crc kubenswrapper[4733]: I1206 06:02:12.014246 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f21ac21-5975-4205-9564-c2cfe24bd9ea-config\") pod \"dnsmasq-dns-6cb95df969-pdwnz\" (UID: \"2f21ac21-5975-4205-9564-c2cfe24bd9ea\") " pod="openstack/dnsmasq-dns-6cb95df969-pdwnz" Dec 06 06:02:12 crc kubenswrapper[4733]: I1206 06:02:12.014274 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f21ac21-5975-4205-9564-c2cfe24bd9ea-ovsdbserver-sb\") pod \"dnsmasq-dns-6cb95df969-pdwnz\" (UID: \"2f21ac21-5975-4205-9564-c2cfe24bd9ea\") " pod="openstack/dnsmasq-dns-6cb95df969-pdwnz" Dec 06 06:02:12 crc kubenswrapper[4733]: I1206 06:02:12.015541 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f21ac21-5975-4205-9564-c2cfe24bd9ea-config\") pod \"dnsmasq-dns-6cb95df969-pdwnz\" (UID: \"2f21ac21-5975-4205-9564-c2cfe24bd9ea\") " pod="openstack/dnsmasq-dns-6cb95df969-pdwnz" Dec 06 06:02:12 crc kubenswrapper[4733]: I1206 06:02:12.015570 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f21ac21-5975-4205-9564-c2cfe24bd9ea-dns-svc\") pod \"dnsmasq-dns-6cb95df969-pdwnz\" (UID: \"2f21ac21-5975-4205-9564-c2cfe24bd9ea\") " pod="openstack/dnsmasq-dns-6cb95df969-pdwnz" Dec 06 06:02:12 crc kubenswrapper[4733]: I1206 06:02:12.015541 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/2f21ac21-5975-4205-9564-c2cfe24bd9ea-dns-swift-storage-0\") pod \"dnsmasq-dns-6cb95df969-pdwnz\" (UID: \"2f21ac21-5975-4205-9564-c2cfe24bd9ea\") " pod="openstack/dnsmasq-dns-6cb95df969-pdwnz" Dec 06 06:02:12 crc kubenswrapper[4733]: I1206 06:02:12.015660 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f21ac21-5975-4205-9564-c2cfe24bd9ea-ovsdbserver-sb\") pod \"dnsmasq-dns-6cb95df969-pdwnz\" (UID: \"2f21ac21-5975-4205-9564-c2cfe24bd9ea\") " pod="openstack/dnsmasq-dns-6cb95df969-pdwnz" Dec 06 06:02:12 crc kubenswrapper[4733]: I1206 06:02:12.016687 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f21ac21-5975-4205-9564-c2cfe24bd9ea-ovsdbserver-nb\") pod \"dnsmasq-dns-6cb95df969-pdwnz\" (UID: \"2f21ac21-5975-4205-9564-c2cfe24bd9ea\") " pod="openstack/dnsmasq-dns-6cb95df969-pdwnz" Dec 06 06:02:12 crc kubenswrapper[4733]: I1206 06:02:12.035800 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8697p\" (UniqueName: \"kubernetes.io/projected/2f21ac21-5975-4205-9564-c2cfe24bd9ea-kube-api-access-8697p\") pod \"dnsmasq-dns-6cb95df969-pdwnz\" (UID: \"2f21ac21-5975-4205-9564-c2cfe24bd9ea\") " pod="openstack/dnsmasq-dns-6cb95df969-pdwnz" Dec 06 06:02:12 crc kubenswrapper[4733]: I1206 06:02:12.096685 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6cb95df969-pdwnz" Dec 06 06:02:12 crc kubenswrapper[4733]: I1206 06:02:12.595872 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cb95df969-pdwnz"] Dec 06 06:02:13 crc kubenswrapper[4733]: I1206 06:02:13.491409 4733 generic.go:334] "Generic (PLEG): container finished" podID="2f21ac21-5975-4205-9564-c2cfe24bd9ea" containerID="b54dae8a87702e821d4a6dcaa1c7f570637285800099616179f2c53594853b92" exitCode=0 Dec 06 06:02:13 crc kubenswrapper[4733]: I1206 06:02:13.491618 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cb95df969-pdwnz" event={"ID":"2f21ac21-5975-4205-9564-c2cfe24bd9ea","Type":"ContainerDied","Data":"b54dae8a87702e821d4a6dcaa1c7f570637285800099616179f2c53594853b92"} Dec 06 06:02:13 crc kubenswrapper[4733]: I1206 06:02:13.492600 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cb95df969-pdwnz" event={"ID":"2f21ac21-5975-4205-9564-c2cfe24bd9ea","Type":"ContainerStarted","Data":"2e8ad2760129bc7a602492378f192f5576e03b47ce23e11b1c7fa9ccd608fed7"} Dec 06 06:02:13 crc kubenswrapper[4733]: I1206 06:02:13.702404 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 06 06:02:14 crc kubenswrapper[4733]: I1206 06:02:14.149774 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 06:02:14 crc kubenswrapper[4733]: I1206 06:02:14.150076 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0114cbad-3939-4254-98c5-34a0b36b5ff1" containerName="ceilometer-central-agent" containerID="cri-o://db992e2e52ac48178f7414bd5f263625ff531c1d4b821d4dad79c3a3750b0d0b" gracePeriod=30 Dec 06 06:02:14 crc kubenswrapper[4733]: I1206 06:02:14.150432 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0114cbad-3939-4254-98c5-34a0b36b5ff1" 
containerName="proxy-httpd" containerID="cri-o://55d78d5a74118ad32535a6abc87392ac1e1db1c2fc8830f1f5a81e1715e4f032" gracePeriod=30 Dec 06 06:02:14 crc kubenswrapper[4733]: I1206 06:02:14.150781 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0114cbad-3939-4254-98c5-34a0b36b5ff1" containerName="sg-core" containerID="cri-o://0247f849a058d728a5ccefa2f1ef9b17837a2bb7bf303c24e8349ffeac1b7efd" gracePeriod=30 Dec 06 06:02:14 crc kubenswrapper[4733]: I1206 06:02:14.150603 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0114cbad-3939-4254-98c5-34a0b36b5ff1" containerName="ceilometer-notification-agent" containerID="cri-o://bd604da8f8465377c82b5717898a958e01e384be783d9c569fb0c1886cc70fd8" gracePeriod=30 Dec 06 06:02:14 crc kubenswrapper[4733]: I1206 06:02:14.169754 4733 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="0114cbad-3939-4254-98c5-34a0b36b5ff1" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.191:3000/\": EOF" Dec 06 06:02:14 crc kubenswrapper[4733]: I1206 06:02:14.507785 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cb95df969-pdwnz" event={"ID":"2f21ac21-5975-4205-9564-c2cfe24bd9ea","Type":"ContainerStarted","Data":"d7a3bc0d32b5f0a4d09449c68899e79a0af2bd04e23bd38c3a503690f26937ac"} Dec 06 06:02:14 crc kubenswrapper[4733]: I1206 06:02:14.507983 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6cb95df969-pdwnz" Dec 06 06:02:14 crc kubenswrapper[4733]: I1206 06:02:14.511266 4733 generic.go:334] "Generic (PLEG): container finished" podID="0114cbad-3939-4254-98c5-34a0b36b5ff1" containerID="55d78d5a74118ad32535a6abc87392ac1e1db1c2fc8830f1f5a81e1715e4f032" exitCode=0 Dec 06 06:02:14 crc kubenswrapper[4733]: I1206 06:02:14.511346 4733 generic.go:334] "Generic (PLEG): container finished" 
podID="0114cbad-3939-4254-98c5-34a0b36b5ff1" containerID="0247f849a058d728a5ccefa2f1ef9b17837a2bb7bf303c24e8349ffeac1b7efd" exitCode=2 Dec 06 06:02:14 crc kubenswrapper[4733]: I1206 06:02:14.511357 4733 generic.go:334] "Generic (PLEG): container finished" podID="0114cbad-3939-4254-98c5-34a0b36b5ff1" containerID="db992e2e52ac48178f7414bd5f263625ff531c1d4b821d4dad79c3a3750b0d0b" exitCode=0 Dec 06 06:02:14 crc kubenswrapper[4733]: I1206 06:02:14.511435 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0114cbad-3939-4254-98c5-34a0b36b5ff1","Type":"ContainerDied","Data":"55d78d5a74118ad32535a6abc87392ac1e1db1c2fc8830f1f5a81e1715e4f032"} Dec 06 06:02:14 crc kubenswrapper[4733]: I1206 06:02:14.511481 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0114cbad-3939-4254-98c5-34a0b36b5ff1","Type":"ContainerDied","Data":"0247f849a058d728a5ccefa2f1ef9b17837a2bb7bf303c24e8349ffeac1b7efd"} Dec 06 06:02:14 crc kubenswrapper[4733]: I1206 06:02:14.511498 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0114cbad-3939-4254-98c5-34a0b36b5ff1","Type":"ContainerDied","Data":"db992e2e52ac48178f7414bd5f263625ff531c1d4b821d4dad79c3a3750b0d0b"} Dec 06 06:02:14 crc kubenswrapper[4733]: I1206 06:02:14.511572 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e3036f17-599f-40b1-8a0f-37f64940d172" containerName="nova-api-log" containerID="cri-o://feeec468cc16fed5e4772c1d70ebefaf58331507c7497319ab59211d4cacb648" gracePeriod=30 Dec 06 06:02:14 crc kubenswrapper[4733]: I1206 06:02:14.511654 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e3036f17-599f-40b1-8a0f-37f64940d172" containerName="nova-api-api" containerID="cri-o://37d4db8125873aa5fa0631566f4213623bcb883557f21a243160f3541b4cdcd7" gracePeriod=30 Dec 06 06:02:14 crc 
kubenswrapper[4733]: I1206 06:02:14.540410 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6cb95df969-pdwnz" podStartSLOduration=3.540395 podStartE2EDuration="3.540395s" podCreationTimestamp="2025-12-06 06:02:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:02:14.5317921 +0000 UTC m=+1118.397003212" watchObservedRunningTime="2025-12-06 06:02:14.540395 +0000 UTC m=+1118.405606111" Dec 06 06:02:15 crc kubenswrapper[4733]: I1206 06:02:15.100035 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 06 06:02:15 crc kubenswrapper[4733]: I1206 06:02:15.100660 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 06 06:02:15 crc kubenswrapper[4733]: I1206 06:02:15.121774 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 06 06:02:15 crc kubenswrapper[4733]: I1206 06:02:15.535707 4733 generic.go:334] "Generic (PLEG): container finished" podID="e3036f17-599f-40b1-8a0f-37f64940d172" containerID="feeec468cc16fed5e4772c1d70ebefaf58331507c7497319ab59211d4cacb648" exitCode=143 Dec 06 06:02:15 crc kubenswrapper[4733]: I1206 06:02:15.536681 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e3036f17-599f-40b1-8a0f-37f64940d172","Type":"ContainerDied","Data":"feeec468cc16fed5e4772c1d70ebefaf58331507c7497319ab59211d4cacb648"} Dec 06 06:02:17 crc kubenswrapper[4733]: I1206 06:02:17.418845 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 06:02:17 crc kubenswrapper[4733]: I1206 06:02:17.536034 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0114cbad-3939-4254-98c5-34a0b36b5ff1-run-httpd\") pod \"0114cbad-3939-4254-98c5-34a0b36b5ff1\" (UID: \"0114cbad-3939-4254-98c5-34a0b36b5ff1\") " Dec 06 06:02:17 crc kubenswrapper[4733]: I1206 06:02:17.536150 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0114cbad-3939-4254-98c5-34a0b36b5ff1-sg-core-conf-yaml\") pod \"0114cbad-3939-4254-98c5-34a0b36b5ff1\" (UID: \"0114cbad-3939-4254-98c5-34a0b36b5ff1\") " Dec 06 06:02:17 crc kubenswrapper[4733]: I1206 06:02:17.536226 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0114cbad-3939-4254-98c5-34a0b36b5ff1-scripts\") pod \"0114cbad-3939-4254-98c5-34a0b36b5ff1\" (UID: \"0114cbad-3939-4254-98c5-34a0b36b5ff1\") " Dec 06 06:02:17 crc kubenswrapper[4733]: I1206 06:02:17.536294 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0114cbad-3939-4254-98c5-34a0b36b5ff1-ceilometer-tls-certs\") pod \"0114cbad-3939-4254-98c5-34a0b36b5ff1\" (UID: \"0114cbad-3939-4254-98c5-34a0b36b5ff1\") " Dec 06 06:02:17 crc kubenswrapper[4733]: I1206 06:02:17.536333 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0114cbad-3939-4254-98c5-34a0b36b5ff1-combined-ca-bundle\") pod \"0114cbad-3939-4254-98c5-34a0b36b5ff1\" (UID: \"0114cbad-3939-4254-98c5-34a0b36b5ff1\") " Dec 06 06:02:17 crc kubenswrapper[4733]: I1206 06:02:17.536354 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/0114cbad-3939-4254-98c5-34a0b36b5ff1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0114cbad-3939-4254-98c5-34a0b36b5ff1" (UID: "0114cbad-3939-4254-98c5-34a0b36b5ff1"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:02:17 crc kubenswrapper[4733]: I1206 06:02:17.536392 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0114cbad-3939-4254-98c5-34a0b36b5ff1-config-data\") pod \"0114cbad-3939-4254-98c5-34a0b36b5ff1\" (UID: \"0114cbad-3939-4254-98c5-34a0b36b5ff1\") " Dec 06 06:02:17 crc kubenswrapper[4733]: I1206 06:02:17.536440 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0114cbad-3939-4254-98c5-34a0b36b5ff1-log-httpd\") pod \"0114cbad-3939-4254-98c5-34a0b36b5ff1\" (UID: \"0114cbad-3939-4254-98c5-34a0b36b5ff1\") " Dec 06 06:02:17 crc kubenswrapper[4733]: I1206 06:02:17.536519 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4bx9\" (UniqueName: \"kubernetes.io/projected/0114cbad-3939-4254-98c5-34a0b36b5ff1-kube-api-access-t4bx9\") pod \"0114cbad-3939-4254-98c5-34a0b36b5ff1\" (UID: \"0114cbad-3939-4254-98c5-34a0b36b5ff1\") " Dec 06 06:02:17 crc kubenswrapper[4733]: I1206 06:02:17.537150 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0114cbad-3939-4254-98c5-34a0b36b5ff1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0114cbad-3939-4254-98c5-34a0b36b5ff1" (UID: "0114cbad-3939-4254-98c5-34a0b36b5ff1"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:02:17 crc kubenswrapper[4733]: I1206 06:02:17.537886 4733 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0114cbad-3939-4254-98c5-34a0b36b5ff1-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 06:02:17 crc kubenswrapper[4733]: I1206 06:02:17.537913 4733 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0114cbad-3939-4254-98c5-34a0b36b5ff1-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 06:02:17 crc kubenswrapper[4733]: I1206 06:02:17.541555 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0114cbad-3939-4254-98c5-34a0b36b5ff1-kube-api-access-t4bx9" (OuterVolumeSpecName: "kube-api-access-t4bx9") pod "0114cbad-3939-4254-98c5-34a0b36b5ff1" (UID: "0114cbad-3939-4254-98c5-34a0b36b5ff1"). InnerVolumeSpecName "kube-api-access-t4bx9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:02:17 crc kubenswrapper[4733]: I1206 06:02:17.541988 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0114cbad-3939-4254-98c5-34a0b36b5ff1-scripts" (OuterVolumeSpecName: "scripts") pod "0114cbad-3939-4254-98c5-34a0b36b5ff1" (UID: "0114cbad-3939-4254-98c5-34a0b36b5ff1"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:02:17 crc kubenswrapper[4733]: I1206 06:02:17.562909 4733 generic.go:334] "Generic (PLEG): container finished" podID="0114cbad-3939-4254-98c5-34a0b36b5ff1" containerID="bd604da8f8465377c82b5717898a958e01e384be783d9c569fb0c1886cc70fd8" exitCode=0 Dec 06 06:02:17 crc kubenswrapper[4733]: I1206 06:02:17.562988 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0114cbad-3939-4254-98c5-34a0b36b5ff1","Type":"ContainerDied","Data":"bd604da8f8465377c82b5717898a958e01e384be783d9c569fb0c1886cc70fd8"} Dec 06 06:02:17 crc kubenswrapper[4733]: I1206 06:02:17.563022 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 06:02:17 crc kubenswrapper[4733]: I1206 06:02:17.563043 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0114cbad-3939-4254-98c5-34a0b36b5ff1","Type":"ContainerDied","Data":"ebe1007c2fb1fcf3ee6de872bffe71a9bf6bd74953e403cc1217648c863edaa2"} Dec 06 06:02:17 crc kubenswrapper[4733]: I1206 06:02:17.563077 4733 scope.go:117] "RemoveContainer" containerID="55d78d5a74118ad32535a6abc87392ac1e1db1c2fc8830f1f5a81e1715e4f032" Dec 06 06:02:17 crc kubenswrapper[4733]: I1206 06:02:17.564351 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0114cbad-3939-4254-98c5-34a0b36b5ff1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0114cbad-3939-4254-98c5-34a0b36b5ff1" (UID: "0114cbad-3939-4254-98c5-34a0b36b5ff1"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:02:17 crc kubenswrapper[4733]: I1206 06:02:17.589431 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0114cbad-3939-4254-98c5-34a0b36b5ff1-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "0114cbad-3939-4254-98c5-34a0b36b5ff1" (UID: "0114cbad-3939-4254-98c5-34a0b36b5ff1"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:02:17 crc kubenswrapper[4733]: I1206 06:02:17.609177 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0114cbad-3939-4254-98c5-34a0b36b5ff1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0114cbad-3939-4254-98c5-34a0b36b5ff1" (UID: "0114cbad-3939-4254-98c5-34a0b36b5ff1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:02:17 crc kubenswrapper[4733]: I1206 06:02:17.625822 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0114cbad-3939-4254-98c5-34a0b36b5ff1-config-data" (OuterVolumeSpecName: "config-data") pod "0114cbad-3939-4254-98c5-34a0b36b5ff1" (UID: "0114cbad-3939-4254-98c5-34a0b36b5ff1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:02:17 crc kubenswrapper[4733]: I1206 06:02:17.637894 4733 scope.go:117] "RemoveContainer" containerID="0247f849a058d728a5ccefa2f1ef9b17837a2bb7bf303c24e8349ffeac1b7efd" Dec 06 06:02:17 crc kubenswrapper[4733]: I1206 06:02:17.640765 4733 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0114cbad-3939-4254-98c5-34a0b36b5ff1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 06 06:02:17 crc kubenswrapper[4733]: I1206 06:02:17.640799 4733 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0114cbad-3939-4254-98c5-34a0b36b5ff1-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 06:02:17 crc kubenswrapper[4733]: I1206 06:02:17.640812 4733 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0114cbad-3939-4254-98c5-34a0b36b5ff1-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 06:02:17 crc kubenswrapper[4733]: I1206 06:02:17.640826 4733 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0114cbad-3939-4254-98c5-34a0b36b5ff1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 06:02:17 crc kubenswrapper[4733]: I1206 06:02:17.640837 4733 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0114cbad-3939-4254-98c5-34a0b36b5ff1-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 06:02:17 crc kubenswrapper[4733]: I1206 06:02:17.640847 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4bx9\" (UniqueName: \"kubernetes.io/projected/0114cbad-3939-4254-98c5-34a0b36b5ff1-kube-api-access-t4bx9\") on node \"crc\" DevicePath \"\"" Dec 06 06:02:17 crc kubenswrapper[4733]: I1206 06:02:17.663419 4733 scope.go:117] "RemoveContainer" 
containerID="bd604da8f8465377c82b5717898a958e01e384be783d9c569fb0c1886cc70fd8"
Dec 06 06:02:17 crc kubenswrapper[4733]: I1206 06:02:17.717613 4733 scope.go:117] "RemoveContainer" containerID="db992e2e52ac48178f7414bd5f263625ff531c1d4b821d4dad79c3a3750b0d0b"
Dec 06 06:02:17 crc kubenswrapper[4733]: I1206 06:02:17.755197 4733 scope.go:117] "RemoveContainer" containerID="55d78d5a74118ad32535a6abc87392ac1e1db1c2fc8830f1f5a81e1715e4f032"
Dec 06 06:02:17 crc kubenswrapper[4733]: E1206 06:02:17.755619 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55d78d5a74118ad32535a6abc87392ac1e1db1c2fc8830f1f5a81e1715e4f032\": container with ID starting with 55d78d5a74118ad32535a6abc87392ac1e1db1c2fc8830f1f5a81e1715e4f032 not found: ID does not exist" containerID="55d78d5a74118ad32535a6abc87392ac1e1db1c2fc8830f1f5a81e1715e4f032"
Dec 06 06:02:17 crc kubenswrapper[4733]: I1206 06:02:17.755658 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55d78d5a74118ad32535a6abc87392ac1e1db1c2fc8830f1f5a81e1715e4f032"} err="failed to get container status \"55d78d5a74118ad32535a6abc87392ac1e1db1c2fc8830f1f5a81e1715e4f032\": rpc error: code = NotFound desc = could not find container \"55d78d5a74118ad32535a6abc87392ac1e1db1c2fc8830f1f5a81e1715e4f032\": container with ID starting with 55d78d5a74118ad32535a6abc87392ac1e1db1c2fc8830f1f5a81e1715e4f032 not found: ID does not exist"
Dec 06 06:02:17 crc kubenswrapper[4733]: I1206 06:02:17.755685 4733 scope.go:117] "RemoveContainer" containerID="0247f849a058d728a5ccefa2f1ef9b17837a2bb7bf303c24e8349ffeac1b7efd"
Dec 06 06:02:17 crc kubenswrapper[4733]: E1206 06:02:17.756093 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0247f849a058d728a5ccefa2f1ef9b17837a2bb7bf303c24e8349ffeac1b7efd\": container with ID starting with 0247f849a058d728a5ccefa2f1ef9b17837a2bb7bf303c24e8349ffeac1b7efd not found: ID does not exist" containerID="0247f849a058d728a5ccefa2f1ef9b17837a2bb7bf303c24e8349ffeac1b7efd"
Dec 06 06:02:17 crc kubenswrapper[4733]: I1206 06:02:17.756158 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0247f849a058d728a5ccefa2f1ef9b17837a2bb7bf303c24e8349ffeac1b7efd"} err="failed to get container status \"0247f849a058d728a5ccefa2f1ef9b17837a2bb7bf303c24e8349ffeac1b7efd\": rpc error: code = NotFound desc = could not find container \"0247f849a058d728a5ccefa2f1ef9b17837a2bb7bf303c24e8349ffeac1b7efd\": container with ID starting with 0247f849a058d728a5ccefa2f1ef9b17837a2bb7bf303c24e8349ffeac1b7efd not found: ID does not exist"
Dec 06 06:02:17 crc kubenswrapper[4733]: I1206 06:02:17.756200 4733 scope.go:117] "RemoveContainer" containerID="bd604da8f8465377c82b5717898a958e01e384be783d9c569fb0c1886cc70fd8"
Dec 06 06:02:17 crc kubenswrapper[4733]: E1206 06:02:17.756843 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd604da8f8465377c82b5717898a958e01e384be783d9c569fb0c1886cc70fd8\": container with ID starting with bd604da8f8465377c82b5717898a958e01e384be783d9c569fb0c1886cc70fd8 not found: ID does not exist" containerID="bd604da8f8465377c82b5717898a958e01e384be783d9c569fb0c1886cc70fd8"
Dec 06 06:02:17 crc kubenswrapper[4733]: I1206 06:02:17.756881 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd604da8f8465377c82b5717898a958e01e384be783d9c569fb0c1886cc70fd8"} err="failed to get container status \"bd604da8f8465377c82b5717898a958e01e384be783d9c569fb0c1886cc70fd8\": rpc error: code = NotFound desc = could not find container \"bd604da8f8465377c82b5717898a958e01e384be783d9c569fb0c1886cc70fd8\": container with ID starting with bd604da8f8465377c82b5717898a958e01e384be783d9c569fb0c1886cc70fd8 not found: ID does not exist"
Dec 06 06:02:17 crc kubenswrapper[4733]: I1206 06:02:17.756901 4733 scope.go:117] "RemoveContainer" containerID="db992e2e52ac48178f7414bd5f263625ff531c1d4b821d4dad79c3a3750b0d0b"
Dec 06 06:02:17 crc kubenswrapper[4733]: E1206 06:02:17.757210 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db992e2e52ac48178f7414bd5f263625ff531c1d4b821d4dad79c3a3750b0d0b\": container with ID starting with db992e2e52ac48178f7414bd5f263625ff531c1d4b821d4dad79c3a3750b0d0b not found: ID does not exist" containerID="db992e2e52ac48178f7414bd5f263625ff531c1d4b821d4dad79c3a3750b0d0b"
Dec 06 06:02:17 crc kubenswrapper[4733]: I1206 06:02:17.757250 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db992e2e52ac48178f7414bd5f263625ff531c1d4b821d4dad79c3a3750b0d0b"} err="failed to get container status \"db992e2e52ac48178f7414bd5f263625ff531c1d4b821d4dad79c3a3750b0d0b\": rpc error: code = NotFound desc = could not find container \"db992e2e52ac48178f7414bd5f263625ff531c1d4b821d4dad79c3a3750b0d0b\": container with ID starting with db992e2e52ac48178f7414bd5f263625ff531c1d4b821d4dad79c3a3750b0d0b not found: ID does not exist"
Dec 06 06:02:17 crc kubenswrapper[4733]: I1206 06:02:17.904764 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 06 06:02:17 crc kubenswrapper[4733]: I1206 06:02:17.922353 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Dec 06 06:02:17 crc kubenswrapper[4733]: I1206 06:02:17.929031 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Dec 06 06:02:17 crc kubenswrapper[4733]: E1206 06:02:17.929547 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0114cbad-3939-4254-98c5-34a0b36b5ff1" containerName="ceilometer-central-agent"
Dec 06 06:02:17 crc kubenswrapper[4733]: I1206 06:02:17.929569 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="0114cbad-3939-4254-98c5-34a0b36b5ff1" containerName="ceilometer-central-agent"
Dec 06 06:02:17 crc kubenswrapper[4733]: E1206 06:02:17.929609 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0114cbad-3939-4254-98c5-34a0b36b5ff1" containerName="proxy-httpd"
Dec 06 06:02:17 crc kubenswrapper[4733]: I1206 06:02:17.929619 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="0114cbad-3939-4254-98c5-34a0b36b5ff1" containerName="proxy-httpd"
Dec 06 06:02:17 crc kubenswrapper[4733]: E1206 06:02:17.929654 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0114cbad-3939-4254-98c5-34a0b36b5ff1" containerName="sg-core"
Dec 06 06:02:17 crc kubenswrapper[4733]: I1206 06:02:17.929662 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="0114cbad-3939-4254-98c5-34a0b36b5ff1" containerName="sg-core"
Dec 06 06:02:17 crc kubenswrapper[4733]: E1206 06:02:17.929676 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0114cbad-3939-4254-98c5-34a0b36b5ff1" containerName="ceilometer-notification-agent"
Dec 06 06:02:17 crc kubenswrapper[4733]: I1206 06:02:17.929683 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="0114cbad-3939-4254-98c5-34a0b36b5ff1" containerName="ceilometer-notification-agent"
Dec 06 06:02:17 crc kubenswrapper[4733]: I1206 06:02:17.929903 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="0114cbad-3939-4254-98c5-34a0b36b5ff1" containerName="proxy-httpd"
Dec 06 06:02:17 crc kubenswrapper[4733]: I1206 06:02:17.929934 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="0114cbad-3939-4254-98c5-34a0b36b5ff1" containerName="ceilometer-notification-agent"
Dec 06 06:02:17 crc kubenswrapper[4733]: I1206 06:02:17.929946 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="0114cbad-3939-4254-98c5-34a0b36b5ff1" containerName="ceilometer-central-agent"
Dec 06 06:02:17 crc kubenswrapper[4733]: I1206 06:02:17.929967 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="0114cbad-3939-4254-98c5-34a0b36b5ff1" containerName="sg-core"
Dec 06 06:02:17 crc kubenswrapper[4733]: I1206 06:02:17.931795 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 06 06:02:17 crc kubenswrapper[4733]: I1206 06:02:17.934357 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 06 06:02:17 crc kubenswrapper[4733]: I1206 06:02:17.934791 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Dec 06 06:02:17 crc kubenswrapper[4733]: I1206 06:02:17.935255 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Dec 06 06:02:17 crc kubenswrapper[4733]: I1206 06:02:17.935825 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Dec 06 06:02:17 crc kubenswrapper[4733]: I1206 06:02:17.947953 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1edd6e2c-20a9-4584-aa48-64021a2911d3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1edd6e2c-20a9-4584-aa48-64021a2911d3\") " pod="openstack/ceilometer-0"
Dec 06 06:02:17 crc kubenswrapper[4733]: I1206 06:02:17.948000 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1edd6e2c-20a9-4584-aa48-64021a2911d3-scripts\") pod \"ceilometer-0\" (UID: \"1edd6e2c-20a9-4584-aa48-64021a2911d3\") " pod="openstack/ceilometer-0"
Dec 06 06:02:17 crc kubenswrapper[4733]: I1206 06:02:17.948137 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1edd6e2c-20a9-4584-aa48-64021a2911d3-run-httpd\") pod \"ceilometer-0\" (UID: \"1edd6e2c-20a9-4584-aa48-64021a2911d3\") " pod="openstack/ceilometer-0"
Dec 06 06:02:17 crc kubenswrapper[4733]: I1206 06:02:17.948223 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1edd6e2c-20a9-4584-aa48-64021a2911d3-config-data\") pod \"ceilometer-0\" (UID: \"1edd6e2c-20a9-4584-aa48-64021a2911d3\") " pod="openstack/ceilometer-0"
Dec 06 06:02:17 crc kubenswrapper[4733]: I1206 06:02:17.948240 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1edd6e2c-20a9-4584-aa48-64021a2911d3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1edd6e2c-20a9-4584-aa48-64021a2911d3\") " pod="openstack/ceilometer-0"
Dec 06 06:02:17 crc kubenswrapper[4733]: I1206 06:02:17.948276 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krgvt\" (UniqueName: \"kubernetes.io/projected/1edd6e2c-20a9-4584-aa48-64021a2911d3-kube-api-access-krgvt\") pod \"ceilometer-0\" (UID: \"1edd6e2c-20a9-4584-aa48-64021a2911d3\") " pod="openstack/ceilometer-0"
Dec 06 06:02:17 crc kubenswrapper[4733]: I1206 06:02:17.948328 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1edd6e2c-20a9-4584-aa48-64021a2911d3-log-httpd\") pod \"ceilometer-0\" (UID: \"1edd6e2c-20a9-4584-aa48-64021a2911d3\") " pod="openstack/ceilometer-0"
Dec 06 06:02:17 crc kubenswrapper[4733]: I1206 06:02:17.948372 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1edd6e2c-20a9-4584-aa48-64021a2911d3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1edd6e2c-20a9-4584-aa48-64021a2911d3\") " pod="openstack/ceilometer-0"
Dec 06 06:02:17 crc kubenswrapper[4733]: I1206 06:02:17.996653 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.050673 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1edd6e2c-20a9-4584-aa48-64021a2911d3-run-httpd\") pod \"ceilometer-0\" (UID: \"1edd6e2c-20a9-4584-aa48-64021a2911d3\") " pod="openstack/ceilometer-0"
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.051093 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1edd6e2c-20a9-4584-aa48-64021a2911d3-run-httpd\") pod \"ceilometer-0\" (UID: \"1edd6e2c-20a9-4584-aa48-64021a2911d3\") " pod="openstack/ceilometer-0"
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.051100 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1edd6e2c-20a9-4584-aa48-64021a2911d3-config-data\") pod \"ceilometer-0\" (UID: \"1edd6e2c-20a9-4584-aa48-64021a2911d3\") " pod="openstack/ceilometer-0"
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.051366 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1edd6e2c-20a9-4584-aa48-64021a2911d3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1edd6e2c-20a9-4584-aa48-64021a2911d3\") " pod="openstack/ceilometer-0"
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.051446 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krgvt\" (UniqueName: \"kubernetes.io/projected/1edd6e2c-20a9-4584-aa48-64021a2911d3-kube-api-access-krgvt\") pod \"ceilometer-0\" (UID: \"1edd6e2c-20a9-4584-aa48-64021a2911d3\") " pod="openstack/ceilometer-0"
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.051531 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1edd6e2c-20a9-4584-aa48-64021a2911d3-log-httpd\") pod \"ceilometer-0\" (UID: \"1edd6e2c-20a9-4584-aa48-64021a2911d3\") " pod="openstack/ceilometer-0"
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.051998 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1edd6e2c-20a9-4584-aa48-64021a2911d3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1edd6e2c-20a9-4584-aa48-64021a2911d3\") " pod="openstack/ceilometer-0"
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.052141 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1edd6e2c-20a9-4584-aa48-64021a2911d3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1edd6e2c-20a9-4584-aa48-64021a2911d3\") " pod="openstack/ceilometer-0"
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.052174 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1edd6e2c-20a9-4584-aa48-64021a2911d3-scripts\") pod \"ceilometer-0\" (UID: \"1edd6e2c-20a9-4584-aa48-64021a2911d3\") " pod="openstack/ceilometer-0"
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.052948 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1edd6e2c-20a9-4584-aa48-64021a2911d3-log-httpd\") pod \"ceilometer-0\" (UID: \"1edd6e2c-20a9-4584-aa48-64021a2911d3\") " pod="openstack/ceilometer-0"
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.057296 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1edd6e2c-20a9-4584-aa48-64021a2911d3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1edd6e2c-20a9-4584-aa48-64021a2911d3\") " pod="openstack/ceilometer-0"
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.058283 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1edd6e2c-20a9-4584-aa48-64021a2911d3-config-data\") pod \"ceilometer-0\" (UID: \"1edd6e2c-20a9-4584-aa48-64021a2911d3\") " pod="openstack/ceilometer-0"
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.058403 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1edd6e2c-20a9-4584-aa48-64021a2911d3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1edd6e2c-20a9-4584-aa48-64021a2911d3\") " pod="openstack/ceilometer-0"
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.058775 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1edd6e2c-20a9-4584-aa48-64021a2911d3-scripts\") pod \"ceilometer-0\" (UID: \"1edd6e2c-20a9-4584-aa48-64021a2911d3\") " pod="openstack/ceilometer-0"
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.073692 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1edd6e2c-20a9-4584-aa48-64021a2911d3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1edd6e2c-20a9-4584-aa48-64021a2911d3\") " pod="openstack/ceilometer-0"
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.077380 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krgvt\" (UniqueName: \"kubernetes.io/projected/1edd6e2c-20a9-4584-aa48-64021a2911d3-kube-api-access-krgvt\") pod \"ceilometer-0\" (UID: \"1edd6e2c-20a9-4584-aa48-64021a2911d3\") " pod="openstack/ceilometer-0"
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.153839 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppq5h\" (UniqueName: \"kubernetes.io/projected/e3036f17-599f-40b1-8a0f-37f64940d172-kube-api-access-ppq5h\") pod \"e3036f17-599f-40b1-8a0f-37f64940d172\" (UID: \"e3036f17-599f-40b1-8a0f-37f64940d172\") "
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.153911 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3036f17-599f-40b1-8a0f-37f64940d172-combined-ca-bundle\") pod \"e3036f17-599f-40b1-8a0f-37f64940d172\" (UID: \"e3036f17-599f-40b1-8a0f-37f64940d172\") "
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.153940 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3036f17-599f-40b1-8a0f-37f64940d172-config-data\") pod \"e3036f17-599f-40b1-8a0f-37f64940d172\" (UID: \"e3036f17-599f-40b1-8a0f-37f64940d172\") "
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.154066 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3036f17-599f-40b1-8a0f-37f64940d172-logs\") pod \"e3036f17-599f-40b1-8a0f-37f64940d172\" (UID: \"e3036f17-599f-40b1-8a0f-37f64940d172\") "
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.154758 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3036f17-599f-40b1-8a0f-37f64940d172-logs" (OuterVolumeSpecName: "logs") pod "e3036f17-599f-40b1-8a0f-37f64940d172" (UID: "e3036f17-599f-40b1-8a0f-37f64940d172"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.155036 4733 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3036f17-599f-40b1-8a0f-37f64940d172-logs\") on node \"crc\" DevicePath \"\""
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.157161 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3036f17-599f-40b1-8a0f-37f64940d172-kube-api-access-ppq5h" (OuterVolumeSpecName: "kube-api-access-ppq5h") pod "e3036f17-599f-40b1-8a0f-37f64940d172" (UID: "e3036f17-599f-40b1-8a0f-37f64940d172"). InnerVolumeSpecName "kube-api-access-ppq5h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.176594 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3036f17-599f-40b1-8a0f-37f64940d172-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e3036f17-599f-40b1-8a0f-37f64940d172" (UID: "e3036f17-599f-40b1-8a0f-37f64940d172"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.179635 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3036f17-599f-40b1-8a0f-37f64940d172-config-data" (OuterVolumeSpecName: "config-data") pod "e3036f17-599f-40b1-8a0f-37f64940d172" (UID: "e3036f17-599f-40b1-8a0f-37f64940d172"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.250246 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.258035 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppq5h\" (UniqueName: \"kubernetes.io/projected/e3036f17-599f-40b1-8a0f-37f64940d172-kube-api-access-ppq5h\") on node \"crc\" DevicePath \"\""
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.258088 4733 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3036f17-599f-40b1-8a0f-37f64940d172-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.258099 4733 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3036f17-599f-40b1-8a0f-37f64940d172-config-data\") on node \"crc\" DevicePath \"\""
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.494583 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0114cbad-3939-4254-98c5-34a0b36b5ff1" path="/var/lib/kubelet/pods/0114cbad-3939-4254-98c5-34a0b36b5ff1/volumes"
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.578040 4733 generic.go:334] "Generic (PLEG): container finished" podID="e3036f17-599f-40b1-8a0f-37f64940d172" containerID="37d4db8125873aa5fa0631566f4213623bcb883557f21a243160f3541b4cdcd7" exitCode=0
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.578143 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.578159 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e3036f17-599f-40b1-8a0f-37f64940d172","Type":"ContainerDied","Data":"37d4db8125873aa5fa0631566f4213623bcb883557f21a243160f3541b4cdcd7"}
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.578209 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e3036f17-599f-40b1-8a0f-37f64940d172","Type":"ContainerDied","Data":"c921aee6346bc6588602d7f52d7aaefa708ceffd95855e196d838e6c4b29fa9b"}
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.578239 4733 scope.go:117] "RemoveContainer" containerID="37d4db8125873aa5fa0631566f4213623bcb883557f21a243160f3541b4cdcd7"
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.602198 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.605006 4733 scope.go:117] "RemoveContainer" containerID="feeec468cc16fed5e4772c1d70ebefaf58331507c7497319ab59211d4cacb648"
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.609313 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.622106 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Dec 06 06:02:18 crc kubenswrapper[4733]: E1206 06:02:18.622687 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3036f17-599f-40b1-8a0f-37f64940d172" containerName="nova-api-api"
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.622707 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3036f17-599f-40b1-8a0f-37f64940d172" containerName="nova-api-api"
Dec 06 06:02:18 crc kubenswrapper[4733]: E1206 06:02:18.622750 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3036f17-599f-40b1-8a0f-37f64940d172" containerName="nova-api-log"
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.622756 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3036f17-599f-40b1-8a0f-37f64940d172" containerName="nova-api-log"
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.622969 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3036f17-599f-40b1-8a0f-37f64940d172" containerName="nova-api-log"
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.622984 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3036f17-599f-40b1-8a0f-37f64940d172" containerName="nova-api-api"
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.624072 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.629850 4733 scope.go:117] "RemoveContainer" containerID="37d4db8125873aa5fa0631566f4213623bcb883557f21a243160f3541b4cdcd7"
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.630109 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.630178 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Dec 06 06:02:18 crc kubenswrapper[4733]: E1206 06:02:18.630450 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37d4db8125873aa5fa0631566f4213623bcb883557f21a243160f3541b4cdcd7\": container with ID starting with 37d4db8125873aa5fa0631566f4213623bcb883557f21a243160f3541b4cdcd7 not found: ID does not exist" containerID="37d4db8125873aa5fa0631566f4213623bcb883557f21a243160f3541b4cdcd7"
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.630491 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37d4db8125873aa5fa0631566f4213623bcb883557f21a243160f3541b4cdcd7"} err="failed to get container status \"37d4db8125873aa5fa0631566f4213623bcb883557f21a243160f3541b4cdcd7\": rpc error: code = NotFound desc = could not find container \"37d4db8125873aa5fa0631566f4213623bcb883557f21a243160f3541b4cdcd7\": container with ID starting with 37d4db8125873aa5fa0631566f4213623bcb883557f21a243160f3541b4cdcd7 not found: ID does not exist"
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.630505 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.630512 4733 scope.go:117] "RemoveContainer" containerID="feeec468cc16fed5e4772c1d70ebefaf58331507c7497319ab59211d4cacb648"
Dec 06 06:02:18 crc kubenswrapper[4733]: E1206 06:02:18.634235 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"feeec468cc16fed5e4772c1d70ebefaf58331507c7497319ab59211d4cacb648\": container with ID starting with feeec468cc16fed5e4772c1d70ebefaf58331507c7497319ab59211d4cacb648 not found: ID does not exist" containerID="feeec468cc16fed5e4772c1d70ebefaf58331507c7497319ab59211d4cacb648"
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.634272 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"feeec468cc16fed5e4772c1d70ebefaf58331507c7497319ab59211d4cacb648"} err="failed to get container status \"feeec468cc16fed5e4772c1d70ebefaf58331507c7497319ab59211d4cacb648\": rpc error: code = NotFound desc = could not find container \"feeec468cc16fed5e4772c1d70ebefaf58331507c7497319ab59211d4cacb648\": container with ID starting with feeec468cc16fed5e4772c1d70ebefaf58331507c7497319ab59211d4cacb648 not found: ID does not exist"
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.649877 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.666292 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/00680eba-60f7-496d-8ae3-2e25b756feba-internal-tls-certs\") pod \"nova-api-0\" (UID: \"00680eba-60f7-496d-8ae3-2e25b756feba\") " pod="openstack/nova-api-0"
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.666357 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/00680eba-60f7-496d-8ae3-2e25b756feba-public-tls-certs\") pod \"nova-api-0\" (UID: \"00680eba-60f7-496d-8ae3-2e25b756feba\") " pod="openstack/nova-api-0"
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.666388 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00680eba-60f7-496d-8ae3-2e25b756feba-config-data\") pod \"nova-api-0\" (UID: \"00680eba-60f7-496d-8ae3-2e25b756feba\") " pod="openstack/nova-api-0"
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.666452 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00680eba-60f7-496d-8ae3-2e25b756feba-logs\") pod \"nova-api-0\" (UID: \"00680eba-60f7-496d-8ae3-2e25b756feba\") " pod="openstack/nova-api-0"
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.666551 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kg6vq\" (UniqueName: \"kubernetes.io/projected/00680eba-60f7-496d-8ae3-2e25b756feba-kube-api-access-kg6vq\") pod \"nova-api-0\" (UID: \"00680eba-60f7-496d-8ae3-2e25b756feba\") " pod="openstack/nova-api-0"
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.666613 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00680eba-60f7-496d-8ae3-2e25b756feba-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"00680eba-60f7-496d-8ae3-2e25b756feba\") " pod="openstack/nova-api-0"
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.669728 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 06 06:02:18 crc kubenswrapper[4733]: W1206 06:02:18.676741 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1edd6e2c_20a9_4584_aa48_64021a2911d3.slice/crio-868290e4347e279082c503a5ea8346d00a5555fd9e14162cdc8fa47c55a97688 WatchSource:0}: Error finding container 868290e4347e279082c503a5ea8346d00a5555fd9e14162cdc8fa47c55a97688: Status 404 returned error can't find the container with id 868290e4347e279082c503a5ea8346d00a5555fd9e14162cdc8fa47c55a97688
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.768564 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00680eba-60f7-496d-8ae3-2e25b756feba-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"00680eba-60f7-496d-8ae3-2e25b756feba\") " pod="openstack/nova-api-0"
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.768641 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/00680eba-60f7-496d-8ae3-2e25b756feba-internal-tls-certs\") pod \"nova-api-0\" (UID: \"00680eba-60f7-496d-8ae3-2e25b756feba\") " pod="openstack/nova-api-0"
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.768700 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/00680eba-60f7-496d-8ae3-2e25b756feba-public-tls-certs\") pod \"nova-api-0\" (UID: \"00680eba-60f7-496d-8ae3-2e25b756feba\") " pod="openstack/nova-api-0"
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.768739 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00680eba-60f7-496d-8ae3-2e25b756feba-config-data\") pod \"nova-api-0\" (UID: \"00680eba-60f7-496d-8ae3-2e25b756feba\") " pod="openstack/nova-api-0"
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.768855 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00680eba-60f7-496d-8ae3-2e25b756feba-logs\") pod \"nova-api-0\" (UID: \"00680eba-60f7-496d-8ae3-2e25b756feba\") " pod="openstack/nova-api-0"
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.769025 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kg6vq\" (UniqueName: \"kubernetes.io/projected/00680eba-60f7-496d-8ae3-2e25b756feba-kube-api-access-kg6vq\") pod \"nova-api-0\" (UID: \"00680eba-60f7-496d-8ae3-2e25b756feba\") " pod="openstack/nova-api-0"
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.770537 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00680eba-60f7-496d-8ae3-2e25b756feba-logs\") pod \"nova-api-0\" (UID: \"00680eba-60f7-496d-8ae3-2e25b756feba\") " pod="openstack/nova-api-0"
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.775783 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00680eba-60f7-496d-8ae3-2e25b756feba-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"00680eba-60f7-496d-8ae3-2e25b756feba\") " pod="openstack/nova-api-0"
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.775946 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00680eba-60f7-496d-8ae3-2e25b756feba-config-data\") pod \"nova-api-0\" (UID: \"00680eba-60f7-496d-8ae3-2e25b756feba\") " pod="openstack/nova-api-0"
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.776742 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/00680eba-60f7-496d-8ae3-2e25b756feba-internal-tls-certs\") pod \"nova-api-0\" (UID: \"00680eba-60f7-496d-8ae3-2e25b756feba\") " pod="openstack/nova-api-0"
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.778113 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/00680eba-60f7-496d-8ae3-2e25b756feba-public-tls-certs\") pod \"nova-api-0\" (UID: \"00680eba-60f7-496d-8ae3-2e25b756feba\") " pod="openstack/nova-api-0"
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.784408 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kg6vq\" (UniqueName: \"kubernetes.io/projected/00680eba-60f7-496d-8ae3-2e25b756feba-kube-api-access-kg6vq\") pod \"nova-api-0\" (UID: \"00680eba-60f7-496d-8ae3-2e25b756feba\") " pod="openstack/nova-api-0"
Dec 06 06:02:18 crc kubenswrapper[4733]: I1206 06:02:18.951068 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 06 06:02:19 crc kubenswrapper[4733]: I1206 06:02:19.373012 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 06 06:02:19 crc kubenswrapper[4733]: I1206 06:02:19.605462 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1edd6e2c-20a9-4584-aa48-64021a2911d3","Type":"ContainerStarted","Data":"9171e18e657f37ae498188e2b45afa617bc0e9586e8df3f8b3b032a2edf855ff"}
Dec 06 06:02:19 crc kubenswrapper[4733]: I1206 06:02:19.605525 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1edd6e2c-20a9-4584-aa48-64021a2911d3","Type":"ContainerStarted","Data":"868290e4347e279082c503a5ea8346d00a5555fd9e14162cdc8fa47c55a97688"}
Dec 06 06:02:19 crc kubenswrapper[4733]: I1206 06:02:19.609717 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"00680eba-60f7-496d-8ae3-2e25b756feba","Type":"ContainerStarted","Data":"50b411fb92ea75fc3517108c65219acef044ecb54c8b798ff25a8fd4eea6acde"}
Dec 06 06:02:19 crc kubenswrapper[4733]: I1206 06:02:19.609759 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"00680eba-60f7-496d-8ae3-2e25b756feba","Type":"ContainerStarted","Data":"88496797a4c545cd4e18ec07b96e77d01e3395bfc47334d15823fc016b782020"}
Dec 06 06:02:20 crc kubenswrapper[4733]: I1206 06:02:20.100121 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Dec 06 06:02:20 crc kubenswrapper[4733]: I1206 06:02:20.100588 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Dec 06 06:02:20 crc kubenswrapper[4733]: I1206 06:02:20.122883 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0"
Dec 06 06:02:20 crc kubenswrapper[4733]: I1206 06:02:20.148740 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0"
Dec 06 06:02:20 crc kubenswrapper[4733]: I1206 06:02:20.495855 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3036f17-599f-40b1-8a0f-37f64940d172" path="/var/lib/kubelet/pods/e3036f17-599f-40b1-8a0f-37f64940d172/volumes"
Dec 06 06:02:20 crc kubenswrapper[4733]: I1206 06:02:20.619744 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1edd6e2c-20a9-4584-aa48-64021a2911d3","Type":"ContainerStarted","Data":"908d6424387efa52d6ba8da66d12336cf1909b7d8a9cf3901d253710a74dfa0c"}
Dec 06 06:02:20 crc kubenswrapper[4733]: I1206 06:02:20.623722 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"00680eba-60f7-496d-8ae3-2e25b756feba","Type":"ContainerStarted","Data":"1c9ccf381f8ca69391e95443b69e1325e9b61e84f2533bbdcf22c42ceb7c970d"}
Dec 06 06:02:20 crc kubenswrapper[4733]: I1206 06:02:20.648447 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.6484338579999998 podStartE2EDuration="2.648433858s" podCreationTimestamp="2025-12-06 06:02:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:02:20.641866706 +0000 UTC m=+1124.507077816" watchObservedRunningTime="2025-12-06 06:02:20.648433858 +0000 UTC m=+1124.513644968"
Dec 06 06:02:20 crc kubenswrapper[4733]: I1206 06:02:20.649757 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0"
Dec 06 06:02:20 crc kubenswrapper[4733]: I1206 06:02:20.782425 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-q8qbf"]
Dec 06 06:02:20 crc kubenswrapper[4733]: I1206 06:02:20.783912 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-q8qbf"
Dec 06 06:02:20 crc kubenswrapper[4733]: I1206 06:02:20.786086 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts"
Dec 06 06:02:20 crc kubenswrapper[4733]: I1206 06:02:20.786117 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data"
Dec 06 06:02:20 crc kubenswrapper[4733]: I1206 06:02:20.788596 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-q8qbf"]
Dec 06 06:02:20 crc kubenswrapper[4733]: I1206 06:02:20.912415 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42bf4ec1-7429-4efc-b83b-9d8b09ef3fc0-scripts\") pod \"nova-cell1-cell-mapping-q8qbf\" (UID: \"42bf4ec1-7429-4efc-b83b-9d8b09ef3fc0\") " pod="openstack/nova-cell1-cell-mapping-q8qbf"
Dec 06 06:02:20 crc kubenswrapper[4733]: I1206 06:02:20.912867 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vskr\" (UniqueName: \"kubernetes.io/projected/42bf4ec1-7429-4efc-b83b-9d8b09ef3fc0-kube-api-access-2vskr\") pod \"nova-cell1-cell-mapping-q8qbf\" (UID: \"42bf4ec1-7429-4efc-b83b-9d8b09ef3fc0\") " pod="openstack/nova-cell1-cell-mapping-q8qbf"
Dec 06 06:02:20 crc kubenswrapper[4733]: I1206 06:02:20.912995 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42bf4ec1-7429-4efc-b83b-9d8b09ef3fc0-config-data\") pod \"nova-cell1-cell-mapping-q8qbf\" (UID: \"42bf4ec1-7429-4efc-b83b-9d8b09ef3fc0\") " pod="openstack/nova-cell1-cell-mapping-q8qbf"
Dec 06 06:02:20 crc kubenswrapper[4733]: I1206 06:02:20.913059 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName:
\"kubernetes.io/secret/42bf4ec1-7429-4efc-b83b-9d8b09ef3fc0-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-q8qbf\" (UID: \"42bf4ec1-7429-4efc-b83b-9d8b09ef3fc0\") " pod="openstack/nova-cell1-cell-mapping-q8qbf" Dec 06 06:02:21 crc kubenswrapper[4733]: I1206 06:02:21.015081 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42bf4ec1-7429-4efc-b83b-9d8b09ef3fc0-scripts\") pod \"nova-cell1-cell-mapping-q8qbf\" (UID: \"42bf4ec1-7429-4efc-b83b-9d8b09ef3fc0\") " pod="openstack/nova-cell1-cell-mapping-q8qbf" Dec 06 06:02:21 crc kubenswrapper[4733]: I1206 06:02:21.015166 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vskr\" (UniqueName: \"kubernetes.io/projected/42bf4ec1-7429-4efc-b83b-9d8b09ef3fc0-kube-api-access-2vskr\") pod \"nova-cell1-cell-mapping-q8qbf\" (UID: \"42bf4ec1-7429-4efc-b83b-9d8b09ef3fc0\") " pod="openstack/nova-cell1-cell-mapping-q8qbf" Dec 06 06:02:21 crc kubenswrapper[4733]: I1206 06:02:21.015276 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42bf4ec1-7429-4efc-b83b-9d8b09ef3fc0-config-data\") pod \"nova-cell1-cell-mapping-q8qbf\" (UID: \"42bf4ec1-7429-4efc-b83b-9d8b09ef3fc0\") " pod="openstack/nova-cell1-cell-mapping-q8qbf" Dec 06 06:02:21 crc kubenswrapper[4733]: I1206 06:02:21.015361 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42bf4ec1-7429-4efc-b83b-9d8b09ef3fc0-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-q8qbf\" (UID: \"42bf4ec1-7429-4efc-b83b-9d8b09ef3fc0\") " pod="openstack/nova-cell1-cell-mapping-q8qbf" Dec 06 06:02:21 crc kubenswrapper[4733]: I1206 06:02:21.021438 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/42bf4ec1-7429-4efc-b83b-9d8b09ef3fc0-scripts\") pod \"nova-cell1-cell-mapping-q8qbf\" (UID: \"42bf4ec1-7429-4efc-b83b-9d8b09ef3fc0\") " pod="openstack/nova-cell1-cell-mapping-q8qbf" Dec 06 06:02:21 crc kubenswrapper[4733]: I1206 06:02:21.021490 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42bf4ec1-7429-4efc-b83b-9d8b09ef3fc0-config-data\") pod \"nova-cell1-cell-mapping-q8qbf\" (UID: \"42bf4ec1-7429-4efc-b83b-9d8b09ef3fc0\") " pod="openstack/nova-cell1-cell-mapping-q8qbf" Dec 06 06:02:21 crc kubenswrapper[4733]: I1206 06:02:21.022047 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42bf4ec1-7429-4efc-b83b-9d8b09ef3fc0-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-q8qbf\" (UID: \"42bf4ec1-7429-4efc-b83b-9d8b09ef3fc0\") " pod="openstack/nova-cell1-cell-mapping-q8qbf" Dec 06 06:02:21 crc kubenswrapper[4733]: I1206 06:02:21.031903 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vskr\" (UniqueName: \"kubernetes.io/projected/42bf4ec1-7429-4efc-b83b-9d8b09ef3fc0-kube-api-access-2vskr\") pod \"nova-cell1-cell-mapping-q8qbf\" (UID: \"42bf4ec1-7429-4efc-b83b-9d8b09ef3fc0\") " pod="openstack/nova-cell1-cell-mapping-q8qbf" Dec 06 06:02:21 crc kubenswrapper[4733]: I1206 06:02:21.100595 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-q8qbf" Dec 06 06:02:21 crc kubenswrapper[4733]: I1206 06:02:21.124492 4733 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="13c8ded0-d473-485a-99c5-3cebe4c806af" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.192:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 06 06:02:21 crc kubenswrapper[4733]: I1206 06:02:21.125200 4733 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="13c8ded0-d473-485a-99c5-3cebe4c806af" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.192:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 06 06:02:21 crc kubenswrapper[4733]: I1206 06:02:21.538321 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-q8qbf"] Dec 06 06:02:21 crc kubenswrapper[4733]: I1206 06:02:21.640293 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-q8qbf" event={"ID":"42bf4ec1-7429-4efc-b83b-9d8b09ef3fc0","Type":"ContainerStarted","Data":"c301bbdf75b2fb91defebd2bdb2051e5b67bfa32680e16570b27f5fb2cd6af0c"} Dec 06 06:02:21 crc kubenswrapper[4733]: I1206 06:02:21.644196 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1edd6e2c-20a9-4584-aa48-64021a2911d3","Type":"ContainerStarted","Data":"0b4b52ea02e4741d0065d6caf6fe644357407f7519371c849910c7424eac1849"} Dec 06 06:02:22 crc kubenswrapper[4733]: I1206 06:02:22.098521 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6cb95df969-pdwnz" Dec 06 06:02:22 crc kubenswrapper[4733]: I1206 06:02:22.164259 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75c5fc6955-c8bh2"] Dec 06 06:02:22 crc kubenswrapper[4733]: I1206 
06:02:22.172629 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75c5fc6955-c8bh2" podUID="aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e" containerName="dnsmasq-dns" containerID="cri-o://3a8f31727368ff95f4c7f0e09ce51fc68208753673fd5827a21a80ae627e3f0e" gracePeriod=10 Dec 06 06:02:22 crc kubenswrapper[4733]: I1206 06:02:22.624452 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c5fc6955-c8bh2" Dec 06 06:02:22 crc kubenswrapper[4733]: I1206 06:02:22.663223 4733 generic.go:334] "Generic (PLEG): container finished" podID="aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e" containerID="3a8f31727368ff95f4c7f0e09ce51fc68208753673fd5827a21a80ae627e3f0e" exitCode=0 Dec 06 06:02:22 crc kubenswrapper[4733]: I1206 06:02:22.663338 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c5fc6955-c8bh2" Dec 06 06:02:22 crc kubenswrapper[4733]: I1206 06:02:22.663382 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c5fc6955-c8bh2" event={"ID":"aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e","Type":"ContainerDied","Data":"3a8f31727368ff95f4c7f0e09ce51fc68208753673fd5827a21a80ae627e3f0e"} Dec 06 06:02:22 crc kubenswrapper[4733]: I1206 06:02:22.663420 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c5fc6955-c8bh2" event={"ID":"aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e","Type":"ContainerDied","Data":"1820b01b75b8129525fd0c4e69fdc6537b8752f726296f061e0826c187d27bb2"} Dec 06 06:02:22 crc kubenswrapper[4733]: I1206 06:02:22.663441 4733 scope.go:117] "RemoveContainer" containerID="3a8f31727368ff95f4c7f0e09ce51fc68208753673fd5827a21a80ae627e3f0e" Dec 06 06:02:22 crc kubenswrapper[4733]: I1206 06:02:22.666174 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"1edd6e2c-20a9-4584-aa48-64021a2911d3","Type":"ContainerStarted","Data":"a324b57b65a1d9e58f04d450c90513678b04ff07b471d87c52c06ff716fa405a"} Dec 06 06:02:22 crc kubenswrapper[4733]: I1206 06:02:22.666324 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 06 06:02:22 crc kubenswrapper[4733]: I1206 06:02:22.667250 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-q8qbf" event={"ID":"42bf4ec1-7429-4efc-b83b-9d8b09ef3fc0","Type":"ContainerStarted","Data":"5754252d43db7758a87de8fa86da9214349415b1585fde5e89a95f9e0be4d7a3"} Dec 06 06:02:22 crc kubenswrapper[4733]: I1206 06:02:22.682208 4733 scope.go:117] "RemoveContainer" containerID="64639f5ac94ae897cc1135dcffe67c37939895f79eb9d858f79a3c9cf681d3d1" Dec 06 06:02:22 crc kubenswrapper[4733]: I1206 06:02:22.685574 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.927675705 podStartE2EDuration="5.685563253s" podCreationTimestamp="2025-12-06 06:02:17 +0000 UTC" firstStartedPulling="2025-12-06 06:02:18.67914272 +0000 UTC m=+1122.544353832" lastFinishedPulling="2025-12-06 06:02:22.437030269 +0000 UTC m=+1126.302241380" observedRunningTime="2025-12-06 06:02:22.683220708 +0000 UTC m=+1126.548431818" watchObservedRunningTime="2025-12-06 06:02:22.685563253 +0000 UTC m=+1126.550774364" Dec 06 06:02:22 crc kubenswrapper[4733]: I1206 06:02:22.713560 4733 scope.go:117] "RemoveContainer" containerID="3a8f31727368ff95f4c7f0e09ce51fc68208753673fd5827a21a80ae627e3f0e" Dec 06 06:02:22 crc kubenswrapper[4733]: E1206 06:02:22.713901 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a8f31727368ff95f4c7f0e09ce51fc68208753673fd5827a21a80ae627e3f0e\": container with ID starting with 3a8f31727368ff95f4c7f0e09ce51fc68208753673fd5827a21a80ae627e3f0e not found: ID does not exist" 
containerID="3a8f31727368ff95f4c7f0e09ce51fc68208753673fd5827a21a80ae627e3f0e" Dec 06 06:02:22 crc kubenswrapper[4733]: I1206 06:02:22.713930 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a8f31727368ff95f4c7f0e09ce51fc68208753673fd5827a21a80ae627e3f0e"} err="failed to get container status \"3a8f31727368ff95f4c7f0e09ce51fc68208753673fd5827a21a80ae627e3f0e\": rpc error: code = NotFound desc = could not find container \"3a8f31727368ff95f4c7f0e09ce51fc68208753673fd5827a21a80ae627e3f0e\": container with ID starting with 3a8f31727368ff95f4c7f0e09ce51fc68208753673fd5827a21a80ae627e3f0e not found: ID does not exist" Dec 06 06:02:22 crc kubenswrapper[4733]: I1206 06:02:22.713950 4733 scope.go:117] "RemoveContainer" containerID="64639f5ac94ae897cc1135dcffe67c37939895f79eb9d858f79a3c9cf681d3d1" Dec 06 06:02:22 crc kubenswrapper[4733]: E1206 06:02:22.714219 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64639f5ac94ae897cc1135dcffe67c37939895f79eb9d858f79a3c9cf681d3d1\": container with ID starting with 64639f5ac94ae897cc1135dcffe67c37939895f79eb9d858f79a3c9cf681d3d1 not found: ID does not exist" containerID="64639f5ac94ae897cc1135dcffe67c37939895f79eb9d858f79a3c9cf681d3d1" Dec 06 06:02:22 crc kubenswrapper[4733]: I1206 06:02:22.714263 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64639f5ac94ae897cc1135dcffe67c37939895f79eb9d858f79a3c9cf681d3d1"} err="failed to get container status \"64639f5ac94ae897cc1135dcffe67c37939895f79eb9d858f79a3c9cf681d3d1\": rpc error: code = NotFound desc = could not find container \"64639f5ac94ae897cc1135dcffe67c37939895f79eb9d858f79a3c9cf681d3d1\": container with ID starting with 64639f5ac94ae897cc1135dcffe67c37939895f79eb9d858f79a3c9cf681d3d1 not found: ID does not exist" Dec 06 06:02:22 crc kubenswrapper[4733]: I1206 06:02:22.753170 4733 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zx7gg\" (UniqueName: \"kubernetes.io/projected/aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e-kube-api-access-zx7gg\") pod \"aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e\" (UID: \"aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e\") " Dec 06 06:02:22 crc kubenswrapper[4733]: I1206 06:02:22.753292 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e-ovsdbserver-nb\") pod \"aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e\" (UID: \"aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e\") " Dec 06 06:02:22 crc kubenswrapper[4733]: I1206 06:02:22.753457 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e-config\") pod \"aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e\" (UID: \"aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e\") " Dec 06 06:02:22 crc kubenswrapper[4733]: I1206 06:02:22.753607 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e-dns-swift-storage-0\") pod \"aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e\" (UID: \"aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e\") " Dec 06 06:02:22 crc kubenswrapper[4733]: I1206 06:02:22.753650 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e-ovsdbserver-sb\") pod \"aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e\" (UID: \"aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e\") " Dec 06 06:02:22 crc kubenswrapper[4733]: I1206 06:02:22.753708 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e-dns-svc\") pod \"aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e\" 
(UID: \"aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e\") " Dec 06 06:02:22 crc kubenswrapper[4733]: I1206 06:02:22.764066 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e-kube-api-access-zx7gg" (OuterVolumeSpecName: "kube-api-access-zx7gg") pod "aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e" (UID: "aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e"). InnerVolumeSpecName "kube-api-access-zx7gg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:02:22 crc kubenswrapper[4733]: I1206 06:02:22.801383 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e" (UID: "aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:02:22 crc kubenswrapper[4733]: I1206 06:02:22.802230 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e" (UID: "aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:02:22 crc kubenswrapper[4733]: I1206 06:02:22.805988 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e-config" (OuterVolumeSpecName: "config") pod "aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e" (UID: "aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:02:22 crc kubenswrapper[4733]: I1206 06:02:22.807275 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e" (UID: "aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:02:22 crc kubenswrapper[4733]: I1206 06:02:22.815432 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e" (UID: "aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:02:22 crc kubenswrapper[4733]: I1206 06:02:22.858436 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zx7gg\" (UniqueName: \"kubernetes.io/projected/aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e-kube-api-access-zx7gg\") on node \"crc\" DevicePath \"\"" Dec 06 06:02:22 crc kubenswrapper[4733]: I1206 06:02:22.858475 4733 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 06 06:02:22 crc kubenswrapper[4733]: I1206 06:02:22.858488 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e-config\") on node \"crc\" DevicePath \"\"" Dec 06 06:02:22 crc kubenswrapper[4733]: I1206 06:02:22.858502 4733 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e-dns-swift-storage-0\") on node \"crc\" 
DevicePath \"\"" Dec 06 06:02:22 crc kubenswrapper[4733]: I1206 06:02:22.858511 4733 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 06 06:02:22 crc kubenswrapper[4733]: I1206 06:02:22.858523 4733 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 06:02:22 crc kubenswrapper[4733]: I1206 06:02:22.988048 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-q8qbf" podStartSLOduration=2.988027459 podStartE2EDuration="2.988027459s" podCreationTimestamp="2025-12-06 06:02:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:02:22.705548621 +0000 UTC m=+1126.570759731" watchObservedRunningTime="2025-12-06 06:02:22.988027459 +0000 UTC m=+1126.853238570" Dec 06 06:02:22 crc kubenswrapper[4733]: I1206 06:02:22.991298 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75c5fc6955-c8bh2"] Dec 06 06:02:22 crc kubenswrapper[4733]: I1206 06:02:22.998725 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75c5fc6955-c8bh2"] Dec 06 06:02:24 crc kubenswrapper[4733]: I1206 06:02:24.495663 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e" path="/var/lib/kubelet/pods/aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e/volumes" Dec 06 06:02:26 crc kubenswrapper[4733]: I1206 06:02:26.725658 4733 generic.go:334] "Generic (PLEG): container finished" podID="42bf4ec1-7429-4efc-b83b-9d8b09ef3fc0" containerID="5754252d43db7758a87de8fa86da9214349415b1585fde5e89a95f9e0be4d7a3" exitCode=0 Dec 06 06:02:26 crc kubenswrapper[4733]: I1206 
06:02:26.725748 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-q8qbf" event={"ID":"42bf4ec1-7429-4efc-b83b-9d8b09ef3fc0","Type":"ContainerDied","Data":"5754252d43db7758a87de8fa86da9214349415b1585fde5e89a95f9e0be4d7a3"} Dec 06 06:02:28 crc kubenswrapper[4733]: I1206 06:02:28.040485 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-q8qbf" Dec 06 06:02:28 crc kubenswrapper[4733]: I1206 06:02:28.175978 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42bf4ec1-7429-4efc-b83b-9d8b09ef3fc0-combined-ca-bundle\") pod \"42bf4ec1-7429-4efc-b83b-9d8b09ef3fc0\" (UID: \"42bf4ec1-7429-4efc-b83b-9d8b09ef3fc0\") " Dec 06 06:02:28 crc kubenswrapper[4733]: I1206 06:02:28.176223 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42bf4ec1-7429-4efc-b83b-9d8b09ef3fc0-config-data\") pod \"42bf4ec1-7429-4efc-b83b-9d8b09ef3fc0\" (UID: \"42bf4ec1-7429-4efc-b83b-9d8b09ef3fc0\") " Dec 06 06:02:28 crc kubenswrapper[4733]: I1206 06:02:28.176322 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42bf4ec1-7429-4efc-b83b-9d8b09ef3fc0-scripts\") pod \"42bf4ec1-7429-4efc-b83b-9d8b09ef3fc0\" (UID: \"42bf4ec1-7429-4efc-b83b-9d8b09ef3fc0\") " Dec 06 06:02:28 crc kubenswrapper[4733]: I1206 06:02:28.176492 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vskr\" (UniqueName: \"kubernetes.io/projected/42bf4ec1-7429-4efc-b83b-9d8b09ef3fc0-kube-api-access-2vskr\") pod \"42bf4ec1-7429-4efc-b83b-9d8b09ef3fc0\" (UID: \"42bf4ec1-7429-4efc-b83b-9d8b09ef3fc0\") " Dec 06 06:02:28 crc kubenswrapper[4733]: I1206 06:02:28.182986 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/42bf4ec1-7429-4efc-b83b-9d8b09ef3fc0-scripts" (OuterVolumeSpecName: "scripts") pod "42bf4ec1-7429-4efc-b83b-9d8b09ef3fc0" (UID: "42bf4ec1-7429-4efc-b83b-9d8b09ef3fc0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:02:28 crc kubenswrapper[4733]: I1206 06:02:28.183049 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42bf4ec1-7429-4efc-b83b-9d8b09ef3fc0-kube-api-access-2vskr" (OuterVolumeSpecName: "kube-api-access-2vskr") pod "42bf4ec1-7429-4efc-b83b-9d8b09ef3fc0" (UID: "42bf4ec1-7429-4efc-b83b-9d8b09ef3fc0"). InnerVolumeSpecName "kube-api-access-2vskr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:02:28 crc kubenswrapper[4733]: I1206 06:02:28.202395 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42bf4ec1-7429-4efc-b83b-9d8b09ef3fc0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "42bf4ec1-7429-4efc-b83b-9d8b09ef3fc0" (UID: "42bf4ec1-7429-4efc-b83b-9d8b09ef3fc0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:02:28 crc kubenswrapper[4733]: I1206 06:02:28.202732 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42bf4ec1-7429-4efc-b83b-9d8b09ef3fc0-config-data" (OuterVolumeSpecName: "config-data") pod "42bf4ec1-7429-4efc-b83b-9d8b09ef3fc0" (UID: "42bf4ec1-7429-4efc-b83b-9d8b09ef3fc0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:02:28 crc kubenswrapper[4733]: I1206 06:02:28.279288 4733 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42bf4ec1-7429-4efc-b83b-9d8b09ef3fc0-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 06:02:28 crc kubenswrapper[4733]: I1206 06:02:28.279330 4733 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42bf4ec1-7429-4efc-b83b-9d8b09ef3fc0-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 06:02:28 crc kubenswrapper[4733]: I1206 06:02:28.279343 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vskr\" (UniqueName: \"kubernetes.io/projected/42bf4ec1-7429-4efc-b83b-9d8b09ef3fc0-kube-api-access-2vskr\") on node \"crc\" DevicePath \"\"" Dec 06 06:02:28 crc kubenswrapper[4733]: I1206 06:02:28.279360 4733 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42bf4ec1-7429-4efc-b83b-9d8b09ef3fc0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 06:02:28 crc kubenswrapper[4733]: I1206 06:02:28.748357 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-q8qbf" event={"ID":"42bf4ec1-7429-4efc-b83b-9d8b09ef3fc0","Type":"ContainerDied","Data":"c301bbdf75b2fb91defebd2bdb2051e5b67bfa32680e16570b27f5fb2cd6af0c"} Dec 06 06:02:28 crc kubenswrapper[4733]: I1206 06:02:28.748400 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-q8qbf" Dec 06 06:02:28 crc kubenswrapper[4733]: I1206 06:02:28.748412 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c301bbdf75b2fb91defebd2bdb2051e5b67bfa32680e16570b27f5fb2cd6af0c" Dec 06 06:02:28 crc kubenswrapper[4733]: I1206 06:02:28.937877 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 06 06:02:28 crc kubenswrapper[4733]: I1206 06:02:28.938400 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="00680eba-60f7-496d-8ae3-2e25b756feba" containerName="nova-api-log" containerID="cri-o://50b411fb92ea75fc3517108c65219acef044ecb54c8b798ff25a8fd4eea6acde" gracePeriod=30 Dec 06 06:02:28 crc kubenswrapper[4733]: I1206 06:02:28.938686 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="00680eba-60f7-496d-8ae3-2e25b756feba" containerName="nova-api-api" containerID="cri-o://1c9ccf381f8ca69391e95443b69e1325e9b61e84f2533bbdcf22c42ceb7c970d" gracePeriod=30 Dec 06 06:02:28 crc kubenswrapper[4733]: I1206 06:02:28.940998 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 06:02:28 crc kubenswrapper[4733]: I1206 06:02:28.941263 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="4298b279-055f-4e06-812f-358be8714f9e" containerName="nova-scheduler-scheduler" containerID="cri-o://c6e211e752eeba04d1b16667e1fdb1f6acbceed07f668433a36a4e7c98d23c94" gracePeriod=30 Dec 06 06:02:29 crc kubenswrapper[4733]: I1206 06:02:29.036640 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 06:02:29 crc kubenswrapper[4733]: I1206 06:02:29.037395 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="13c8ded0-d473-485a-99c5-3cebe4c806af" 
containerName="nova-metadata-log" containerID="cri-o://df4842c62f90119ae4d5f9884880a3912b02dbe0f6b8676f4558e03cd60e73e2" gracePeriod=30 Dec 06 06:02:29 crc kubenswrapper[4733]: I1206 06:02:29.037533 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="13c8ded0-d473-485a-99c5-3cebe4c806af" containerName="nova-metadata-metadata" containerID="cri-o://392946d6f3be3a8d63944354f322f376dd98ae90d3b18278d2f3c0684b6224af" gracePeriod=30 Dec 06 06:02:29 crc kubenswrapper[4733]: I1206 06:02:29.462596 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 06 06:02:29 crc kubenswrapper[4733]: I1206 06:02:29.608373 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/00680eba-60f7-496d-8ae3-2e25b756feba-internal-tls-certs\") pod \"00680eba-60f7-496d-8ae3-2e25b756feba\" (UID: \"00680eba-60f7-496d-8ae3-2e25b756feba\") " Dec 06 06:02:29 crc kubenswrapper[4733]: I1206 06:02:29.608431 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00680eba-60f7-496d-8ae3-2e25b756feba-combined-ca-bundle\") pod \"00680eba-60f7-496d-8ae3-2e25b756feba\" (UID: \"00680eba-60f7-496d-8ae3-2e25b756feba\") " Dec 06 06:02:29 crc kubenswrapper[4733]: I1206 06:02:29.608485 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00680eba-60f7-496d-8ae3-2e25b756feba-logs\") pod \"00680eba-60f7-496d-8ae3-2e25b756feba\" (UID: \"00680eba-60f7-496d-8ae3-2e25b756feba\") " Dec 06 06:02:29 crc kubenswrapper[4733]: I1206 06:02:29.608558 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00680eba-60f7-496d-8ae3-2e25b756feba-config-data\") pod 
\"00680eba-60f7-496d-8ae3-2e25b756feba\" (UID: \"00680eba-60f7-496d-8ae3-2e25b756feba\") " Dec 06 06:02:29 crc kubenswrapper[4733]: I1206 06:02:29.608582 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kg6vq\" (UniqueName: \"kubernetes.io/projected/00680eba-60f7-496d-8ae3-2e25b756feba-kube-api-access-kg6vq\") pod \"00680eba-60f7-496d-8ae3-2e25b756feba\" (UID: \"00680eba-60f7-496d-8ae3-2e25b756feba\") " Dec 06 06:02:29 crc kubenswrapper[4733]: I1206 06:02:29.608645 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/00680eba-60f7-496d-8ae3-2e25b756feba-public-tls-certs\") pod \"00680eba-60f7-496d-8ae3-2e25b756feba\" (UID: \"00680eba-60f7-496d-8ae3-2e25b756feba\") " Dec 06 06:02:29 crc kubenswrapper[4733]: I1206 06:02:29.611965 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00680eba-60f7-496d-8ae3-2e25b756feba-logs" (OuterVolumeSpecName: "logs") pod "00680eba-60f7-496d-8ae3-2e25b756feba" (UID: "00680eba-60f7-496d-8ae3-2e25b756feba"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:02:29 crc kubenswrapper[4733]: I1206 06:02:29.616537 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00680eba-60f7-496d-8ae3-2e25b756feba-kube-api-access-kg6vq" (OuterVolumeSpecName: "kube-api-access-kg6vq") pod "00680eba-60f7-496d-8ae3-2e25b756feba" (UID: "00680eba-60f7-496d-8ae3-2e25b756feba"). InnerVolumeSpecName "kube-api-access-kg6vq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:02:29 crc kubenswrapper[4733]: I1206 06:02:29.639192 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00680eba-60f7-496d-8ae3-2e25b756feba-config-data" (OuterVolumeSpecName: "config-data") pod "00680eba-60f7-496d-8ae3-2e25b756feba" (UID: "00680eba-60f7-496d-8ae3-2e25b756feba"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:02:29 crc kubenswrapper[4733]: I1206 06:02:29.643476 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00680eba-60f7-496d-8ae3-2e25b756feba-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "00680eba-60f7-496d-8ae3-2e25b756feba" (UID: "00680eba-60f7-496d-8ae3-2e25b756feba"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:02:29 crc kubenswrapper[4733]: I1206 06:02:29.658146 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00680eba-60f7-496d-8ae3-2e25b756feba-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "00680eba-60f7-496d-8ae3-2e25b756feba" (UID: "00680eba-60f7-496d-8ae3-2e25b756feba"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:02:29 crc kubenswrapper[4733]: I1206 06:02:29.661153 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00680eba-60f7-496d-8ae3-2e25b756feba-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "00680eba-60f7-496d-8ae3-2e25b756feba" (UID: "00680eba-60f7-496d-8ae3-2e25b756feba"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:02:29 crc kubenswrapper[4733]: I1206 06:02:29.710370 4733 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00680eba-60f7-496d-8ae3-2e25b756feba-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 06:02:29 crc kubenswrapper[4733]: I1206 06:02:29.710400 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kg6vq\" (UniqueName: \"kubernetes.io/projected/00680eba-60f7-496d-8ae3-2e25b756feba-kube-api-access-kg6vq\") on node \"crc\" DevicePath \"\"" Dec 06 06:02:29 crc kubenswrapper[4733]: I1206 06:02:29.710417 4733 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/00680eba-60f7-496d-8ae3-2e25b756feba-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 06:02:29 crc kubenswrapper[4733]: I1206 06:02:29.710427 4733 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/00680eba-60f7-496d-8ae3-2e25b756feba-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 06:02:29 crc kubenswrapper[4733]: I1206 06:02:29.710437 4733 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00680eba-60f7-496d-8ae3-2e25b756feba-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 06:02:29 crc kubenswrapper[4733]: I1206 06:02:29.710446 4733 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00680eba-60f7-496d-8ae3-2e25b756feba-logs\") on node \"crc\" DevicePath \"\"" Dec 06 06:02:29 crc kubenswrapper[4733]: I1206 06:02:29.759838 4733 generic.go:334] "Generic (PLEG): container finished" podID="13c8ded0-d473-485a-99c5-3cebe4c806af" containerID="df4842c62f90119ae4d5f9884880a3912b02dbe0f6b8676f4558e03cd60e73e2" exitCode=143 Dec 06 06:02:29 crc kubenswrapper[4733]: I1206 06:02:29.759902 4733 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"13c8ded0-d473-485a-99c5-3cebe4c806af","Type":"ContainerDied","Data":"df4842c62f90119ae4d5f9884880a3912b02dbe0f6b8676f4558e03cd60e73e2"} Dec 06 06:02:29 crc kubenswrapper[4733]: I1206 06:02:29.761932 4733 generic.go:334] "Generic (PLEG): container finished" podID="00680eba-60f7-496d-8ae3-2e25b756feba" containerID="1c9ccf381f8ca69391e95443b69e1325e9b61e84f2533bbdcf22c42ceb7c970d" exitCode=0 Dec 06 06:02:29 crc kubenswrapper[4733]: I1206 06:02:29.761964 4733 generic.go:334] "Generic (PLEG): container finished" podID="00680eba-60f7-496d-8ae3-2e25b756feba" containerID="50b411fb92ea75fc3517108c65219acef044ecb54c8b798ff25a8fd4eea6acde" exitCode=143 Dec 06 06:02:29 crc kubenswrapper[4733]: I1206 06:02:29.761983 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"00680eba-60f7-496d-8ae3-2e25b756feba","Type":"ContainerDied","Data":"1c9ccf381f8ca69391e95443b69e1325e9b61e84f2533bbdcf22c42ceb7c970d"} Dec 06 06:02:29 crc kubenswrapper[4733]: I1206 06:02:29.762006 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"00680eba-60f7-496d-8ae3-2e25b756feba","Type":"ContainerDied","Data":"50b411fb92ea75fc3517108c65219acef044ecb54c8b798ff25a8fd4eea6acde"} Dec 06 06:02:29 crc kubenswrapper[4733]: I1206 06:02:29.762021 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"00680eba-60f7-496d-8ae3-2e25b756feba","Type":"ContainerDied","Data":"88496797a4c545cd4e18ec07b96e77d01e3395bfc47334d15823fc016b782020"} Dec 06 06:02:29 crc kubenswrapper[4733]: I1206 06:02:29.762044 4733 scope.go:117] "RemoveContainer" containerID="1c9ccf381f8ca69391e95443b69e1325e9b61e84f2533bbdcf22c42ceb7c970d" Dec 06 06:02:29 crc kubenswrapper[4733]: I1206 06:02:29.762190 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 06 06:02:29 crc kubenswrapper[4733]: I1206 06:02:29.793538 4733 scope.go:117] "RemoveContainer" containerID="50b411fb92ea75fc3517108c65219acef044ecb54c8b798ff25a8fd4eea6acde" Dec 06 06:02:29 crc kubenswrapper[4733]: I1206 06:02:29.799190 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 06 06:02:29 crc kubenswrapper[4733]: I1206 06:02:29.814236 4733 scope.go:117] "RemoveContainer" containerID="1c9ccf381f8ca69391e95443b69e1325e9b61e84f2533bbdcf22c42ceb7c970d" Dec 06 06:02:29 crc kubenswrapper[4733]: E1206 06:02:29.814974 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c9ccf381f8ca69391e95443b69e1325e9b61e84f2533bbdcf22c42ceb7c970d\": container with ID starting with 1c9ccf381f8ca69391e95443b69e1325e9b61e84f2533bbdcf22c42ceb7c970d not found: ID does not exist" containerID="1c9ccf381f8ca69391e95443b69e1325e9b61e84f2533bbdcf22c42ceb7c970d" Dec 06 06:02:29 crc kubenswrapper[4733]: I1206 06:02:29.815036 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c9ccf381f8ca69391e95443b69e1325e9b61e84f2533bbdcf22c42ceb7c970d"} err="failed to get container status \"1c9ccf381f8ca69391e95443b69e1325e9b61e84f2533bbdcf22c42ceb7c970d\": rpc error: code = NotFound desc = could not find container \"1c9ccf381f8ca69391e95443b69e1325e9b61e84f2533bbdcf22c42ceb7c970d\": container with ID starting with 1c9ccf381f8ca69391e95443b69e1325e9b61e84f2533bbdcf22c42ceb7c970d not found: ID does not exist" Dec 06 06:02:29 crc kubenswrapper[4733]: I1206 06:02:29.815062 4733 scope.go:117] "RemoveContainer" containerID="50b411fb92ea75fc3517108c65219acef044ecb54c8b798ff25a8fd4eea6acde" Dec 06 06:02:29 crc kubenswrapper[4733]: I1206 06:02:29.816223 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 06 06:02:29 crc kubenswrapper[4733]: E1206 
06:02:29.817014 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50b411fb92ea75fc3517108c65219acef044ecb54c8b798ff25a8fd4eea6acde\": container with ID starting with 50b411fb92ea75fc3517108c65219acef044ecb54c8b798ff25a8fd4eea6acde not found: ID does not exist" containerID="50b411fb92ea75fc3517108c65219acef044ecb54c8b798ff25a8fd4eea6acde" Dec 06 06:02:29 crc kubenswrapper[4733]: I1206 06:02:29.817063 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50b411fb92ea75fc3517108c65219acef044ecb54c8b798ff25a8fd4eea6acde"} err="failed to get container status \"50b411fb92ea75fc3517108c65219acef044ecb54c8b798ff25a8fd4eea6acde\": rpc error: code = NotFound desc = could not find container \"50b411fb92ea75fc3517108c65219acef044ecb54c8b798ff25a8fd4eea6acde\": container with ID starting with 50b411fb92ea75fc3517108c65219acef044ecb54c8b798ff25a8fd4eea6acde not found: ID does not exist" Dec 06 06:02:29 crc kubenswrapper[4733]: I1206 06:02:29.817094 4733 scope.go:117] "RemoveContainer" containerID="1c9ccf381f8ca69391e95443b69e1325e9b61e84f2533bbdcf22c42ceb7c970d" Dec 06 06:02:29 crc kubenswrapper[4733]: I1206 06:02:29.817940 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c9ccf381f8ca69391e95443b69e1325e9b61e84f2533bbdcf22c42ceb7c970d"} err="failed to get container status \"1c9ccf381f8ca69391e95443b69e1325e9b61e84f2533bbdcf22c42ceb7c970d\": rpc error: code = NotFound desc = could not find container \"1c9ccf381f8ca69391e95443b69e1325e9b61e84f2533bbdcf22c42ceb7c970d\": container with ID starting with 1c9ccf381f8ca69391e95443b69e1325e9b61e84f2533bbdcf22c42ceb7c970d not found: ID does not exist" Dec 06 06:02:29 crc kubenswrapper[4733]: I1206 06:02:29.820021 4733 scope.go:117] "RemoveContainer" containerID="50b411fb92ea75fc3517108c65219acef044ecb54c8b798ff25a8fd4eea6acde" Dec 06 06:02:29 crc 
kubenswrapper[4733]: I1206 06:02:29.825482 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50b411fb92ea75fc3517108c65219acef044ecb54c8b798ff25a8fd4eea6acde"} err="failed to get container status \"50b411fb92ea75fc3517108c65219acef044ecb54c8b798ff25a8fd4eea6acde\": rpc error: code = NotFound desc = could not find container \"50b411fb92ea75fc3517108c65219acef044ecb54c8b798ff25a8fd4eea6acde\": container with ID starting with 50b411fb92ea75fc3517108c65219acef044ecb54c8b798ff25a8fd4eea6acde not found: ID does not exist" Dec 06 06:02:29 crc kubenswrapper[4733]: I1206 06:02:29.829524 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 06 06:02:29 crc kubenswrapper[4733]: E1206 06:02:29.830063 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00680eba-60f7-496d-8ae3-2e25b756feba" containerName="nova-api-api" Dec 06 06:02:29 crc kubenswrapper[4733]: I1206 06:02:29.830092 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="00680eba-60f7-496d-8ae3-2e25b756feba" containerName="nova-api-api" Dec 06 06:02:29 crc kubenswrapper[4733]: E1206 06:02:29.830101 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e" containerName="dnsmasq-dns" Dec 06 06:02:29 crc kubenswrapper[4733]: I1206 06:02:29.830108 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e" containerName="dnsmasq-dns" Dec 06 06:02:29 crc kubenswrapper[4733]: E1206 06:02:29.830118 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00680eba-60f7-496d-8ae3-2e25b756feba" containerName="nova-api-log" Dec 06 06:02:29 crc kubenswrapper[4733]: I1206 06:02:29.830124 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="00680eba-60f7-496d-8ae3-2e25b756feba" containerName="nova-api-log" Dec 06 06:02:29 crc kubenswrapper[4733]: E1206 06:02:29.830162 4733 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="42bf4ec1-7429-4efc-b83b-9d8b09ef3fc0" containerName="nova-manage" Dec 06 06:02:29 crc kubenswrapper[4733]: I1206 06:02:29.830167 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="42bf4ec1-7429-4efc-b83b-9d8b09ef3fc0" containerName="nova-manage" Dec 06 06:02:29 crc kubenswrapper[4733]: E1206 06:02:29.830175 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e" containerName="init" Dec 06 06:02:29 crc kubenswrapper[4733]: I1206 06:02:29.830181 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e" containerName="init" Dec 06 06:02:29 crc kubenswrapper[4733]: I1206 06:02:29.830417 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa5c1a3a-d4d2-4f6b-b424-56c23bd5994e" containerName="dnsmasq-dns" Dec 06 06:02:29 crc kubenswrapper[4733]: I1206 06:02:29.830437 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="42bf4ec1-7429-4efc-b83b-9d8b09ef3fc0" containerName="nova-manage" Dec 06 06:02:29 crc kubenswrapper[4733]: I1206 06:02:29.830450 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="00680eba-60f7-496d-8ae3-2e25b756feba" containerName="nova-api-log" Dec 06 06:02:29 crc kubenswrapper[4733]: I1206 06:02:29.830475 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="00680eba-60f7-496d-8ae3-2e25b756feba" containerName="nova-api-api" Dec 06 06:02:29 crc kubenswrapper[4733]: I1206 06:02:29.831597 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 06 06:02:29 crc kubenswrapper[4733]: I1206 06:02:29.834681 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 06 06:02:29 crc kubenswrapper[4733]: I1206 06:02:29.834851 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 06 06:02:29 crc kubenswrapper[4733]: I1206 06:02:29.835019 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 06 06:02:29 crc kubenswrapper[4733]: I1206 06:02:29.837633 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 06 06:02:30 crc kubenswrapper[4733]: I1206 06:02:30.016944 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f9bd130-962d-4315-b471-987273048485-internal-tls-certs\") pod \"nova-api-0\" (UID: \"4f9bd130-962d-4315-b471-987273048485\") " pod="openstack/nova-api-0" Dec 06 06:02:30 crc kubenswrapper[4733]: I1206 06:02:30.017112 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f9bd130-962d-4315-b471-987273048485-logs\") pod \"nova-api-0\" (UID: \"4f9bd130-962d-4315-b471-987273048485\") " pod="openstack/nova-api-0" Dec 06 06:02:30 crc kubenswrapper[4733]: I1206 06:02:30.017174 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f9bd130-962d-4315-b471-987273048485-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4f9bd130-962d-4315-b471-987273048485\") " pod="openstack/nova-api-0" Dec 06 06:02:30 crc kubenswrapper[4733]: I1206 06:02:30.017204 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8tpj\" 
(UniqueName: \"kubernetes.io/projected/4f9bd130-962d-4315-b471-987273048485-kube-api-access-n8tpj\") pod \"nova-api-0\" (UID: \"4f9bd130-962d-4315-b471-987273048485\") " pod="openstack/nova-api-0" Dec 06 06:02:30 crc kubenswrapper[4733]: I1206 06:02:30.017235 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f9bd130-962d-4315-b471-987273048485-config-data\") pod \"nova-api-0\" (UID: \"4f9bd130-962d-4315-b471-987273048485\") " pod="openstack/nova-api-0" Dec 06 06:02:30 crc kubenswrapper[4733]: I1206 06:02:30.017267 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f9bd130-962d-4315-b471-987273048485-public-tls-certs\") pod \"nova-api-0\" (UID: \"4f9bd130-962d-4315-b471-987273048485\") " pod="openstack/nova-api-0" Dec 06 06:02:30 crc kubenswrapper[4733]: I1206 06:02:30.119253 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f9bd130-962d-4315-b471-987273048485-logs\") pod \"nova-api-0\" (UID: \"4f9bd130-962d-4315-b471-987273048485\") " pod="openstack/nova-api-0" Dec 06 06:02:30 crc kubenswrapper[4733]: I1206 06:02:30.119391 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f9bd130-962d-4315-b471-987273048485-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4f9bd130-962d-4315-b471-987273048485\") " pod="openstack/nova-api-0" Dec 06 06:02:30 crc kubenswrapper[4733]: I1206 06:02:30.119440 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8tpj\" (UniqueName: \"kubernetes.io/projected/4f9bd130-962d-4315-b471-987273048485-kube-api-access-n8tpj\") pod \"nova-api-0\" (UID: \"4f9bd130-962d-4315-b471-987273048485\") " pod="openstack/nova-api-0" Dec 06 
06:02:30 crc kubenswrapper[4733]: I1206 06:02:30.119483 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f9bd130-962d-4315-b471-987273048485-config-data\") pod \"nova-api-0\" (UID: \"4f9bd130-962d-4315-b471-987273048485\") " pod="openstack/nova-api-0" Dec 06 06:02:30 crc kubenswrapper[4733]: I1206 06:02:30.119525 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f9bd130-962d-4315-b471-987273048485-public-tls-certs\") pod \"nova-api-0\" (UID: \"4f9bd130-962d-4315-b471-987273048485\") " pod="openstack/nova-api-0" Dec 06 06:02:30 crc kubenswrapper[4733]: I1206 06:02:30.119649 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f9bd130-962d-4315-b471-987273048485-internal-tls-certs\") pod \"nova-api-0\" (UID: \"4f9bd130-962d-4315-b471-987273048485\") " pod="openstack/nova-api-0" Dec 06 06:02:30 crc kubenswrapper[4733]: I1206 06:02:30.119760 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f9bd130-962d-4315-b471-987273048485-logs\") pod \"nova-api-0\" (UID: \"4f9bd130-962d-4315-b471-987273048485\") " pod="openstack/nova-api-0" Dec 06 06:02:30 crc kubenswrapper[4733]: I1206 06:02:30.125008 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f9bd130-962d-4315-b471-987273048485-config-data\") pod \"nova-api-0\" (UID: \"4f9bd130-962d-4315-b471-987273048485\") " pod="openstack/nova-api-0" Dec 06 06:02:30 crc kubenswrapper[4733]: I1206 06:02:30.129722 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f9bd130-962d-4315-b471-987273048485-public-tls-certs\") pod \"nova-api-0\" (UID: 
\"4f9bd130-962d-4315-b471-987273048485\") " pod="openstack/nova-api-0" Dec 06 06:02:30 crc kubenswrapper[4733]: I1206 06:02:30.130056 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f9bd130-962d-4315-b471-987273048485-internal-tls-certs\") pod \"nova-api-0\" (UID: \"4f9bd130-962d-4315-b471-987273048485\") " pod="openstack/nova-api-0" Dec 06 06:02:30 crc kubenswrapper[4733]: I1206 06:02:30.145295 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f9bd130-962d-4315-b471-987273048485-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4f9bd130-962d-4315-b471-987273048485\") " pod="openstack/nova-api-0" Dec 06 06:02:30 crc kubenswrapper[4733]: I1206 06:02:30.148755 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8tpj\" (UniqueName: \"kubernetes.io/projected/4f9bd130-962d-4315-b471-987273048485-kube-api-access-n8tpj\") pod \"nova-api-0\" (UID: \"4f9bd130-962d-4315-b471-987273048485\") " pod="openstack/nova-api-0" Dec 06 06:02:30 crc kubenswrapper[4733]: I1206 06:02:30.446916 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 06 06:02:30 crc kubenswrapper[4733]: I1206 06:02:30.481414 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 06 06:02:30 crc kubenswrapper[4733]: I1206 06:02:30.495949 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00680eba-60f7-496d-8ae3-2e25b756feba" path="/var/lib/kubelet/pods/00680eba-60f7-496d-8ae3-2e25b756feba/volumes" Dec 06 06:02:30 crc kubenswrapper[4733]: I1206 06:02:30.630326 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4298b279-055f-4e06-812f-358be8714f9e-config-data\") pod \"4298b279-055f-4e06-812f-358be8714f9e\" (UID: \"4298b279-055f-4e06-812f-358be8714f9e\") " Dec 06 06:02:30 crc kubenswrapper[4733]: I1206 06:02:30.630493 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4298b279-055f-4e06-812f-358be8714f9e-combined-ca-bundle\") pod \"4298b279-055f-4e06-812f-358be8714f9e\" (UID: \"4298b279-055f-4e06-812f-358be8714f9e\") " Dec 06 06:02:30 crc kubenswrapper[4733]: I1206 06:02:30.630687 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-469j8\" (UniqueName: \"kubernetes.io/projected/4298b279-055f-4e06-812f-358be8714f9e-kube-api-access-469j8\") pod \"4298b279-055f-4e06-812f-358be8714f9e\" (UID: \"4298b279-055f-4e06-812f-358be8714f9e\") " Dec 06 06:02:30 crc kubenswrapper[4733]: I1206 06:02:30.637563 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4298b279-055f-4e06-812f-358be8714f9e-kube-api-access-469j8" (OuterVolumeSpecName: "kube-api-access-469j8") pod "4298b279-055f-4e06-812f-358be8714f9e" (UID: "4298b279-055f-4e06-812f-358be8714f9e"). InnerVolumeSpecName "kube-api-access-469j8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:02:30 crc kubenswrapper[4733]: I1206 06:02:30.659603 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4298b279-055f-4e06-812f-358be8714f9e-config-data" (OuterVolumeSpecName: "config-data") pod "4298b279-055f-4e06-812f-358be8714f9e" (UID: "4298b279-055f-4e06-812f-358be8714f9e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:02:30 crc kubenswrapper[4733]: I1206 06:02:30.668849 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4298b279-055f-4e06-812f-358be8714f9e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4298b279-055f-4e06-812f-358be8714f9e" (UID: "4298b279-055f-4e06-812f-358be8714f9e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:02:30 crc kubenswrapper[4733]: I1206 06:02:30.734908 4733 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4298b279-055f-4e06-812f-358be8714f9e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 06:02:30 crc kubenswrapper[4733]: I1206 06:02:30.734948 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-469j8\" (UniqueName: \"kubernetes.io/projected/4298b279-055f-4e06-812f-358be8714f9e-kube-api-access-469j8\") on node \"crc\" DevicePath \"\"" Dec 06 06:02:30 crc kubenswrapper[4733]: I1206 06:02:30.734963 4733 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4298b279-055f-4e06-812f-358be8714f9e-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 06:02:30 crc kubenswrapper[4733]: I1206 06:02:30.777418 4733 generic.go:334] "Generic (PLEG): container finished" podID="4298b279-055f-4e06-812f-358be8714f9e" containerID="c6e211e752eeba04d1b16667e1fdb1f6acbceed07f668433a36a4e7c98d23c94" 
exitCode=0 Dec 06 06:02:30 crc kubenswrapper[4733]: I1206 06:02:30.777481 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4298b279-055f-4e06-812f-358be8714f9e","Type":"ContainerDied","Data":"c6e211e752eeba04d1b16667e1fdb1f6acbceed07f668433a36a4e7c98d23c94"} Dec 06 06:02:30 crc kubenswrapper[4733]: I1206 06:02:30.777522 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4298b279-055f-4e06-812f-358be8714f9e","Type":"ContainerDied","Data":"785c1727de17f5195f1a37c97e36ac04d60e405666263028014d5a0762bcd9a3"} Dec 06 06:02:30 crc kubenswrapper[4733]: I1206 06:02:30.777539 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 06 06:02:30 crc kubenswrapper[4733]: I1206 06:02:30.777558 4733 scope.go:117] "RemoveContainer" containerID="c6e211e752eeba04d1b16667e1fdb1f6acbceed07f668433a36a4e7c98d23c94" Dec 06 06:02:30 crc kubenswrapper[4733]: I1206 06:02:30.808397 4733 scope.go:117] "RemoveContainer" containerID="c6e211e752eeba04d1b16667e1fdb1f6acbceed07f668433a36a4e7c98d23c94" Dec 06 06:02:30 crc kubenswrapper[4733]: E1206 06:02:30.808773 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6e211e752eeba04d1b16667e1fdb1f6acbceed07f668433a36a4e7c98d23c94\": container with ID starting with c6e211e752eeba04d1b16667e1fdb1f6acbceed07f668433a36a4e7c98d23c94 not found: ID does not exist" containerID="c6e211e752eeba04d1b16667e1fdb1f6acbceed07f668433a36a4e7c98d23c94" Dec 06 06:02:30 crc kubenswrapper[4733]: I1206 06:02:30.808871 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6e211e752eeba04d1b16667e1fdb1f6acbceed07f668433a36a4e7c98d23c94"} err="failed to get container status \"c6e211e752eeba04d1b16667e1fdb1f6acbceed07f668433a36a4e7c98d23c94\": rpc error: code = NotFound desc = could not find 
container \"c6e211e752eeba04d1b16667e1fdb1f6acbceed07f668433a36a4e7c98d23c94\": container with ID starting with c6e211e752eeba04d1b16667e1fdb1f6acbceed07f668433a36a4e7c98d23c94 not found: ID does not exist" Dec 06 06:02:30 crc kubenswrapper[4733]: I1206 06:02:30.810739 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 06:02:30 crc kubenswrapper[4733]: I1206 06:02:30.816199 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 06:02:30 crc kubenswrapper[4733]: I1206 06:02:30.824872 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 06:02:30 crc kubenswrapper[4733]: E1206 06:02:30.825261 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4298b279-055f-4e06-812f-358be8714f9e" containerName="nova-scheduler-scheduler" Dec 06 06:02:30 crc kubenswrapper[4733]: I1206 06:02:30.825281 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="4298b279-055f-4e06-812f-358be8714f9e" containerName="nova-scheduler-scheduler" Dec 06 06:02:30 crc kubenswrapper[4733]: I1206 06:02:30.825475 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="4298b279-055f-4e06-812f-358be8714f9e" containerName="nova-scheduler-scheduler" Dec 06 06:02:30 crc kubenswrapper[4733]: I1206 06:02:30.826050 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 06 06:02:30 crc kubenswrapper[4733]: I1206 06:02:30.828812 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 06 06:02:30 crc kubenswrapper[4733]: I1206 06:02:30.837208 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 06:02:30 crc kubenswrapper[4733]: W1206 06:02:30.871009 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f9bd130_962d_4315_b471_987273048485.slice/crio-7525623a87e0c11f0e8bc6aa92df95cae035d78af0e71adbd7439bcaaef97e22 WatchSource:0}: Error finding container 7525623a87e0c11f0e8bc6aa92df95cae035d78af0e71adbd7439bcaaef97e22: Status 404 returned error can't find the container with id 7525623a87e0c11f0e8bc6aa92df95cae035d78af0e71adbd7439bcaaef97e22 Dec 06 06:02:30 crc kubenswrapper[4733]: I1206 06:02:30.871351 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 06 06:02:30 crc kubenswrapper[4733]: I1206 06:02:30.938830 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1ab4bcf-72c0-4aa4-8773-8cedd25ea6d5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b1ab4bcf-72c0-4aa4-8773-8cedd25ea6d5\") " pod="openstack/nova-scheduler-0" Dec 06 06:02:30 crc kubenswrapper[4733]: I1206 06:02:30.939169 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc47r\" (UniqueName: \"kubernetes.io/projected/b1ab4bcf-72c0-4aa4-8773-8cedd25ea6d5-kube-api-access-xc47r\") pod \"nova-scheduler-0\" (UID: \"b1ab4bcf-72c0-4aa4-8773-8cedd25ea6d5\") " pod="openstack/nova-scheduler-0" Dec 06 06:02:30 crc kubenswrapper[4733]: I1206 06:02:30.939211 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1ab4bcf-72c0-4aa4-8773-8cedd25ea6d5-config-data\") pod \"nova-scheduler-0\" (UID: \"b1ab4bcf-72c0-4aa4-8773-8cedd25ea6d5\") " pod="openstack/nova-scheduler-0" Dec 06 06:02:31 crc kubenswrapper[4733]: I1206 06:02:31.040747 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1ab4bcf-72c0-4aa4-8773-8cedd25ea6d5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b1ab4bcf-72c0-4aa4-8773-8cedd25ea6d5\") " pod="openstack/nova-scheduler-0" Dec 06 06:02:31 crc kubenswrapper[4733]: I1206 06:02:31.040924 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xc47r\" (UniqueName: \"kubernetes.io/projected/b1ab4bcf-72c0-4aa4-8773-8cedd25ea6d5-kube-api-access-xc47r\") pod \"nova-scheduler-0\" (UID: \"b1ab4bcf-72c0-4aa4-8773-8cedd25ea6d5\") " pod="openstack/nova-scheduler-0" Dec 06 06:02:31 crc kubenswrapper[4733]: I1206 06:02:31.040957 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1ab4bcf-72c0-4aa4-8773-8cedd25ea6d5-config-data\") pod \"nova-scheduler-0\" (UID: \"b1ab4bcf-72c0-4aa4-8773-8cedd25ea6d5\") " pod="openstack/nova-scheduler-0" Dec 06 06:02:31 crc kubenswrapper[4733]: I1206 06:02:31.044428 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1ab4bcf-72c0-4aa4-8773-8cedd25ea6d5-config-data\") pod \"nova-scheduler-0\" (UID: \"b1ab4bcf-72c0-4aa4-8773-8cedd25ea6d5\") " pod="openstack/nova-scheduler-0" Dec 06 06:02:31 crc kubenswrapper[4733]: I1206 06:02:31.050759 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1ab4bcf-72c0-4aa4-8773-8cedd25ea6d5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: 
\"b1ab4bcf-72c0-4aa4-8773-8cedd25ea6d5\") " pod="openstack/nova-scheduler-0" Dec 06 06:02:31 crc kubenswrapper[4733]: I1206 06:02:31.059738 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc47r\" (UniqueName: \"kubernetes.io/projected/b1ab4bcf-72c0-4aa4-8773-8cedd25ea6d5-kube-api-access-xc47r\") pod \"nova-scheduler-0\" (UID: \"b1ab4bcf-72c0-4aa4-8773-8cedd25ea6d5\") " pod="openstack/nova-scheduler-0" Dec 06 06:02:31 crc kubenswrapper[4733]: I1206 06:02:31.139162 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 06 06:02:31 crc kubenswrapper[4733]: W1206 06:02:31.584942 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1ab4bcf_72c0_4aa4_8773_8cedd25ea6d5.slice/crio-b0ac8c17182ccca8891d6eb9185a97c32fb377b114ec25ca12a4ea3636f4f934 WatchSource:0}: Error finding container b0ac8c17182ccca8891d6eb9185a97c32fb377b114ec25ca12a4ea3636f4f934: Status 404 returned error can't find the container with id b0ac8c17182ccca8891d6eb9185a97c32fb377b114ec25ca12a4ea3636f4f934 Dec 06 06:02:31 crc kubenswrapper[4733]: I1206 06:02:31.592926 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 06:02:31 crc kubenswrapper[4733]: I1206 06:02:31.794790 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b1ab4bcf-72c0-4aa4-8773-8cedd25ea6d5","Type":"ContainerStarted","Data":"872743d243fa646ced870a93722d8c501eb4f7b550d74e1c57813362e1281d1f"} Dec 06 06:02:31 crc kubenswrapper[4733]: I1206 06:02:31.795007 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b1ab4bcf-72c0-4aa4-8773-8cedd25ea6d5","Type":"ContainerStarted","Data":"b0ac8c17182ccca8891d6eb9185a97c32fb377b114ec25ca12a4ea3636f4f934"} Dec 06 06:02:31 crc kubenswrapper[4733]: I1206 06:02:31.798719 4733 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4f9bd130-962d-4315-b471-987273048485","Type":"ContainerStarted","Data":"942a7b09aafc61ca23bb64a59f10cbf4d9c7fc0c2ab2b0296f684727a4f85a84"} Dec 06 06:02:31 crc kubenswrapper[4733]: I1206 06:02:31.798772 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4f9bd130-962d-4315-b471-987273048485","Type":"ContainerStarted","Data":"8b35f4fcea623a53ba2a4f1330f553d10ce55198be603900275f1df7e2197ccd"} Dec 06 06:02:31 crc kubenswrapper[4733]: I1206 06:02:31.798790 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4f9bd130-962d-4315-b471-987273048485","Type":"ContainerStarted","Data":"7525623a87e0c11f0e8bc6aa92df95cae035d78af0e71adbd7439bcaaef97e22"} Dec 06 06:02:31 crc kubenswrapper[4733]: I1206 06:02:31.815378 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.815354128 podStartE2EDuration="1.815354128s" podCreationTimestamp="2025-12-06 06:02:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:02:31.808477473 +0000 UTC m=+1135.673688584" watchObservedRunningTime="2025-12-06 06:02:31.815354128 +0000 UTC m=+1135.680565238" Dec 06 06:02:31 crc kubenswrapper[4733]: I1206 06:02:31.833353 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.833342009 podStartE2EDuration="2.833342009s" podCreationTimestamp="2025-12-06 06:02:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:02:31.822995479 +0000 UTC m=+1135.688206590" watchObservedRunningTime="2025-12-06 06:02:31.833342009 +0000 UTC m=+1135.698553120" Dec 06 06:02:32 crc kubenswrapper[4733]: I1206 06:02:32.496338 
4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4298b279-055f-4e06-812f-358be8714f9e" path="/var/lib/kubelet/pods/4298b279-055f-4e06-812f-358be8714f9e/volumes" Dec 06 06:02:32 crc kubenswrapper[4733]: I1206 06:02:32.549890 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 06:02:32 crc kubenswrapper[4733]: I1206 06:02:32.679099 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2zgl\" (UniqueName: \"kubernetes.io/projected/13c8ded0-d473-485a-99c5-3cebe4c806af-kube-api-access-x2zgl\") pod \"13c8ded0-d473-485a-99c5-3cebe4c806af\" (UID: \"13c8ded0-d473-485a-99c5-3cebe4c806af\") " Dec 06 06:02:32 crc kubenswrapper[4733]: I1206 06:02:32.679331 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13c8ded0-d473-485a-99c5-3cebe4c806af-logs\") pod \"13c8ded0-d473-485a-99c5-3cebe4c806af\" (UID: \"13c8ded0-d473-485a-99c5-3cebe4c806af\") " Dec 06 06:02:32 crc kubenswrapper[4733]: I1206 06:02:32.679391 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13c8ded0-d473-485a-99c5-3cebe4c806af-config-data\") pod \"13c8ded0-d473-485a-99c5-3cebe4c806af\" (UID: \"13c8ded0-d473-485a-99c5-3cebe4c806af\") " Dec 06 06:02:32 crc kubenswrapper[4733]: I1206 06:02:32.679548 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13c8ded0-d473-485a-99c5-3cebe4c806af-combined-ca-bundle\") pod \"13c8ded0-d473-485a-99c5-3cebe4c806af\" (UID: \"13c8ded0-d473-485a-99c5-3cebe4c806af\") " Dec 06 06:02:32 crc kubenswrapper[4733]: I1206 06:02:32.679586 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/13c8ded0-d473-485a-99c5-3cebe4c806af-nova-metadata-tls-certs\") pod \"13c8ded0-d473-485a-99c5-3cebe4c806af\" (UID: \"13c8ded0-d473-485a-99c5-3cebe4c806af\") " Dec 06 06:02:32 crc kubenswrapper[4733]: I1206 06:02:32.680093 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13c8ded0-d473-485a-99c5-3cebe4c806af-logs" (OuterVolumeSpecName: "logs") pod "13c8ded0-d473-485a-99c5-3cebe4c806af" (UID: "13c8ded0-d473-485a-99c5-3cebe4c806af"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:02:32 crc kubenswrapper[4733]: I1206 06:02:32.681142 4733 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13c8ded0-d473-485a-99c5-3cebe4c806af-logs\") on node \"crc\" DevicePath \"\"" Dec 06 06:02:32 crc kubenswrapper[4733]: I1206 06:02:32.686189 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13c8ded0-d473-485a-99c5-3cebe4c806af-kube-api-access-x2zgl" (OuterVolumeSpecName: "kube-api-access-x2zgl") pod "13c8ded0-d473-485a-99c5-3cebe4c806af" (UID: "13c8ded0-d473-485a-99c5-3cebe4c806af"). InnerVolumeSpecName "kube-api-access-x2zgl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:02:32 crc kubenswrapper[4733]: I1206 06:02:32.707023 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13c8ded0-d473-485a-99c5-3cebe4c806af-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "13c8ded0-d473-485a-99c5-3cebe4c806af" (UID: "13c8ded0-d473-485a-99c5-3cebe4c806af"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:02:32 crc kubenswrapper[4733]: I1206 06:02:32.725482 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13c8ded0-d473-485a-99c5-3cebe4c806af-config-data" (OuterVolumeSpecName: "config-data") pod "13c8ded0-d473-485a-99c5-3cebe4c806af" (UID: "13c8ded0-d473-485a-99c5-3cebe4c806af"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:02:32 crc kubenswrapper[4733]: I1206 06:02:32.742540 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13c8ded0-d473-485a-99c5-3cebe4c806af-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "13c8ded0-d473-485a-99c5-3cebe4c806af" (UID: "13c8ded0-d473-485a-99c5-3cebe4c806af"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:02:32 crc kubenswrapper[4733]: I1206 06:02:32.783952 4733 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13c8ded0-d473-485a-99c5-3cebe4c806af-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 06:02:32 crc kubenswrapper[4733]: I1206 06:02:32.783986 4733 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/13c8ded0-d473-485a-99c5-3cebe4c806af-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 06:02:32 crc kubenswrapper[4733]: I1206 06:02:32.784001 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2zgl\" (UniqueName: \"kubernetes.io/projected/13c8ded0-d473-485a-99c5-3cebe4c806af-kube-api-access-x2zgl\") on node \"crc\" DevicePath \"\"" Dec 06 06:02:32 crc kubenswrapper[4733]: I1206 06:02:32.784012 4733 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/13c8ded0-d473-485a-99c5-3cebe4c806af-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 06:02:32 crc kubenswrapper[4733]: I1206 06:02:32.812988 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 06:02:32 crc kubenswrapper[4733]: I1206 06:02:32.813030 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"13c8ded0-d473-485a-99c5-3cebe4c806af","Type":"ContainerDied","Data":"392946d6f3be3a8d63944354f322f376dd98ae90d3b18278d2f3c0684b6224af"} Dec 06 06:02:32 crc kubenswrapper[4733]: I1206 06:02:32.813111 4733 scope.go:117] "RemoveContainer" containerID="392946d6f3be3a8d63944354f322f376dd98ae90d3b18278d2f3c0684b6224af" Dec 06 06:02:32 crc kubenswrapper[4733]: I1206 06:02:32.812878 4733 generic.go:334] "Generic (PLEG): container finished" podID="13c8ded0-d473-485a-99c5-3cebe4c806af" containerID="392946d6f3be3a8d63944354f322f376dd98ae90d3b18278d2f3c0684b6224af" exitCode=0 Dec 06 06:02:32 crc kubenswrapper[4733]: I1206 06:02:32.818493 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"13c8ded0-d473-485a-99c5-3cebe4c806af","Type":"ContainerDied","Data":"ff128d04b5ae285032893282db2db7971d49809fb95bddf4525a19cb8061d652"} Dec 06 06:02:32 crc kubenswrapper[4733]: I1206 06:02:32.860461 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 06:02:32 crc kubenswrapper[4733]: I1206 06:02:32.868958 4733 scope.go:117] "RemoveContainer" containerID="df4842c62f90119ae4d5f9884880a3912b02dbe0f6b8676f4558e03cd60e73e2" Dec 06 06:02:32 crc kubenswrapper[4733]: I1206 06:02:32.879092 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 06:02:32 crc kubenswrapper[4733]: I1206 06:02:32.889379 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 06 06:02:32 crc kubenswrapper[4733]: E1206 
06:02:32.889894 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13c8ded0-d473-485a-99c5-3cebe4c806af" containerName="nova-metadata-log" Dec 06 06:02:32 crc kubenswrapper[4733]: I1206 06:02:32.889915 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="13c8ded0-d473-485a-99c5-3cebe4c806af" containerName="nova-metadata-log" Dec 06 06:02:32 crc kubenswrapper[4733]: E1206 06:02:32.889951 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13c8ded0-d473-485a-99c5-3cebe4c806af" containerName="nova-metadata-metadata" Dec 06 06:02:32 crc kubenswrapper[4733]: I1206 06:02:32.889958 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="13c8ded0-d473-485a-99c5-3cebe4c806af" containerName="nova-metadata-metadata" Dec 06 06:02:32 crc kubenswrapper[4733]: I1206 06:02:32.890172 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="13c8ded0-d473-485a-99c5-3cebe4c806af" containerName="nova-metadata-log" Dec 06 06:02:32 crc kubenswrapper[4733]: I1206 06:02:32.890196 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="13c8ded0-d473-485a-99c5-3cebe4c806af" containerName="nova-metadata-metadata" Dec 06 06:02:32 crc kubenswrapper[4733]: I1206 06:02:32.890752 4733 scope.go:117] "RemoveContainer" containerID="392946d6f3be3a8d63944354f322f376dd98ae90d3b18278d2f3c0684b6224af" Dec 06 06:02:32 crc kubenswrapper[4733]: I1206 06:02:32.891217 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 06:02:32 crc kubenswrapper[4733]: I1206 06:02:32.893748 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 06 06:02:32 crc kubenswrapper[4733]: I1206 06:02:32.893905 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 06 06:02:32 crc kubenswrapper[4733]: E1206 06:02:32.894847 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"392946d6f3be3a8d63944354f322f376dd98ae90d3b18278d2f3c0684b6224af\": container with ID starting with 392946d6f3be3a8d63944354f322f376dd98ae90d3b18278d2f3c0684b6224af not found: ID does not exist" containerID="392946d6f3be3a8d63944354f322f376dd98ae90d3b18278d2f3c0684b6224af" Dec 06 06:02:32 crc kubenswrapper[4733]: I1206 06:02:32.894880 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"392946d6f3be3a8d63944354f322f376dd98ae90d3b18278d2f3c0684b6224af"} err="failed to get container status \"392946d6f3be3a8d63944354f322f376dd98ae90d3b18278d2f3c0684b6224af\": rpc error: code = NotFound desc = could not find container \"392946d6f3be3a8d63944354f322f376dd98ae90d3b18278d2f3c0684b6224af\": container with ID starting with 392946d6f3be3a8d63944354f322f376dd98ae90d3b18278d2f3c0684b6224af not found: ID does not exist" Dec 06 06:02:32 crc kubenswrapper[4733]: I1206 06:02:32.894903 4733 scope.go:117] "RemoveContainer" containerID="df4842c62f90119ae4d5f9884880a3912b02dbe0f6b8676f4558e03cd60e73e2" Dec 06 06:02:32 crc kubenswrapper[4733]: E1206 06:02:32.895269 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df4842c62f90119ae4d5f9884880a3912b02dbe0f6b8676f4558e03cd60e73e2\": container with ID starting with df4842c62f90119ae4d5f9884880a3912b02dbe0f6b8676f4558e03cd60e73e2 not 
found: ID does not exist" containerID="df4842c62f90119ae4d5f9884880a3912b02dbe0f6b8676f4558e03cd60e73e2" Dec 06 06:02:32 crc kubenswrapper[4733]: I1206 06:02:32.895320 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df4842c62f90119ae4d5f9884880a3912b02dbe0f6b8676f4558e03cd60e73e2"} err="failed to get container status \"df4842c62f90119ae4d5f9884880a3912b02dbe0f6b8676f4558e03cd60e73e2\": rpc error: code = NotFound desc = could not find container \"df4842c62f90119ae4d5f9884880a3912b02dbe0f6b8676f4558e03cd60e73e2\": container with ID starting with df4842c62f90119ae4d5f9884880a3912b02dbe0f6b8676f4558e03cd60e73e2 not found: ID does not exist" Dec 06 06:02:32 crc kubenswrapper[4733]: I1206 06:02:32.897784 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 06:02:32 crc kubenswrapper[4733]: I1206 06:02:32.989397 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8731fbb5-bb48-4c17-9ab9-6a5584868dc2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8731fbb5-bb48-4c17-9ab9-6a5584868dc2\") " pod="openstack/nova-metadata-0" Dec 06 06:02:32 crc kubenswrapper[4733]: I1206 06:02:32.989502 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8731fbb5-bb48-4c17-9ab9-6a5584868dc2-config-data\") pod \"nova-metadata-0\" (UID: \"8731fbb5-bb48-4c17-9ab9-6a5584868dc2\") " pod="openstack/nova-metadata-0" Dec 06 06:02:32 crc kubenswrapper[4733]: I1206 06:02:32.989617 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjz9p\" (UniqueName: \"kubernetes.io/projected/8731fbb5-bb48-4c17-9ab9-6a5584868dc2-kube-api-access-hjz9p\") pod \"nova-metadata-0\" (UID: \"8731fbb5-bb48-4c17-9ab9-6a5584868dc2\") " 
pod="openstack/nova-metadata-0" Dec 06 06:02:32 crc kubenswrapper[4733]: I1206 06:02:32.989699 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8731fbb5-bb48-4c17-9ab9-6a5584868dc2-logs\") pod \"nova-metadata-0\" (UID: \"8731fbb5-bb48-4c17-9ab9-6a5584868dc2\") " pod="openstack/nova-metadata-0" Dec 06 06:02:32 crc kubenswrapper[4733]: I1206 06:02:32.989721 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8731fbb5-bb48-4c17-9ab9-6a5584868dc2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8731fbb5-bb48-4c17-9ab9-6a5584868dc2\") " pod="openstack/nova-metadata-0" Dec 06 06:02:33 crc kubenswrapper[4733]: I1206 06:02:33.091785 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8731fbb5-bb48-4c17-9ab9-6a5584868dc2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8731fbb5-bb48-4c17-9ab9-6a5584868dc2\") " pod="openstack/nova-metadata-0" Dec 06 06:02:33 crc kubenswrapper[4733]: I1206 06:02:33.091848 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8731fbb5-bb48-4c17-9ab9-6a5584868dc2-config-data\") pod \"nova-metadata-0\" (UID: \"8731fbb5-bb48-4c17-9ab9-6a5584868dc2\") " pod="openstack/nova-metadata-0" Dec 06 06:02:33 crc kubenswrapper[4733]: I1206 06:02:33.091890 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjz9p\" (UniqueName: \"kubernetes.io/projected/8731fbb5-bb48-4c17-9ab9-6a5584868dc2-kube-api-access-hjz9p\") pod \"nova-metadata-0\" (UID: \"8731fbb5-bb48-4c17-9ab9-6a5584868dc2\") " pod="openstack/nova-metadata-0" Dec 06 06:02:33 crc kubenswrapper[4733]: I1206 06:02:33.091938 4733 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8731fbb5-bb48-4c17-9ab9-6a5584868dc2-logs\") pod \"nova-metadata-0\" (UID: \"8731fbb5-bb48-4c17-9ab9-6a5584868dc2\") " pod="openstack/nova-metadata-0" Dec 06 06:02:33 crc kubenswrapper[4733]: I1206 06:02:33.091959 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8731fbb5-bb48-4c17-9ab9-6a5584868dc2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8731fbb5-bb48-4c17-9ab9-6a5584868dc2\") " pod="openstack/nova-metadata-0" Dec 06 06:02:33 crc kubenswrapper[4733]: I1206 06:02:33.092527 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8731fbb5-bb48-4c17-9ab9-6a5584868dc2-logs\") pod \"nova-metadata-0\" (UID: \"8731fbb5-bb48-4c17-9ab9-6a5584868dc2\") " pod="openstack/nova-metadata-0" Dec 06 06:02:33 crc kubenswrapper[4733]: I1206 06:02:33.095381 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8731fbb5-bb48-4c17-9ab9-6a5584868dc2-config-data\") pod \"nova-metadata-0\" (UID: \"8731fbb5-bb48-4c17-9ab9-6a5584868dc2\") " pod="openstack/nova-metadata-0" Dec 06 06:02:33 crc kubenswrapper[4733]: I1206 06:02:33.095619 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8731fbb5-bb48-4c17-9ab9-6a5584868dc2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8731fbb5-bb48-4c17-9ab9-6a5584868dc2\") " pod="openstack/nova-metadata-0" Dec 06 06:02:33 crc kubenswrapper[4733]: I1206 06:02:33.098058 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8731fbb5-bb48-4c17-9ab9-6a5584868dc2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"8731fbb5-bb48-4c17-9ab9-6a5584868dc2\") " pod="openstack/nova-metadata-0" Dec 06 06:02:33 crc kubenswrapper[4733]: I1206 06:02:33.108396 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjz9p\" (UniqueName: \"kubernetes.io/projected/8731fbb5-bb48-4c17-9ab9-6a5584868dc2-kube-api-access-hjz9p\") pod \"nova-metadata-0\" (UID: \"8731fbb5-bb48-4c17-9ab9-6a5584868dc2\") " pod="openstack/nova-metadata-0" Dec 06 06:02:33 crc kubenswrapper[4733]: I1206 06:02:33.211110 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 06:02:33 crc kubenswrapper[4733]: I1206 06:02:33.639156 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 06:02:33 crc kubenswrapper[4733]: W1206 06:02:33.639632 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8731fbb5_bb48_4c17_9ab9_6a5584868dc2.slice/crio-316af2e731773a0f5aca5dba7164820ff6ca66cd2ccb29db147b22c3422a91bd WatchSource:0}: Error finding container 316af2e731773a0f5aca5dba7164820ff6ca66cd2ccb29db147b22c3422a91bd: Status 404 returned error can't find the container with id 316af2e731773a0f5aca5dba7164820ff6ca66cd2ccb29db147b22c3422a91bd Dec 06 06:02:33 crc kubenswrapper[4733]: I1206 06:02:33.830059 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8731fbb5-bb48-4c17-9ab9-6a5584868dc2","Type":"ContainerStarted","Data":"5bf9c0628dbca0ff2e5e9e89592dce60f2117258471f7dc5190e42daedbf1127"} Dec 06 06:02:33 crc kubenswrapper[4733]: I1206 06:02:33.830423 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8731fbb5-bb48-4c17-9ab9-6a5584868dc2","Type":"ContainerStarted","Data":"316af2e731773a0f5aca5dba7164820ff6ca66cd2ccb29db147b22c3422a91bd"} Dec 06 06:02:34 crc kubenswrapper[4733]: I1206 06:02:34.496428 4733 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13c8ded0-d473-485a-99c5-3cebe4c806af" path="/var/lib/kubelet/pods/13c8ded0-d473-485a-99c5-3cebe4c806af/volumes" Dec 06 06:02:34 crc kubenswrapper[4733]: I1206 06:02:34.844111 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8731fbb5-bb48-4c17-9ab9-6a5584868dc2","Type":"ContainerStarted","Data":"a1c83ec18014be644aca2109a22395e10baf0a33eae23c8569240d38faeb261a"} Dec 06 06:02:34 crc kubenswrapper[4733]: I1206 06:02:34.864909 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.864889718 podStartE2EDuration="2.864889718s" podCreationTimestamp="2025-12-06 06:02:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:02:34.861063973 +0000 UTC m=+1138.726275084" watchObservedRunningTime="2025-12-06 06:02:34.864889718 +0000 UTC m=+1138.730100829" Dec 06 06:02:36 crc kubenswrapper[4733]: I1206 06:02:36.139456 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 06 06:02:38 crc kubenswrapper[4733]: I1206 06:02:38.212643 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 06 06:02:38 crc kubenswrapper[4733]: I1206 06:02:38.212953 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 06 06:02:40 crc kubenswrapper[4733]: I1206 06:02:40.447717 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 06 06:02:40 crc kubenswrapper[4733]: I1206 06:02:40.448112 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 06 06:02:41 crc kubenswrapper[4733]: I1206 06:02:41.140276 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/nova-scheduler-0" Dec 06 06:02:41 crc kubenswrapper[4733]: I1206 06:02:41.162907 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 06 06:02:41 crc kubenswrapper[4733]: I1206 06:02:41.461443 4733 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4f9bd130-962d-4315-b471-987273048485" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.198:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 06 06:02:41 crc kubenswrapper[4733]: I1206 06:02:41.461478 4733 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4f9bd130-962d-4315-b471-987273048485" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.198:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 06 06:02:41 crc kubenswrapper[4733]: I1206 06:02:41.941768 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 06 06:02:43 crc kubenswrapper[4733]: I1206 06:02:43.212418 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 06 06:02:43 crc kubenswrapper[4733]: I1206 06:02:43.213445 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 06 06:02:44 crc kubenswrapper[4733]: I1206 06:02:44.225445 4733 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="8731fbb5-bb48-4c17-9ab9-6a5584868dc2" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.200:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 06 06:02:44 crc kubenswrapper[4733]: I1206 06:02:44.225445 4733 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" 
podUID="8731fbb5-bb48-4c17-9ab9-6a5584868dc2" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.200:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 06 06:02:48 crc kubenswrapper[4733]: I1206 06:02:48.258831 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 06 06:02:50 crc kubenswrapper[4733]: I1206 06:02:50.457040 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 06 06:02:50 crc kubenswrapper[4733]: I1206 06:02:50.457783 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 06 06:02:50 crc kubenswrapper[4733]: I1206 06:02:50.461804 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 06 06:02:50 crc kubenswrapper[4733]: I1206 06:02:50.464046 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 06 06:02:50 crc kubenswrapper[4733]: I1206 06:02:50.994865 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 06 06:02:51 crc kubenswrapper[4733]: I1206 06:02:51.003340 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 06 06:02:53 crc kubenswrapper[4733]: I1206 06:02:53.217513 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 06 06:02:53 crc kubenswrapper[4733]: I1206 06:02:53.217977 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 06 06:02:53 crc kubenswrapper[4733]: I1206 06:02:53.223399 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 06 06:02:53 crc kubenswrapper[4733]: I1206 06:02:53.223442 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/nova-metadata-0" Dec 06 06:02:59 crc kubenswrapper[4733]: I1206 06:02:59.804069 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 06 06:03:00 crc kubenswrapper[4733]: I1206 06:03:00.639420 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 06 06:03:03 crc kubenswrapper[4733]: I1206 06:03:03.970944 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="ba773bb2-77c5-4562-b8ba-53428904d503" containerName="rabbitmq" containerID="cri-o://ac133d55b98ee632f71e0c95233b98dd1f467f581b77c094eb880da4e03f23bd" gracePeriod=604796 Dec 06 06:03:04 crc kubenswrapper[4733]: I1206 06:03:04.856011 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="0d8769e1-2981-471a-bef8-ac4d193563cc" containerName="rabbitmq" containerID="cri-o://65f01921231ba7be95baeb61675f464398c91e40ec04d220476a54cbb32b8a55" gracePeriod=604796 Dec 06 06:03:10 crc kubenswrapper[4733]: I1206 06:03:10.193383 4733 generic.go:334] "Generic (PLEG): container finished" podID="ba773bb2-77c5-4562-b8ba-53428904d503" containerID="ac133d55b98ee632f71e0c95233b98dd1f467f581b77c094eb880da4e03f23bd" exitCode=0 Dec 06 06:03:10 crc kubenswrapper[4733]: I1206 06:03:10.193472 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ba773bb2-77c5-4562-b8ba-53428904d503","Type":"ContainerDied","Data":"ac133d55b98ee632f71e0c95233b98dd1f467f581b77c094eb880da4e03f23bd"} Dec 06 06:03:10 crc kubenswrapper[4733]: I1206 06:03:10.457755 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 06 06:03:10 crc kubenswrapper[4733]: I1206 06:03:10.491926 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ba773bb2-77c5-4562-b8ba-53428904d503-erlang-cookie-secret\") pod \"ba773bb2-77c5-4562-b8ba-53428904d503\" (UID: \"ba773bb2-77c5-4562-b8ba-53428904d503\") " Dec 06 06:03:10 crc kubenswrapper[4733]: I1206 06:03:10.491971 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ba773bb2-77c5-4562-b8ba-53428904d503-config-data\") pod \"ba773bb2-77c5-4562-b8ba-53428904d503\" (UID: \"ba773bb2-77c5-4562-b8ba-53428904d503\") " Dec 06 06:03:10 crc kubenswrapper[4733]: I1206 06:03:10.492001 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ba773bb2-77c5-4562-b8ba-53428904d503-rabbitmq-erlang-cookie\") pod \"ba773bb2-77c5-4562-b8ba-53428904d503\" (UID: \"ba773bb2-77c5-4562-b8ba-53428904d503\") " Dec 06 06:03:10 crc kubenswrapper[4733]: I1206 06:03:10.492024 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ba773bb2-77c5-4562-b8ba-53428904d503-server-conf\") pod \"ba773bb2-77c5-4562-b8ba-53428904d503\" (UID: \"ba773bb2-77c5-4562-b8ba-53428904d503\") " Dec 06 06:03:10 crc kubenswrapper[4733]: I1206 06:03:10.492056 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgzb2\" (UniqueName: \"kubernetes.io/projected/ba773bb2-77c5-4562-b8ba-53428904d503-kube-api-access-sgzb2\") pod \"ba773bb2-77c5-4562-b8ba-53428904d503\" (UID: \"ba773bb2-77c5-4562-b8ba-53428904d503\") " Dec 06 06:03:10 crc kubenswrapper[4733]: I1206 06:03:10.492091 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ba773bb2-77c5-4562-b8ba-53428904d503-plugins-conf\") pod \"ba773bb2-77c5-4562-b8ba-53428904d503\" (UID: \"ba773bb2-77c5-4562-b8ba-53428904d503\") " Dec 06 06:03:10 crc kubenswrapper[4733]: I1206 06:03:10.492206 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ba773bb2-77c5-4562-b8ba-53428904d503-rabbitmq-confd\") pod \"ba773bb2-77c5-4562-b8ba-53428904d503\" (UID: \"ba773bb2-77c5-4562-b8ba-53428904d503\") " Dec 06 06:03:10 crc kubenswrapper[4733]: I1206 06:03:10.492252 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ba773bb2-77c5-4562-b8ba-53428904d503-rabbitmq-tls\") pod \"ba773bb2-77c5-4562-b8ba-53428904d503\" (UID: \"ba773bb2-77c5-4562-b8ba-53428904d503\") " Dec 06 06:03:10 crc kubenswrapper[4733]: I1206 06:03:10.492289 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ba773bb2-77c5-4562-b8ba-53428904d503-rabbitmq-plugins\") pod \"ba773bb2-77c5-4562-b8ba-53428904d503\" (UID: \"ba773bb2-77c5-4562-b8ba-53428904d503\") " Dec 06 06:03:10 crc kubenswrapper[4733]: I1206 06:03:10.492412 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ba773bb2-77c5-4562-b8ba-53428904d503-pod-info\") pod \"ba773bb2-77c5-4562-b8ba-53428904d503\" (UID: \"ba773bb2-77c5-4562-b8ba-53428904d503\") " Dec 06 06:03:10 crc kubenswrapper[4733]: I1206 06:03:10.492436 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ba773bb2-77c5-4562-b8ba-53428904d503\" (UID: \"ba773bb2-77c5-4562-b8ba-53428904d503\") " Dec 06 06:03:10 crc kubenswrapper[4733]: I1206 
06:03:10.492963 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba773bb2-77c5-4562-b8ba-53428904d503-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "ba773bb2-77c5-4562-b8ba-53428904d503" (UID: "ba773bb2-77c5-4562-b8ba-53428904d503"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:03:10 crc kubenswrapper[4733]: I1206 06:03:10.493199 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba773bb2-77c5-4562-b8ba-53428904d503-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "ba773bb2-77c5-4562-b8ba-53428904d503" (UID: "ba773bb2-77c5-4562-b8ba-53428904d503"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:03:10 crc kubenswrapper[4733]: I1206 06:03:10.494374 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba773bb2-77c5-4562-b8ba-53428904d503-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "ba773bb2-77c5-4562-b8ba-53428904d503" (UID: "ba773bb2-77c5-4562-b8ba-53428904d503"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:03:10 crc kubenswrapper[4733]: I1206 06:03:10.498514 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba773bb2-77c5-4562-b8ba-53428904d503-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "ba773bb2-77c5-4562-b8ba-53428904d503" (UID: "ba773bb2-77c5-4562-b8ba-53428904d503"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:03:10 crc kubenswrapper[4733]: I1206 06:03:10.498657 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/ba773bb2-77c5-4562-b8ba-53428904d503-pod-info" (OuterVolumeSpecName: "pod-info") pod "ba773bb2-77c5-4562-b8ba-53428904d503" (UID: "ba773bb2-77c5-4562-b8ba-53428904d503"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 06 06:03:10 crc kubenswrapper[4733]: I1206 06:03:10.500531 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba773bb2-77c5-4562-b8ba-53428904d503-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "ba773bb2-77c5-4562-b8ba-53428904d503" (UID: "ba773bb2-77c5-4562-b8ba-53428904d503"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:03:10 crc kubenswrapper[4733]: I1206 06:03:10.511227 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba773bb2-77c5-4562-b8ba-53428904d503-kube-api-access-sgzb2" (OuterVolumeSpecName: "kube-api-access-sgzb2") pod "ba773bb2-77c5-4562-b8ba-53428904d503" (UID: "ba773bb2-77c5-4562-b8ba-53428904d503"). InnerVolumeSpecName "kube-api-access-sgzb2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:03:10 crc kubenswrapper[4733]: I1206 06:03:10.516971 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba773bb2-77c5-4562-b8ba-53428904d503-config-data" (OuterVolumeSpecName: "config-data") pod "ba773bb2-77c5-4562-b8ba-53428904d503" (UID: "ba773bb2-77c5-4562-b8ba-53428904d503"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:03:10 crc kubenswrapper[4733]: I1206 06:03:10.520751 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "persistence") pod "ba773bb2-77c5-4562-b8ba-53428904d503" (UID: "ba773bb2-77c5-4562-b8ba-53428904d503"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 06 06:03:10 crc kubenswrapper[4733]: I1206 06:03:10.572963 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba773bb2-77c5-4562-b8ba-53428904d503-server-conf" (OuterVolumeSpecName: "server-conf") pod "ba773bb2-77c5-4562-b8ba-53428904d503" (UID: "ba773bb2-77c5-4562-b8ba-53428904d503"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:03:10 crc kubenswrapper[4733]: I1206 06:03:10.594570 4733 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Dec 06 06:03:10 crc kubenswrapper[4733]: I1206 06:03:10.594600 4733 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ba773bb2-77c5-4562-b8ba-53428904d503-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 06 06:03:10 crc kubenswrapper[4733]: I1206 06:03:10.594614 4733 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ba773bb2-77c5-4562-b8ba-53428904d503-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 06:03:10 crc kubenswrapper[4733]: I1206 06:03:10.594624 4733 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ba773bb2-77c5-4562-b8ba-53428904d503-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 06 06:03:10 
crc kubenswrapper[4733]: I1206 06:03:10.594632 4733 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ba773bb2-77c5-4562-b8ba-53428904d503-server-conf\") on node \"crc\" DevicePath \"\"" Dec 06 06:03:10 crc kubenswrapper[4733]: I1206 06:03:10.594650 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgzb2\" (UniqueName: \"kubernetes.io/projected/ba773bb2-77c5-4562-b8ba-53428904d503-kube-api-access-sgzb2\") on node \"crc\" DevicePath \"\"" Dec 06 06:03:10 crc kubenswrapper[4733]: I1206 06:03:10.594659 4733 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ba773bb2-77c5-4562-b8ba-53428904d503-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 06 06:03:10 crc kubenswrapper[4733]: I1206 06:03:10.594683 4733 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ba773bb2-77c5-4562-b8ba-53428904d503-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 06 06:03:10 crc kubenswrapper[4733]: I1206 06:03:10.594693 4733 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ba773bb2-77c5-4562-b8ba-53428904d503-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 06 06:03:10 crc kubenswrapper[4733]: I1206 06:03:10.594703 4733 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ba773bb2-77c5-4562-b8ba-53428904d503-pod-info\") on node \"crc\" DevicePath \"\"" Dec 06 06:03:10 crc kubenswrapper[4733]: I1206 06:03:10.612105 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba773bb2-77c5-4562-b8ba-53428904d503-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "ba773bb2-77c5-4562-b8ba-53428904d503" (UID: "ba773bb2-77c5-4562-b8ba-53428904d503"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:03:10 crc kubenswrapper[4733]: I1206 06:03:10.630796 4733 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Dec 06 06:03:10 crc kubenswrapper[4733]: I1206 06:03:10.697279 4733 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ba773bb2-77c5-4562-b8ba-53428904d503-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 06 06:03:10 crc kubenswrapper[4733]: I1206 06:03:10.697328 4733 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.206410 4733 generic.go:334] "Generic (PLEG): container finished" podID="0d8769e1-2981-471a-bef8-ac4d193563cc" containerID="65f01921231ba7be95baeb61675f464398c91e40ec04d220476a54cbb32b8a55" exitCode=0 Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.206705 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0d8769e1-2981-471a-bef8-ac4d193563cc","Type":"ContainerDied","Data":"65f01921231ba7be95baeb61675f464398c91e40ec04d220476a54cbb32b8a55"} Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.212427 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ba773bb2-77c5-4562-b8ba-53428904d503","Type":"ContainerDied","Data":"189cb4994ff5de9f6317b61675993004b93907ca6e430d892a0d3379a2c4096b"} Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.212468 4733 scope.go:117] "RemoveContainer" containerID="ac133d55b98ee632f71e0c95233b98dd1f467f581b77c094eb880da4e03f23bd" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.212635 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.248842 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.254030 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.259271 4733 scope.go:117] "RemoveContainer" containerID="28d656946022d0e45d8ae7cd9d5210bbeea6770c3efa37b73f252df6528fed96" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.291753 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 06 06:03:11 crc kubenswrapper[4733]: E1206 06:03:11.292224 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba773bb2-77c5-4562-b8ba-53428904d503" containerName="setup-container" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.292242 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba773bb2-77c5-4562-b8ba-53428904d503" containerName="setup-container" Dec 06 06:03:11 crc kubenswrapper[4733]: E1206 06:03:11.292255 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba773bb2-77c5-4562-b8ba-53428904d503" containerName="rabbitmq" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.292261 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba773bb2-77c5-4562-b8ba-53428904d503" containerName="rabbitmq" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.292462 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba773bb2-77c5-4562-b8ba-53428904d503" containerName="rabbitmq" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.293519 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.298065 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.299109 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.299352 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.299506 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.300075 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.300219 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.300427 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-2b962" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.303194 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.327834 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.413171 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0fa056f3-b465-4a24-9eb5-9a5f5932749c-config-data\") pod \"rabbitmq-server-0\" (UID: \"0fa056f3-b465-4a24-9eb5-9a5f5932749c\") " pod="openstack/rabbitmq-server-0" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.413255 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0fa056f3-b465-4a24-9eb5-9a5f5932749c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"0fa056f3-b465-4a24-9eb5-9a5f5932749c\") " pod="openstack/rabbitmq-server-0" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.413474 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0fa056f3-b465-4a24-9eb5-9a5f5932749c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"0fa056f3-b465-4a24-9eb5-9a5f5932749c\") " pod="openstack/rabbitmq-server-0" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.413514 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0fa056f3-b465-4a24-9eb5-9a5f5932749c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"0fa056f3-b465-4a24-9eb5-9a5f5932749c\") " pod="openstack/rabbitmq-server-0" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.413535 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"0fa056f3-b465-4a24-9eb5-9a5f5932749c\") " pod="openstack/rabbitmq-server-0" Dec 06 06:03:11 crc 
kubenswrapper[4733]: I1206 06:03:11.413601 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0fa056f3-b465-4a24-9eb5-9a5f5932749c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"0fa056f3-b465-4a24-9eb5-9a5f5932749c\") " pod="openstack/rabbitmq-server-0" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.413657 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0fa056f3-b465-4a24-9eb5-9a5f5932749c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"0fa056f3-b465-4a24-9eb5-9a5f5932749c\") " pod="openstack/rabbitmq-server-0" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.413720 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0fa056f3-b465-4a24-9eb5-9a5f5932749c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"0fa056f3-b465-4a24-9eb5-9a5f5932749c\") " pod="openstack/rabbitmq-server-0" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.413782 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0fa056f3-b465-4a24-9eb5-9a5f5932749c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"0fa056f3-b465-4a24-9eb5-9a5f5932749c\") " pod="openstack/rabbitmq-server-0" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.413801 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0fa056f3-b465-4a24-9eb5-9a5f5932749c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"0fa056f3-b465-4a24-9eb5-9a5f5932749c\") " pod="openstack/rabbitmq-server-0" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.413819 4733 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wswps\" (UniqueName: \"kubernetes.io/projected/0fa056f3-b465-4a24-9eb5-9a5f5932749c-kube-api-access-wswps\") pod \"rabbitmq-server-0\" (UID: \"0fa056f3-b465-4a24-9eb5-9a5f5932749c\") " pod="openstack/rabbitmq-server-0" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.514596 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0d8769e1-2981-471a-bef8-ac4d193563cc-rabbitmq-plugins\") pod \"0d8769e1-2981-471a-bef8-ac4d193563cc\" (UID: \"0d8769e1-2981-471a-bef8-ac4d193563cc\") " Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.514748 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0d8769e1-2981-471a-bef8-ac4d193563cc-rabbitmq-confd\") pod \"0d8769e1-2981-471a-bef8-ac4d193563cc\" (UID: \"0d8769e1-2981-471a-bef8-ac4d193563cc\") " Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.514784 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0d8769e1-2981-471a-bef8-ac4d193563cc-erlang-cookie-secret\") pod \"0d8769e1-2981-471a-bef8-ac4d193563cc\" (UID: \"0d8769e1-2981-471a-bef8-ac4d193563cc\") " Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.514893 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0d8769e1-2981-471a-bef8-ac4d193563cc-server-conf\") pod \"0d8769e1-2981-471a-bef8-ac4d193563cc\" (UID: \"0d8769e1-2981-471a-bef8-ac4d193563cc\") " Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.514965 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod 
\"0d8769e1-2981-471a-bef8-ac4d193563cc\" (UID: \"0d8769e1-2981-471a-bef8-ac4d193563cc\") " Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.515033 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnr5q\" (UniqueName: \"kubernetes.io/projected/0d8769e1-2981-471a-bef8-ac4d193563cc-kube-api-access-gnr5q\") pod \"0d8769e1-2981-471a-bef8-ac4d193563cc\" (UID: \"0d8769e1-2981-471a-bef8-ac4d193563cc\") " Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.515079 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d8769e1-2981-471a-bef8-ac4d193563cc-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "0d8769e1-2981-471a-bef8-ac4d193563cc" (UID: "0d8769e1-2981-471a-bef8-ac4d193563cc"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.515111 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0d8769e1-2981-471a-bef8-ac4d193563cc-plugins-conf\") pod \"0d8769e1-2981-471a-bef8-ac4d193563cc\" (UID: \"0d8769e1-2981-471a-bef8-ac4d193563cc\") " Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.515151 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0d8769e1-2981-471a-bef8-ac4d193563cc-rabbitmq-erlang-cookie\") pod \"0d8769e1-2981-471a-bef8-ac4d193563cc\" (UID: \"0d8769e1-2981-471a-bef8-ac4d193563cc\") " Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.515211 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0d8769e1-2981-471a-bef8-ac4d193563cc-config-data\") pod \"0d8769e1-2981-471a-bef8-ac4d193563cc\" (UID: \"0d8769e1-2981-471a-bef8-ac4d193563cc\") " Dec 06 06:03:11 crc 
kubenswrapper[4733]: I1206 06:03:11.515290 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0d8769e1-2981-471a-bef8-ac4d193563cc-rabbitmq-tls\") pod \"0d8769e1-2981-471a-bef8-ac4d193563cc\" (UID: \"0d8769e1-2981-471a-bef8-ac4d193563cc\") " Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.515385 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0d8769e1-2981-471a-bef8-ac4d193563cc-pod-info\") pod \"0d8769e1-2981-471a-bef8-ac4d193563cc\" (UID: \"0d8769e1-2981-471a-bef8-ac4d193563cc\") " Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.515897 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0fa056f3-b465-4a24-9eb5-9a5f5932749c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"0fa056f3-b465-4a24-9eb5-9a5f5932749c\") " pod="openstack/rabbitmq-server-0" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.515929 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0fa056f3-b465-4a24-9eb5-9a5f5932749c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"0fa056f3-b465-4a24-9eb5-9a5f5932749c\") " pod="openstack/rabbitmq-server-0" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.515937 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d8769e1-2981-471a-bef8-ac4d193563cc-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "0d8769e1-2981-471a-bef8-ac4d193563cc" (UID: "0d8769e1-2981-471a-bef8-ac4d193563cc"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.515951 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wswps\" (UniqueName: \"kubernetes.io/projected/0fa056f3-b465-4a24-9eb5-9a5f5932749c-kube-api-access-wswps\") pod \"rabbitmq-server-0\" (UID: \"0fa056f3-b465-4a24-9eb5-9a5f5932749c\") " pod="openstack/rabbitmq-server-0" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.516047 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0fa056f3-b465-4a24-9eb5-9a5f5932749c-config-data\") pod \"rabbitmq-server-0\" (UID: \"0fa056f3-b465-4a24-9eb5-9a5f5932749c\") " pod="openstack/rabbitmq-server-0" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.516137 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0fa056f3-b465-4a24-9eb5-9a5f5932749c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"0fa056f3-b465-4a24-9eb5-9a5f5932749c\") " pod="openstack/rabbitmq-server-0" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.516336 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0fa056f3-b465-4a24-9eb5-9a5f5932749c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"0fa056f3-b465-4a24-9eb5-9a5f5932749c\") " pod="openstack/rabbitmq-server-0" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.516360 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0fa056f3-b465-4a24-9eb5-9a5f5932749c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"0fa056f3-b465-4a24-9eb5-9a5f5932749c\") " pod="openstack/rabbitmq-server-0" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.516384 4733 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"0fa056f3-b465-4a24-9eb5-9a5f5932749c\") " pod="openstack/rabbitmq-server-0" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.516465 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0fa056f3-b465-4a24-9eb5-9a5f5932749c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"0fa056f3-b465-4a24-9eb5-9a5f5932749c\") " pod="openstack/rabbitmq-server-0" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.516556 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0fa056f3-b465-4a24-9eb5-9a5f5932749c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"0fa056f3-b465-4a24-9eb5-9a5f5932749c\") " pod="openstack/rabbitmq-server-0" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.516648 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0fa056f3-b465-4a24-9eb5-9a5f5932749c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"0fa056f3-b465-4a24-9eb5-9a5f5932749c\") " pod="openstack/rabbitmq-server-0" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.516678 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d8769e1-2981-471a-bef8-ac4d193563cc-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "0d8769e1-2981-471a-bef8-ac4d193563cc" (UID: "0d8769e1-2981-471a-bef8-ac4d193563cc"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.516762 4733 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0d8769e1-2981-471a-bef8-ac4d193563cc-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.516781 4733 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0d8769e1-2981-471a-bef8-ac4d193563cc-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.517469 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0fa056f3-b465-4a24-9eb5-9a5f5932749c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"0fa056f3-b465-4a24-9eb5-9a5f5932749c\") " pod="openstack/rabbitmq-server-0" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.518606 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0fa056f3-b465-4a24-9eb5-9a5f5932749c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"0fa056f3-b465-4a24-9eb5-9a5f5932749c\") " pod="openstack/rabbitmq-server-0" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.518817 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0fa056f3-b465-4a24-9eb5-9a5f5932749c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"0fa056f3-b465-4a24-9eb5-9a5f5932749c\") " pod="openstack/rabbitmq-server-0" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.521093 4733 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"0fa056f3-b465-4a24-9eb5-9a5f5932749c\") device 
mount path \"/mnt/openstack/pv07\"" pod="openstack/rabbitmq-server-0" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.522364 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/0d8769e1-2981-471a-bef8-ac4d193563cc-pod-info" (OuterVolumeSpecName: "pod-info") pod "0d8769e1-2981-471a-bef8-ac4d193563cc" (UID: "0d8769e1-2981-471a-bef8-ac4d193563cc"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.523286 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0fa056f3-b465-4a24-9eb5-9a5f5932749c-config-data\") pod \"rabbitmq-server-0\" (UID: \"0fa056f3-b465-4a24-9eb5-9a5f5932749c\") " pod="openstack/rabbitmq-server-0" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.524083 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0fa056f3-b465-4a24-9eb5-9a5f5932749c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"0fa056f3-b465-4a24-9eb5-9a5f5932749c\") " pod="openstack/rabbitmq-server-0" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.524246 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0fa056f3-b465-4a24-9eb5-9a5f5932749c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"0fa056f3-b465-4a24-9eb5-9a5f5932749c\") " pod="openstack/rabbitmq-server-0" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.525913 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d8769e1-2981-471a-bef8-ac4d193563cc-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "0d8769e1-2981-471a-bef8-ac4d193563cc" (UID: "0d8769e1-2981-471a-bef8-ac4d193563cc"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.534611 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wswps\" (UniqueName: \"kubernetes.io/projected/0fa056f3-b465-4a24-9eb5-9a5f5932749c-kube-api-access-wswps\") pod \"rabbitmq-server-0\" (UID: \"0fa056f3-b465-4a24-9eb5-9a5f5932749c\") " pod="openstack/rabbitmq-server-0" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.535848 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d8769e1-2981-471a-bef8-ac4d193563cc-kube-api-access-gnr5q" (OuterVolumeSpecName: "kube-api-access-gnr5q") pod "0d8769e1-2981-471a-bef8-ac4d193563cc" (UID: "0d8769e1-2981-471a-bef8-ac4d193563cc"). InnerVolumeSpecName "kube-api-access-gnr5q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.536542 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "persistence") pod "0d8769e1-2981-471a-bef8-ac4d193563cc" (UID: "0d8769e1-2981-471a-bef8-ac4d193563cc"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.537398 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d8769e1-2981-471a-bef8-ac4d193563cc-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "0d8769e1-2981-471a-bef8-ac4d193563cc" (UID: "0d8769e1-2981-471a-bef8-ac4d193563cc"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.537727 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0fa056f3-b465-4a24-9eb5-9a5f5932749c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"0fa056f3-b465-4a24-9eb5-9a5f5932749c\") " pod="openstack/rabbitmq-server-0" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.538000 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0fa056f3-b465-4a24-9eb5-9a5f5932749c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"0fa056f3-b465-4a24-9eb5-9a5f5932749c\") " pod="openstack/rabbitmq-server-0" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.553046 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0fa056f3-b465-4a24-9eb5-9a5f5932749c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"0fa056f3-b465-4a24-9eb5-9a5f5932749c\") " pod="openstack/rabbitmq-server-0" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.577109 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"0fa056f3-b465-4a24-9eb5-9a5f5932749c\") " pod="openstack/rabbitmq-server-0" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.578586 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d8769e1-2981-471a-bef8-ac4d193563cc-config-data" (OuterVolumeSpecName: "config-data") pod "0d8769e1-2981-471a-bef8-ac4d193563cc" (UID: "0d8769e1-2981-471a-bef8-ac4d193563cc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.591911 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d8769e1-2981-471a-bef8-ac4d193563cc-server-conf" (OuterVolumeSpecName: "server-conf") pod "0d8769e1-2981-471a-bef8-ac4d193563cc" (UID: "0d8769e1-2981-471a-bef8-ac4d193563cc"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.619596 4733 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0d8769e1-2981-471a-bef8-ac4d193563cc-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.619710 4733 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0d8769e1-2981-471a-bef8-ac4d193563cc-server-conf\") on node \"crc\" DevicePath \"\"" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.619806 4733 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.619876 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gnr5q\" (UniqueName: \"kubernetes.io/projected/0d8769e1-2981-471a-bef8-ac4d193563cc-kube-api-access-gnr5q\") on node \"crc\" DevicePath \"\"" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.619936 4733 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0d8769e1-2981-471a-bef8-ac4d193563cc-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.619999 4733 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/0d8769e1-2981-471a-bef8-ac4d193563cc-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.620061 4733 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0d8769e1-2981-471a-bef8-ac4d193563cc-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.620117 4733 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0d8769e1-2981-471a-bef8-ac4d193563cc-pod-info\") on node \"crc\" DevicePath \"\"" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.636857 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d8769e1-2981-471a-bef8-ac4d193563cc-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "0d8769e1-2981-471a-bef8-ac4d193563cc" (UID: "0d8769e1-2981-471a-bef8-ac4d193563cc"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.638373 4733 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.673701 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.695284 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7dbd585f5f-xpx95"] Dec 06 06:03:11 crc kubenswrapper[4733]: E1206 06:03:11.695727 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d8769e1-2981-471a-bef8-ac4d193563cc" containerName="rabbitmq" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.695746 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d8769e1-2981-471a-bef8-ac4d193563cc" containerName="rabbitmq" Dec 06 06:03:11 crc kubenswrapper[4733]: E1206 06:03:11.695778 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d8769e1-2981-471a-bef8-ac4d193563cc" containerName="setup-container" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.695784 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d8769e1-2981-471a-bef8-ac4d193563cc" containerName="setup-container" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.695965 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d8769e1-2981-471a-bef8-ac4d193563cc" containerName="rabbitmq" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.696947 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7dbd585f5f-xpx95" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.698886 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.715222 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7dbd585f5f-xpx95"] Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.721934 4733 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0d8769e1-2981-471a-bef8-ac4d193563cc-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.721960 4733 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.825074 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/383128e5-756d-4021-bcd3-023b5fa02849-dns-swift-storage-0\") pod \"dnsmasq-dns-7dbd585f5f-xpx95\" (UID: \"383128e5-756d-4021-bcd3-023b5fa02849\") " pod="openstack/dnsmasq-dns-7dbd585f5f-xpx95" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.825129 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/383128e5-756d-4021-bcd3-023b5fa02849-ovsdbserver-sb\") pod \"dnsmasq-dns-7dbd585f5f-xpx95\" (UID: \"383128e5-756d-4021-bcd3-023b5fa02849\") " pod="openstack/dnsmasq-dns-7dbd585f5f-xpx95" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.825375 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j62qc\" (UniqueName: 
\"kubernetes.io/projected/383128e5-756d-4021-bcd3-023b5fa02849-kube-api-access-j62qc\") pod \"dnsmasq-dns-7dbd585f5f-xpx95\" (UID: \"383128e5-756d-4021-bcd3-023b5fa02849\") " pod="openstack/dnsmasq-dns-7dbd585f5f-xpx95" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.825583 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/383128e5-756d-4021-bcd3-023b5fa02849-config\") pod \"dnsmasq-dns-7dbd585f5f-xpx95\" (UID: \"383128e5-756d-4021-bcd3-023b5fa02849\") " pod="openstack/dnsmasq-dns-7dbd585f5f-xpx95" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.825763 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/383128e5-756d-4021-bcd3-023b5fa02849-dns-svc\") pod \"dnsmasq-dns-7dbd585f5f-xpx95\" (UID: \"383128e5-756d-4021-bcd3-023b5fa02849\") " pod="openstack/dnsmasq-dns-7dbd585f5f-xpx95" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.825866 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/383128e5-756d-4021-bcd3-023b5fa02849-ovsdbserver-nb\") pod \"dnsmasq-dns-7dbd585f5f-xpx95\" (UID: \"383128e5-756d-4021-bcd3-023b5fa02849\") " pod="openstack/dnsmasq-dns-7dbd585f5f-xpx95" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.825924 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/383128e5-756d-4021-bcd3-023b5fa02849-openstack-edpm-ipam\") pod \"dnsmasq-dns-7dbd585f5f-xpx95\" (UID: \"383128e5-756d-4021-bcd3-023b5fa02849\") " pod="openstack/dnsmasq-dns-7dbd585f5f-xpx95" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.928139 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/383128e5-756d-4021-bcd3-023b5fa02849-dns-svc\") pod \"dnsmasq-dns-7dbd585f5f-xpx95\" (UID: \"383128e5-756d-4021-bcd3-023b5fa02849\") " pod="openstack/dnsmasq-dns-7dbd585f5f-xpx95" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.928224 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/383128e5-756d-4021-bcd3-023b5fa02849-ovsdbserver-nb\") pod \"dnsmasq-dns-7dbd585f5f-xpx95\" (UID: \"383128e5-756d-4021-bcd3-023b5fa02849\") " pod="openstack/dnsmasq-dns-7dbd585f5f-xpx95" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.928260 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/383128e5-756d-4021-bcd3-023b5fa02849-openstack-edpm-ipam\") pod \"dnsmasq-dns-7dbd585f5f-xpx95\" (UID: \"383128e5-756d-4021-bcd3-023b5fa02849\") " pod="openstack/dnsmasq-dns-7dbd585f5f-xpx95" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.928340 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/383128e5-756d-4021-bcd3-023b5fa02849-dns-swift-storage-0\") pod \"dnsmasq-dns-7dbd585f5f-xpx95\" (UID: \"383128e5-756d-4021-bcd3-023b5fa02849\") " pod="openstack/dnsmasq-dns-7dbd585f5f-xpx95" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.928365 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/383128e5-756d-4021-bcd3-023b5fa02849-ovsdbserver-sb\") pod \"dnsmasq-dns-7dbd585f5f-xpx95\" (UID: \"383128e5-756d-4021-bcd3-023b5fa02849\") " pod="openstack/dnsmasq-dns-7dbd585f5f-xpx95" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.928413 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j62qc\" (UniqueName: 
\"kubernetes.io/projected/383128e5-756d-4021-bcd3-023b5fa02849-kube-api-access-j62qc\") pod \"dnsmasq-dns-7dbd585f5f-xpx95\" (UID: \"383128e5-756d-4021-bcd3-023b5fa02849\") " pod="openstack/dnsmasq-dns-7dbd585f5f-xpx95" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.928450 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/383128e5-756d-4021-bcd3-023b5fa02849-config\") pod \"dnsmasq-dns-7dbd585f5f-xpx95\" (UID: \"383128e5-756d-4021-bcd3-023b5fa02849\") " pod="openstack/dnsmasq-dns-7dbd585f5f-xpx95" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.929260 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/383128e5-756d-4021-bcd3-023b5fa02849-openstack-edpm-ipam\") pod \"dnsmasq-dns-7dbd585f5f-xpx95\" (UID: \"383128e5-756d-4021-bcd3-023b5fa02849\") " pod="openstack/dnsmasq-dns-7dbd585f5f-xpx95" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.929335 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/383128e5-756d-4021-bcd3-023b5fa02849-ovsdbserver-nb\") pod \"dnsmasq-dns-7dbd585f5f-xpx95\" (UID: \"383128e5-756d-4021-bcd3-023b5fa02849\") " pod="openstack/dnsmasq-dns-7dbd585f5f-xpx95" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.929337 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/383128e5-756d-4021-bcd3-023b5fa02849-ovsdbserver-sb\") pod \"dnsmasq-dns-7dbd585f5f-xpx95\" (UID: \"383128e5-756d-4021-bcd3-023b5fa02849\") " pod="openstack/dnsmasq-dns-7dbd585f5f-xpx95" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.929695 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/383128e5-756d-4021-bcd3-023b5fa02849-config\") pod 
\"dnsmasq-dns-7dbd585f5f-xpx95\" (UID: \"383128e5-756d-4021-bcd3-023b5fa02849\") " pod="openstack/dnsmasq-dns-7dbd585f5f-xpx95" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.929845 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/383128e5-756d-4021-bcd3-023b5fa02849-dns-swift-storage-0\") pod \"dnsmasq-dns-7dbd585f5f-xpx95\" (UID: \"383128e5-756d-4021-bcd3-023b5fa02849\") " pod="openstack/dnsmasq-dns-7dbd585f5f-xpx95" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.929925 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/383128e5-756d-4021-bcd3-023b5fa02849-dns-svc\") pod \"dnsmasq-dns-7dbd585f5f-xpx95\" (UID: \"383128e5-756d-4021-bcd3-023b5fa02849\") " pod="openstack/dnsmasq-dns-7dbd585f5f-xpx95" Dec 06 06:03:11 crc kubenswrapper[4733]: I1206 06:03:11.948408 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j62qc\" (UniqueName: \"kubernetes.io/projected/383128e5-756d-4021-bcd3-023b5fa02849-kube-api-access-j62qc\") pod \"dnsmasq-dns-7dbd585f5f-xpx95\" (UID: \"383128e5-756d-4021-bcd3-023b5fa02849\") " pod="openstack/dnsmasq-dns-7dbd585f5f-xpx95" Dec 06 06:03:12 crc kubenswrapper[4733]: I1206 06:03:12.013081 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7dbd585f5f-xpx95" Dec 06 06:03:12 crc kubenswrapper[4733]: I1206 06:03:12.076428 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 06 06:03:12 crc kubenswrapper[4733]: W1206 06:03:12.085451 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0fa056f3_b465_4a24_9eb5_9a5f5932749c.slice/crio-4b501c9614aa47c0ed5e54db88134470bbb2708d1e9a5ac99c4bccbc8d576822 WatchSource:0}: Error finding container 4b501c9614aa47c0ed5e54db88134470bbb2708d1e9a5ac99c4bccbc8d576822: Status 404 returned error can't find the container with id 4b501c9614aa47c0ed5e54db88134470bbb2708d1e9a5ac99c4bccbc8d576822 Dec 06 06:03:12 crc kubenswrapper[4733]: I1206 06:03:12.227528 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0d8769e1-2981-471a-bef8-ac4d193563cc","Type":"ContainerDied","Data":"7e1da629dfdc010176fafcb6a89d1e7dc9ea6192b88f1b40d0413e8a1e5b6352"} Dec 06 06:03:12 crc kubenswrapper[4733]: I1206 06:03:12.227739 4733 scope.go:117] "RemoveContainer" containerID="65f01921231ba7be95baeb61675f464398c91e40ec04d220476a54cbb32b8a55" Dec 06 06:03:12 crc kubenswrapper[4733]: I1206 06:03:12.227847 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 06 06:03:12 crc kubenswrapper[4733]: I1206 06:03:12.235797 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0fa056f3-b465-4a24-9eb5-9a5f5932749c","Type":"ContainerStarted","Data":"4b501c9614aa47c0ed5e54db88134470bbb2708d1e9a5ac99c4bccbc8d576822"} Dec 06 06:03:12 crc kubenswrapper[4733]: I1206 06:03:12.268898 4733 scope.go:117] "RemoveContainer" containerID="3414692aed66b4eeb2d86e147784525b258dc75c775f3e1178bcdad27a734b53" Dec 06 06:03:12 crc kubenswrapper[4733]: I1206 06:03:12.274841 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 06 06:03:12 crc kubenswrapper[4733]: I1206 06:03:12.301355 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 06 06:03:12 crc kubenswrapper[4733]: I1206 06:03:12.312748 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 06 06:03:12 crc kubenswrapper[4733]: I1206 06:03:12.314389 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 06 06:03:12 crc kubenswrapper[4733]: I1206 06:03:12.315943 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 06 06:03:12 crc kubenswrapper[4733]: I1206 06:03:12.316182 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 06 06:03:12 crc kubenswrapper[4733]: I1206 06:03:12.316569 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 06 06:03:12 crc kubenswrapper[4733]: I1206 06:03:12.316717 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 06 06:03:12 crc kubenswrapper[4733]: I1206 06:03:12.316973 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 06 06:03:12 crc kubenswrapper[4733]: I1206 06:03:12.317348 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-nv9mn" Dec 06 06:03:12 crc kubenswrapper[4733]: I1206 06:03:12.317373 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 06 06:03:12 crc kubenswrapper[4733]: I1206 06:03:12.323977 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 06 06:03:12 crc kubenswrapper[4733]: I1206 06:03:12.438987 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 06:03:12 crc kubenswrapper[4733]: I1206 06:03:12.439039 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 06:03:12 crc kubenswrapper[4733]: I1206 06:03:12.439071 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 06:03:12 crc kubenswrapper[4733]: I1206 06:03:12.439102 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 06:03:12 crc kubenswrapper[4733]: I1206 06:03:12.439125 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 06:03:12 crc kubenswrapper[4733]: I1206 06:03:12.439235 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 06:03:12 crc kubenswrapper[4733]: I1206 06:03:12.439295 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" 
(UniqueName: \"kubernetes.io/configmap/45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 06:03:12 crc kubenswrapper[4733]: I1206 06:03:12.439402 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 06:03:12 crc kubenswrapper[4733]: I1206 06:03:12.439541 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2k22c\" (UniqueName: \"kubernetes.io/projected/45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3-kube-api-access-2k22c\") pod \"rabbitmq-cell1-server-0\" (UID: \"45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 06:03:12 crc kubenswrapper[4733]: I1206 06:03:12.439616 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 06:03:12 crc kubenswrapper[4733]: I1206 06:03:12.439709 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 06:03:12 crc kubenswrapper[4733]: I1206 06:03:12.457007 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7dbd585f5f-xpx95"] Dec 06 
06:03:12 crc kubenswrapper[4733]: I1206 06:03:12.545621 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2k22c\" (UniqueName: \"kubernetes.io/projected/45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3-kube-api-access-2k22c\") pod \"rabbitmq-cell1-server-0\" (UID: \"45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 06:03:12 crc kubenswrapper[4733]: I1206 06:03:12.545689 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 06:03:12 crc kubenswrapper[4733]: I1206 06:03:12.545750 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 06:03:12 crc kubenswrapper[4733]: I1206 06:03:12.545897 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 06:03:12 crc kubenswrapper[4733]: I1206 06:03:12.545944 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 06:03:12 crc kubenswrapper[4733]: I1206 06:03:12.545980 4733 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 06:03:12 crc kubenswrapper[4733]: I1206 06:03:12.546037 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 06:03:12 crc kubenswrapper[4733]: I1206 06:03:12.546060 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 06:03:12 crc kubenswrapper[4733]: I1206 06:03:12.546084 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 06:03:12 crc kubenswrapper[4733]: I1206 06:03:12.546108 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 06:03:12 crc kubenswrapper[4733]: I1206 06:03:12.546134 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 06:03:12 crc kubenswrapper[4733]: I1206 06:03:12.571766 4733 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/rabbitmq-cell1-server-0" Dec 06 06:03:12 crc kubenswrapper[4733]: I1206 06:03:12.578063 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 06:03:12 crc kubenswrapper[4733]: I1206 06:03:12.578720 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 06:03:12 crc kubenswrapper[4733]: I1206 06:03:12.578808 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 06:03:12 crc kubenswrapper[4733]: I1206 06:03:12.580171 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3-plugins-conf\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 06:03:12 crc kubenswrapper[4733]: I1206 06:03:12.583951 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 06:03:12 crc kubenswrapper[4733]: I1206 06:03:12.584595 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 06:03:12 crc kubenswrapper[4733]: I1206 06:03:12.585348 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 06:03:12 crc kubenswrapper[4733]: I1206 06:03:12.592104 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 06:03:12 crc kubenswrapper[4733]: I1206 06:03:12.593922 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 06:03:12 crc kubenswrapper[4733]: I1206 
06:03:12.595452 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2k22c\" (UniqueName: \"kubernetes.io/projected/45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3-kube-api-access-2k22c\") pod \"rabbitmq-cell1-server-0\" (UID: \"45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 06:03:12 crc kubenswrapper[4733]: I1206 06:03:12.601041 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d8769e1-2981-471a-bef8-ac4d193563cc" path="/var/lib/kubelet/pods/0d8769e1-2981-471a-bef8-ac4d193563cc/volumes" Dec 06 06:03:12 crc kubenswrapper[4733]: I1206 06:03:12.602412 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba773bb2-77c5-4562-b8ba-53428904d503" path="/var/lib/kubelet/pods/ba773bb2-77c5-4562-b8ba-53428904d503/volumes" Dec 06 06:03:12 crc kubenswrapper[4733]: I1206 06:03:12.637150 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 06:03:12 crc kubenswrapper[4733]: I1206 06:03:12.933844 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 06 06:03:13 crc kubenswrapper[4733]: I1206 06:03:13.249166 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0fa056f3-b465-4a24-9eb5-9a5f5932749c","Type":"ContainerStarted","Data":"21658e1edfbe562e4af0e0fad5e893b7ef04d22fa90beed74e05634650d3e5c7"} Dec 06 06:03:13 crc kubenswrapper[4733]: I1206 06:03:13.253543 4733 generic.go:334] "Generic (PLEG): container finished" podID="383128e5-756d-4021-bcd3-023b5fa02849" containerID="d5f27c53beab43e75a81b581ed2bdf16ca1fc6b889826a170c8d1e013c8b3643" exitCode=0 Dec 06 06:03:13 crc kubenswrapper[4733]: I1206 06:03:13.253633 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7dbd585f5f-xpx95" event={"ID":"383128e5-756d-4021-bcd3-023b5fa02849","Type":"ContainerDied","Data":"d5f27c53beab43e75a81b581ed2bdf16ca1fc6b889826a170c8d1e013c8b3643"} Dec 06 06:03:13 crc kubenswrapper[4733]: I1206 06:03:13.253682 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7dbd585f5f-xpx95" event={"ID":"383128e5-756d-4021-bcd3-023b5fa02849","Type":"ContainerStarted","Data":"c449a4733546d69eb9c486fff16023228983a440f07a3195daec08627441468f"} Dec 06 06:03:13 crc kubenswrapper[4733]: I1206 06:03:13.427858 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 06 06:03:14 crc kubenswrapper[4733]: I1206 06:03:14.268021 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7dbd585f5f-xpx95" event={"ID":"383128e5-756d-4021-bcd3-023b5fa02849","Type":"ContainerStarted","Data":"af37cb8e66a9582a0bedf20a5924522e330c2d426e442320ac774f1b536aaf3c"} Dec 06 06:03:14 crc kubenswrapper[4733]: I1206 06:03:14.268349 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7dbd585f5f-xpx95" Dec 06 06:03:14 crc kubenswrapper[4733]: I1206 06:03:14.269930 4733 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3","Type":"ContainerStarted","Data":"b39acfde1476b5594ff67410f13af975a75b9ea1c709895c748dbebdce70df42"} Dec 06 06:03:14 crc kubenswrapper[4733]: I1206 06:03:14.290420 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7dbd585f5f-xpx95" podStartSLOduration=3.290399226 podStartE2EDuration="3.290399226s" podCreationTimestamp="2025-12-06 06:03:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:03:14.283367421 +0000 UTC m=+1178.148578533" watchObservedRunningTime="2025-12-06 06:03:14.290399226 +0000 UTC m=+1178.155610337" Dec 06 06:03:15 crc kubenswrapper[4733]: I1206 06:03:15.281870 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3","Type":"ContainerStarted","Data":"fe9c8dc110bb36900d6866fc6c748e83f23f59fb1fdb6a130d705f7ae249423a"} Dec 06 06:03:22 crc kubenswrapper[4733]: I1206 06:03:22.015445 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7dbd585f5f-xpx95" Dec 06 06:03:22 crc kubenswrapper[4733]: I1206 06:03:22.066173 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cb95df969-pdwnz"] Dec 06 06:03:22 crc kubenswrapper[4733]: I1206 06:03:22.066739 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6cb95df969-pdwnz" podUID="2f21ac21-5975-4205-9564-c2cfe24bd9ea" containerName="dnsmasq-dns" containerID="cri-o://d7a3bc0d32b5f0a4d09449c68899e79a0af2bd04e23bd38c3a503690f26937ac" gracePeriod=10 Dec 06 06:03:22 crc kubenswrapper[4733]: I1206 06:03:22.097368 4733 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6cb95df969-pdwnz" 
podUID="2f21ac21-5975-4205-9564-c2cfe24bd9ea" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.194:5353: connect: connection refused" Dec 06 06:03:22 crc kubenswrapper[4733]: I1206 06:03:22.182143 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-c8bb97999-hr2v6"] Dec 06 06:03:22 crc kubenswrapper[4733]: I1206 06:03:22.184779 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c8bb97999-hr2v6" Dec 06 06:03:22 crc kubenswrapper[4733]: I1206 06:03:22.221158 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c8bb97999-hr2v6"] Dec 06 06:03:22 crc kubenswrapper[4733]: I1206 06:03:22.231065 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/470393bd-fe7a-49f4-90f0-3625e4bdb497-dns-svc\") pod \"dnsmasq-dns-c8bb97999-hr2v6\" (UID: \"470393bd-fe7a-49f4-90f0-3625e4bdb497\") " pod="openstack/dnsmasq-dns-c8bb97999-hr2v6" Dec 06 06:03:22 crc kubenswrapper[4733]: I1206 06:03:22.231165 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/470393bd-fe7a-49f4-90f0-3625e4bdb497-dns-swift-storage-0\") pod \"dnsmasq-dns-c8bb97999-hr2v6\" (UID: \"470393bd-fe7a-49f4-90f0-3625e4bdb497\") " pod="openstack/dnsmasq-dns-c8bb97999-hr2v6" Dec 06 06:03:22 crc kubenswrapper[4733]: I1206 06:03:22.231206 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/470393bd-fe7a-49f4-90f0-3625e4bdb497-config\") pod \"dnsmasq-dns-c8bb97999-hr2v6\" (UID: \"470393bd-fe7a-49f4-90f0-3625e4bdb497\") " pod="openstack/dnsmasq-dns-c8bb97999-hr2v6" Dec 06 06:03:22 crc kubenswrapper[4733]: I1206 06:03:22.231530 4733 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/470393bd-fe7a-49f4-90f0-3625e4bdb497-ovsdbserver-sb\") pod \"dnsmasq-dns-c8bb97999-hr2v6\" (UID: \"470393bd-fe7a-49f4-90f0-3625e4bdb497\") " pod="openstack/dnsmasq-dns-c8bb97999-hr2v6" Dec 06 06:03:22 crc kubenswrapper[4733]: I1206 06:03:22.231588 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/470393bd-fe7a-49f4-90f0-3625e4bdb497-openstack-edpm-ipam\") pod \"dnsmasq-dns-c8bb97999-hr2v6\" (UID: \"470393bd-fe7a-49f4-90f0-3625e4bdb497\") " pod="openstack/dnsmasq-dns-c8bb97999-hr2v6" Dec 06 06:03:22 crc kubenswrapper[4733]: I1206 06:03:22.231617 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqv88\" (UniqueName: \"kubernetes.io/projected/470393bd-fe7a-49f4-90f0-3625e4bdb497-kube-api-access-bqv88\") pod \"dnsmasq-dns-c8bb97999-hr2v6\" (UID: \"470393bd-fe7a-49f4-90f0-3625e4bdb497\") " pod="openstack/dnsmasq-dns-c8bb97999-hr2v6" Dec 06 06:03:22 crc kubenswrapper[4733]: I1206 06:03:22.231786 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/470393bd-fe7a-49f4-90f0-3625e4bdb497-ovsdbserver-nb\") pod \"dnsmasq-dns-c8bb97999-hr2v6\" (UID: \"470393bd-fe7a-49f4-90f0-3625e4bdb497\") " pod="openstack/dnsmasq-dns-c8bb97999-hr2v6" Dec 06 06:03:22 crc kubenswrapper[4733]: I1206 06:03:22.334440 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/470393bd-fe7a-49f4-90f0-3625e4bdb497-ovsdbserver-nb\") pod \"dnsmasq-dns-c8bb97999-hr2v6\" (UID: \"470393bd-fe7a-49f4-90f0-3625e4bdb497\") " pod="openstack/dnsmasq-dns-c8bb97999-hr2v6" Dec 06 06:03:22 crc kubenswrapper[4733]: I1206 
06:03:22.334607 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/470393bd-fe7a-49f4-90f0-3625e4bdb497-dns-svc\") pod \"dnsmasq-dns-c8bb97999-hr2v6\" (UID: \"470393bd-fe7a-49f4-90f0-3625e4bdb497\") " pod="openstack/dnsmasq-dns-c8bb97999-hr2v6" Dec 06 06:03:22 crc kubenswrapper[4733]: I1206 06:03:22.334685 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/470393bd-fe7a-49f4-90f0-3625e4bdb497-dns-swift-storage-0\") pod \"dnsmasq-dns-c8bb97999-hr2v6\" (UID: \"470393bd-fe7a-49f4-90f0-3625e4bdb497\") " pod="openstack/dnsmasq-dns-c8bb97999-hr2v6" Dec 06 06:03:22 crc kubenswrapper[4733]: I1206 06:03:22.334729 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/470393bd-fe7a-49f4-90f0-3625e4bdb497-config\") pod \"dnsmasq-dns-c8bb97999-hr2v6\" (UID: \"470393bd-fe7a-49f4-90f0-3625e4bdb497\") " pod="openstack/dnsmasq-dns-c8bb97999-hr2v6" Dec 06 06:03:22 crc kubenswrapper[4733]: I1206 06:03:22.334913 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/470393bd-fe7a-49f4-90f0-3625e4bdb497-ovsdbserver-sb\") pod \"dnsmasq-dns-c8bb97999-hr2v6\" (UID: \"470393bd-fe7a-49f4-90f0-3625e4bdb497\") " pod="openstack/dnsmasq-dns-c8bb97999-hr2v6" Dec 06 06:03:22 crc kubenswrapper[4733]: I1206 06:03:22.334938 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/470393bd-fe7a-49f4-90f0-3625e4bdb497-openstack-edpm-ipam\") pod \"dnsmasq-dns-c8bb97999-hr2v6\" (UID: \"470393bd-fe7a-49f4-90f0-3625e4bdb497\") " pod="openstack/dnsmasq-dns-c8bb97999-hr2v6" Dec 06 06:03:22 crc kubenswrapper[4733]: I1206 06:03:22.334963 4733 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-bqv88\" (UniqueName: \"kubernetes.io/projected/470393bd-fe7a-49f4-90f0-3625e4bdb497-kube-api-access-bqv88\") pod \"dnsmasq-dns-c8bb97999-hr2v6\" (UID: \"470393bd-fe7a-49f4-90f0-3625e4bdb497\") " pod="openstack/dnsmasq-dns-c8bb97999-hr2v6" Dec 06 06:03:22 crc kubenswrapper[4733]: I1206 06:03:22.335293 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/470393bd-fe7a-49f4-90f0-3625e4bdb497-ovsdbserver-nb\") pod \"dnsmasq-dns-c8bb97999-hr2v6\" (UID: \"470393bd-fe7a-49f4-90f0-3625e4bdb497\") " pod="openstack/dnsmasq-dns-c8bb97999-hr2v6" Dec 06 06:03:22 crc kubenswrapper[4733]: I1206 06:03:22.335674 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/470393bd-fe7a-49f4-90f0-3625e4bdb497-dns-svc\") pod \"dnsmasq-dns-c8bb97999-hr2v6\" (UID: \"470393bd-fe7a-49f4-90f0-3625e4bdb497\") " pod="openstack/dnsmasq-dns-c8bb97999-hr2v6" Dec 06 06:03:22 crc kubenswrapper[4733]: I1206 06:03:22.335673 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/470393bd-fe7a-49f4-90f0-3625e4bdb497-dns-swift-storage-0\") pod \"dnsmasq-dns-c8bb97999-hr2v6\" (UID: \"470393bd-fe7a-49f4-90f0-3625e4bdb497\") " pod="openstack/dnsmasq-dns-c8bb97999-hr2v6" Dec 06 06:03:22 crc kubenswrapper[4733]: I1206 06:03:22.336040 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/470393bd-fe7a-49f4-90f0-3625e4bdb497-ovsdbserver-sb\") pod \"dnsmasq-dns-c8bb97999-hr2v6\" (UID: \"470393bd-fe7a-49f4-90f0-3625e4bdb497\") " pod="openstack/dnsmasq-dns-c8bb97999-hr2v6" Dec 06 06:03:22 crc kubenswrapper[4733]: I1206 06:03:22.336615 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/configmap/470393bd-fe7a-49f4-90f0-3625e4bdb497-openstack-edpm-ipam\") pod \"dnsmasq-dns-c8bb97999-hr2v6\" (UID: \"470393bd-fe7a-49f4-90f0-3625e4bdb497\") " pod="openstack/dnsmasq-dns-c8bb97999-hr2v6" Dec 06 06:03:22 crc kubenswrapper[4733]: I1206 06:03:22.336705 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/470393bd-fe7a-49f4-90f0-3625e4bdb497-config\") pod \"dnsmasq-dns-c8bb97999-hr2v6\" (UID: \"470393bd-fe7a-49f4-90f0-3625e4bdb497\") " pod="openstack/dnsmasq-dns-c8bb97999-hr2v6" Dec 06 06:03:22 crc kubenswrapper[4733]: I1206 06:03:22.350404 4733 generic.go:334] "Generic (PLEG): container finished" podID="2f21ac21-5975-4205-9564-c2cfe24bd9ea" containerID="d7a3bc0d32b5f0a4d09449c68899e79a0af2bd04e23bd38c3a503690f26937ac" exitCode=0 Dec 06 06:03:22 crc kubenswrapper[4733]: I1206 06:03:22.350448 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cb95df969-pdwnz" event={"ID":"2f21ac21-5975-4205-9564-c2cfe24bd9ea","Type":"ContainerDied","Data":"d7a3bc0d32b5f0a4d09449c68899e79a0af2bd04e23bd38c3a503690f26937ac"} Dec 06 06:03:22 crc kubenswrapper[4733]: I1206 06:03:22.355983 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqv88\" (UniqueName: \"kubernetes.io/projected/470393bd-fe7a-49f4-90f0-3625e4bdb497-kube-api-access-bqv88\") pod \"dnsmasq-dns-c8bb97999-hr2v6\" (UID: \"470393bd-fe7a-49f4-90f0-3625e4bdb497\") " pod="openstack/dnsmasq-dns-c8bb97999-hr2v6" Dec 06 06:03:22 crc kubenswrapper[4733]: I1206 06:03:22.540394 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c8bb97999-hr2v6" Dec 06 06:03:22 crc kubenswrapper[4733]: I1206 06:03:22.549124 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6cb95df969-pdwnz" Dec 06 06:03:22 crc kubenswrapper[4733]: I1206 06:03:22.639909 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2f21ac21-5975-4205-9564-c2cfe24bd9ea-dns-swift-storage-0\") pod \"2f21ac21-5975-4205-9564-c2cfe24bd9ea\" (UID: \"2f21ac21-5975-4205-9564-c2cfe24bd9ea\") " Dec 06 06:03:22 crc kubenswrapper[4733]: I1206 06:03:22.640115 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f21ac21-5975-4205-9564-c2cfe24bd9ea-ovsdbserver-sb\") pod \"2f21ac21-5975-4205-9564-c2cfe24bd9ea\" (UID: \"2f21ac21-5975-4205-9564-c2cfe24bd9ea\") " Dec 06 06:03:22 crc kubenswrapper[4733]: I1206 06:03:22.640204 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8697p\" (UniqueName: \"kubernetes.io/projected/2f21ac21-5975-4205-9564-c2cfe24bd9ea-kube-api-access-8697p\") pod \"2f21ac21-5975-4205-9564-c2cfe24bd9ea\" (UID: \"2f21ac21-5975-4205-9564-c2cfe24bd9ea\") " Dec 06 06:03:22 crc kubenswrapper[4733]: I1206 06:03:22.640319 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f21ac21-5975-4205-9564-c2cfe24bd9ea-dns-svc\") pod \"2f21ac21-5975-4205-9564-c2cfe24bd9ea\" (UID: \"2f21ac21-5975-4205-9564-c2cfe24bd9ea\") " Dec 06 06:03:22 crc kubenswrapper[4733]: I1206 06:03:22.640344 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f21ac21-5975-4205-9564-c2cfe24bd9ea-ovsdbserver-nb\") pod \"2f21ac21-5975-4205-9564-c2cfe24bd9ea\" (UID: \"2f21ac21-5975-4205-9564-c2cfe24bd9ea\") " Dec 06 06:03:22 crc kubenswrapper[4733]: I1206 06:03:22.640404 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/2f21ac21-5975-4205-9564-c2cfe24bd9ea-config\") pod \"2f21ac21-5975-4205-9564-c2cfe24bd9ea\" (UID: \"2f21ac21-5975-4205-9564-c2cfe24bd9ea\") " Dec 06 06:03:22 crc kubenswrapper[4733]: I1206 06:03:22.649693 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f21ac21-5975-4205-9564-c2cfe24bd9ea-kube-api-access-8697p" (OuterVolumeSpecName: "kube-api-access-8697p") pod "2f21ac21-5975-4205-9564-c2cfe24bd9ea" (UID: "2f21ac21-5975-4205-9564-c2cfe24bd9ea"). InnerVolumeSpecName "kube-api-access-8697p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:03:22 crc kubenswrapper[4733]: I1206 06:03:22.679321 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f21ac21-5975-4205-9564-c2cfe24bd9ea-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2f21ac21-5975-4205-9564-c2cfe24bd9ea" (UID: "2f21ac21-5975-4205-9564-c2cfe24bd9ea"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:03:22 crc kubenswrapper[4733]: I1206 06:03:22.681124 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f21ac21-5975-4205-9564-c2cfe24bd9ea-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2f21ac21-5975-4205-9564-c2cfe24bd9ea" (UID: "2f21ac21-5975-4205-9564-c2cfe24bd9ea"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:03:22 crc kubenswrapper[4733]: I1206 06:03:22.681385 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f21ac21-5975-4205-9564-c2cfe24bd9ea-config" (OuterVolumeSpecName: "config") pod "2f21ac21-5975-4205-9564-c2cfe24bd9ea" (UID: "2f21ac21-5975-4205-9564-c2cfe24bd9ea"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:03:22 crc kubenswrapper[4733]: I1206 06:03:22.682247 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f21ac21-5975-4205-9564-c2cfe24bd9ea-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2f21ac21-5975-4205-9564-c2cfe24bd9ea" (UID: "2f21ac21-5975-4205-9564-c2cfe24bd9ea"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:03:22 crc kubenswrapper[4733]: I1206 06:03:22.682776 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f21ac21-5975-4205-9564-c2cfe24bd9ea-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2f21ac21-5975-4205-9564-c2cfe24bd9ea" (UID: "2f21ac21-5975-4205-9564-c2cfe24bd9ea"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:03:22 crc kubenswrapper[4733]: I1206 06:03:22.744251 4733 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f21ac21-5975-4205-9564-c2cfe24bd9ea-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 06:03:22 crc kubenswrapper[4733]: I1206 06:03:22.744277 4733 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f21ac21-5975-4205-9564-c2cfe24bd9ea-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 06 06:03:22 crc kubenswrapper[4733]: I1206 06:03:22.744290 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f21ac21-5975-4205-9564-c2cfe24bd9ea-config\") on node \"crc\" DevicePath \"\"" Dec 06 06:03:22 crc kubenswrapper[4733]: I1206 06:03:22.744299 4733 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2f21ac21-5975-4205-9564-c2cfe24bd9ea-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 06 
06:03:22 crc kubenswrapper[4733]: I1206 06:03:22.744324 4733 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f21ac21-5975-4205-9564-c2cfe24bd9ea-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 06 06:03:22 crc kubenswrapper[4733]: I1206 06:03:22.744332 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8697p\" (UniqueName: \"kubernetes.io/projected/2f21ac21-5975-4205-9564-c2cfe24bd9ea-kube-api-access-8697p\") on node \"crc\" DevicePath \"\"" Dec 06 06:03:22 crc kubenswrapper[4733]: I1206 06:03:22.967131 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c8bb97999-hr2v6"] Dec 06 06:03:23 crc kubenswrapper[4733]: I1206 06:03:23.366163 4733 generic.go:334] "Generic (PLEG): container finished" podID="470393bd-fe7a-49f4-90f0-3625e4bdb497" containerID="b44274c6bdad02a07f624012b799e2826dad645c7b102e551b8ec38ebacb1c15" exitCode=0 Dec 06 06:03:23 crc kubenswrapper[4733]: I1206 06:03:23.366227 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c8bb97999-hr2v6" event={"ID":"470393bd-fe7a-49f4-90f0-3625e4bdb497","Type":"ContainerDied","Data":"b44274c6bdad02a07f624012b799e2826dad645c7b102e551b8ec38ebacb1c15"} Dec 06 06:03:23 crc kubenswrapper[4733]: I1206 06:03:23.366586 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c8bb97999-hr2v6" event={"ID":"470393bd-fe7a-49f4-90f0-3625e4bdb497","Type":"ContainerStarted","Data":"e389229ec800051652e665513a1825431d687a8159ca3b65faf3f83a0282bd20"} Dec 06 06:03:23 crc kubenswrapper[4733]: I1206 06:03:23.370297 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cb95df969-pdwnz" event={"ID":"2f21ac21-5975-4205-9564-c2cfe24bd9ea","Type":"ContainerDied","Data":"2e8ad2760129bc7a602492378f192f5576e03b47ce23e11b1c7fa9ccd608fed7"} Dec 06 06:03:23 crc kubenswrapper[4733]: I1206 06:03:23.370367 4733 scope.go:117] "RemoveContainer" 
containerID="d7a3bc0d32b5f0a4d09449c68899e79a0af2bd04e23bd38c3a503690f26937ac" Dec 06 06:03:23 crc kubenswrapper[4733]: I1206 06:03:23.370412 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cb95df969-pdwnz" Dec 06 06:03:23 crc kubenswrapper[4733]: I1206 06:03:23.510434 4733 scope.go:117] "RemoveContainer" containerID="b54dae8a87702e821d4a6dcaa1c7f570637285800099616179f2c53594853b92" Dec 06 06:03:23 crc kubenswrapper[4733]: I1206 06:03:23.542389 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cb95df969-pdwnz"] Dec 06 06:03:23 crc kubenswrapper[4733]: I1206 06:03:23.548717 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6cb95df969-pdwnz"] Dec 06 06:03:24 crc kubenswrapper[4733]: I1206 06:03:24.383742 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c8bb97999-hr2v6" event={"ID":"470393bd-fe7a-49f4-90f0-3625e4bdb497","Type":"ContainerStarted","Data":"7f8b25a11e9d727c55f3da999620ac75606775f3fb980cea605418f5e37f4c56"} Dec 06 06:03:24 crc kubenswrapper[4733]: I1206 06:03:24.384109 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-c8bb97999-hr2v6" Dec 06 06:03:24 crc kubenswrapper[4733]: I1206 06:03:24.406115 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-c8bb97999-hr2v6" podStartSLOduration=2.406089281 podStartE2EDuration="2.406089281s" podCreationTimestamp="2025-12-06 06:03:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:03:24.401861058 +0000 UTC m=+1188.267072170" watchObservedRunningTime="2025-12-06 06:03:24.406089281 +0000 UTC m=+1188.271300382" Dec 06 06:03:24 crc kubenswrapper[4733]: I1206 06:03:24.495576 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="2f21ac21-5975-4205-9564-c2cfe24bd9ea" path="/var/lib/kubelet/pods/2f21ac21-5975-4205-9564-c2cfe24bd9ea/volumes" Dec 06 06:03:32 crc kubenswrapper[4733]: I1206 06:03:32.541419 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-c8bb97999-hr2v6" Dec 06 06:03:32 crc kubenswrapper[4733]: I1206 06:03:32.593605 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7dbd585f5f-xpx95"] Dec 06 06:03:32 crc kubenswrapper[4733]: I1206 06:03:32.594081 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7dbd585f5f-xpx95" podUID="383128e5-756d-4021-bcd3-023b5fa02849" containerName="dnsmasq-dns" containerID="cri-o://af37cb8e66a9582a0bedf20a5924522e330c2d426e442320ac774f1b536aaf3c" gracePeriod=10 Dec 06 06:03:33 crc kubenswrapper[4733]: I1206 06:03:33.080557 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7dbd585f5f-xpx95" Dec 06 06:03:33 crc kubenswrapper[4733]: I1206 06:03:33.149933 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/383128e5-756d-4021-bcd3-023b5fa02849-config\") pod \"383128e5-756d-4021-bcd3-023b5fa02849\" (UID: \"383128e5-756d-4021-bcd3-023b5fa02849\") " Dec 06 06:03:33 crc kubenswrapper[4733]: I1206 06:03:33.149979 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/383128e5-756d-4021-bcd3-023b5fa02849-dns-swift-storage-0\") pod \"383128e5-756d-4021-bcd3-023b5fa02849\" (UID: \"383128e5-756d-4021-bcd3-023b5fa02849\") " Dec 06 06:03:33 crc kubenswrapper[4733]: I1206 06:03:33.150159 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/383128e5-756d-4021-bcd3-023b5fa02849-ovsdbserver-nb\") pod 
\"383128e5-756d-4021-bcd3-023b5fa02849\" (UID: \"383128e5-756d-4021-bcd3-023b5fa02849\") " Dec 06 06:03:33 crc kubenswrapper[4733]: I1206 06:03:33.150246 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/383128e5-756d-4021-bcd3-023b5fa02849-dns-svc\") pod \"383128e5-756d-4021-bcd3-023b5fa02849\" (UID: \"383128e5-756d-4021-bcd3-023b5fa02849\") " Dec 06 06:03:33 crc kubenswrapper[4733]: I1206 06:03:33.150384 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/383128e5-756d-4021-bcd3-023b5fa02849-openstack-edpm-ipam\") pod \"383128e5-756d-4021-bcd3-023b5fa02849\" (UID: \"383128e5-756d-4021-bcd3-023b5fa02849\") " Dec 06 06:03:33 crc kubenswrapper[4733]: I1206 06:03:33.150419 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/383128e5-756d-4021-bcd3-023b5fa02849-ovsdbserver-sb\") pod \"383128e5-756d-4021-bcd3-023b5fa02849\" (UID: \"383128e5-756d-4021-bcd3-023b5fa02849\") " Dec 06 06:03:33 crc kubenswrapper[4733]: I1206 06:03:33.150448 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j62qc\" (UniqueName: \"kubernetes.io/projected/383128e5-756d-4021-bcd3-023b5fa02849-kube-api-access-j62qc\") pod \"383128e5-756d-4021-bcd3-023b5fa02849\" (UID: \"383128e5-756d-4021-bcd3-023b5fa02849\") " Dec 06 06:03:33 crc kubenswrapper[4733]: I1206 06:03:33.162575 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/383128e5-756d-4021-bcd3-023b5fa02849-kube-api-access-j62qc" (OuterVolumeSpecName: "kube-api-access-j62qc") pod "383128e5-756d-4021-bcd3-023b5fa02849" (UID: "383128e5-756d-4021-bcd3-023b5fa02849"). InnerVolumeSpecName "kube-api-access-j62qc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:03:33 crc kubenswrapper[4733]: I1206 06:03:33.191820 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/383128e5-756d-4021-bcd3-023b5fa02849-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "383128e5-756d-4021-bcd3-023b5fa02849" (UID: "383128e5-756d-4021-bcd3-023b5fa02849"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:03:33 crc kubenswrapper[4733]: I1206 06:03:33.195902 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/383128e5-756d-4021-bcd3-023b5fa02849-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "383128e5-756d-4021-bcd3-023b5fa02849" (UID: "383128e5-756d-4021-bcd3-023b5fa02849"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:03:33 crc kubenswrapper[4733]: I1206 06:03:33.211131 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/383128e5-756d-4021-bcd3-023b5fa02849-config" (OuterVolumeSpecName: "config") pod "383128e5-756d-4021-bcd3-023b5fa02849" (UID: "383128e5-756d-4021-bcd3-023b5fa02849"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:03:33 crc kubenswrapper[4733]: I1206 06:03:33.213119 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/383128e5-756d-4021-bcd3-023b5fa02849-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "383128e5-756d-4021-bcd3-023b5fa02849" (UID: "383128e5-756d-4021-bcd3-023b5fa02849"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:03:33 crc kubenswrapper[4733]: I1206 06:03:33.213719 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/383128e5-756d-4021-bcd3-023b5fa02849-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "383128e5-756d-4021-bcd3-023b5fa02849" (UID: "383128e5-756d-4021-bcd3-023b5fa02849"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:03:33 crc kubenswrapper[4733]: I1206 06:03:33.233728 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/383128e5-756d-4021-bcd3-023b5fa02849-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "383128e5-756d-4021-bcd3-023b5fa02849" (UID: "383128e5-756d-4021-bcd3-023b5fa02849"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:03:33 crc kubenswrapper[4733]: I1206 06:03:33.254908 4733 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/383128e5-756d-4021-bcd3-023b5fa02849-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 06 06:03:33 crc kubenswrapper[4733]: I1206 06:03:33.254956 4733 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/383128e5-756d-4021-bcd3-023b5fa02849-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 06:03:33 crc kubenswrapper[4733]: I1206 06:03:33.254975 4733 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/383128e5-756d-4021-bcd3-023b5fa02849-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 06 06:03:33 crc kubenswrapper[4733]: I1206 06:03:33.254988 4733 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/383128e5-756d-4021-bcd3-023b5fa02849-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 06 
06:03:33 crc kubenswrapper[4733]: I1206 06:03:33.254999 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j62qc\" (UniqueName: \"kubernetes.io/projected/383128e5-756d-4021-bcd3-023b5fa02849-kube-api-access-j62qc\") on node \"crc\" DevicePath \"\"" Dec 06 06:03:33 crc kubenswrapper[4733]: I1206 06:03:33.255015 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/383128e5-756d-4021-bcd3-023b5fa02849-config\") on node \"crc\" DevicePath \"\"" Dec 06 06:03:33 crc kubenswrapper[4733]: I1206 06:03:33.255028 4733 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/383128e5-756d-4021-bcd3-023b5fa02849-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 06 06:03:33 crc kubenswrapper[4733]: I1206 06:03:33.487268 4733 generic.go:334] "Generic (PLEG): container finished" podID="383128e5-756d-4021-bcd3-023b5fa02849" containerID="af37cb8e66a9582a0bedf20a5924522e330c2d426e442320ac774f1b536aaf3c" exitCode=0 Dec 06 06:03:33 crc kubenswrapper[4733]: I1206 06:03:33.487341 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7dbd585f5f-xpx95" event={"ID":"383128e5-756d-4021-bcd3-023b5fa02849","Type":"ContainerDied","Data":"af37cb8e66a9582a0bedf20a5924522e330c2d426e442320ac774f1b536aaf3c"} Dec 06 06:03:33 crc kubenswrapper[4733]: I1206 06:03:33.488052 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7dbd585f5f-xpx95" event={"ID":"383128e5-756d-4021-bcd3-023b5fa02849","Type":"ContainerDied","Data":"c449a4733546d69eb9c486fff16023228983a440f07a3195daec08627441468f"} Dec 06 06:03:33 crc kubenswrapper[4733]: I1206 06:03:33.487396 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7dbd585f5f-xpx95" Dec 06 06:03:33 crc kubenswrapper[4733]: I1206 06:03:33.488106 4733 scope.go:117] "RemoveContainer" containerID="af37cb8e66a9582a0bedf20a5924522e330c2d426e442320ac774f1b536aaf3c" Dec 06 06:03:33 crc kubenswrapper[4733]: I1206 06:03:33.517160 4733 scope.go:117] "RemoveContainer" containerID="d5f27c53beab43e75a81b581ed2bdf16ca1fc6b889826a170c8d1e013c8b3643" Dec 06 06:03:33 crc kubenswrapper[4733]: I1206 06:03:33.521374 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7dbd585f5f-xpx95"] Dec 06 06:03:33 crc kubenswrapper[4733]: I1206 06:03:33.529614 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7dbd585f5f-xpx95"] Dec 06 06:03:33 crc kubenswrapper[4733]: I1206 06:03:33.546989 4733 scope.go:117] "RemoveContainer" containerID="af37cb8e66a9582a0bedf20a5924522e330c2d426e442320ac774f1b536aaf3c" Dec 06 06:03:33 crc kubenswrapper[4733]: E1206 06:03:33.547351 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af37cb8e66a9582a0bedf20a5924522e330c2d426e442320ac774f1b536aaf3c\": container with ID starting with af37cb8e66a9582a0bedf20a5924522e330c2d426e442320ac774f1b536aaf3c not found: ID does not exist" containerID="af37cb8e66a9582a0bedf20a5924522e330c2d426e442320ac774f1b536aaf3c" Dec 06 06:03:33 crc kubenswrapper[4733]: I1206 06:03:33.547400 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af37cb8e66a9582a0bedf20a5924522e330c2d426e442320ac774f1b536aaf3c"} err="failed to get container status \"af37cb8e66a9582a0bedf20a5924522e330c2d426e442320ac774f1b536aaf3c\": rpc error: code = NotFound desc = could not find container \"af37cb8e66a9582a0bedf20a5924522e330c2d426e442320ac774f1b536aaf3c\": container with ID starting with af37cb8e66a9582a0bedf20a5924522e330c2d426e442320ac774f1b536aaf3c not found: ID does not exist" Dec 06 
06:03:33 crc kubenswrapper[4733]: I1206 06:03:33.547425 4733 scope.go:117] "RemoveContainer" containerID="d5f27c53beab43e75a81b581ed2bdf16ca1fc6b889826a170c8d1e013c8b3643" Dec 06 06:03:33 crc kubenswrapper[4733]: E1206 06:03:33.547734 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5f27c53beab43e75a81b581ed2bdf16ca1fc6b889826a170c8d1e013c8b3643\": container with ID starting with d5f27c53beab43e75a81b581ed2bdf16ca1fc6b889826a170c8d1e013c8b3643 not found: ID does not exist" containerID="d5f27c53beab43e75a81b581ed2bdf16ca1fc6b889826a170c8d1e013c8b3643" Dec 06 06:03:33 crc kubenswrapper[4733]: I1206 06:03:33.547750 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5f27c53beab43e75a81b581ed2bdf16ca1fc6b889826a170c8d1e013c8b3643"} err="failed to get container status \"d5f27c53beab43e75a81b581ed2bdf16ca1fc6b889826a170c8d1e013c8b3643\": rpc error: code = NotFound desc = could not find container \"d5f27c53beab43e75a81b581ed2bdf16ca1fc6b889826a170c8d1e013c8b3643\": container with ID starting with d5f27c53beab43e75a81b581ed2bdf16ca1fc6b889826a170c8d1e013c8b3643 not found: ID does not exist" Dec 06 06:03:34 crc kubenswrapper[4733]: I1206 06:03:34.497362 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="383128e5-756d-4021-bcd3-023b5fa02849" path="/var/lib/kubelet/pods/383128e5-756d-4021-bcd3-023b5fa02849/volumes" Dec 06 06:03:42 crc kubenswrapper[4733]: I1206 06:03:42.989230 4733 patch_prober.go:28] interesting pod/machine-config-daemon-g7qjx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 06:03:42 crc kubenswrapper[4733]: I1206 06:03:42.990025 4733 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 06:03:45 crc kubenswrapper[4733]: I1206 06:03:45.619999 4733 generic.go:334] "Generic (PLEG): container finished" podID="0fa056f3-b465-4a24-9eb5-9a5f5932749c" containerID="21658e1edfbe562e4af0e0fad5e893b7ef04d22fa90beed74e05634650d3e5c7" exitCode=0 Dec 06 06:03:45 crc kubenswrapper[4733]: I1206 06:03:45.620074 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0fa056f3-b465-4a24-9eb5-9a5f5932749c","Type":"ContainerDied","Data":"21658e1edfbe562e4af0e0fad5e893b7ef04d22fa90beed74e05634650d3e5c7"} Dec 06 06:03:45 crc kubenswrapper[4733]: I1206 06:03:45.691967 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kw7zg"] Dec 06 06:03:45 crc kubenswrapper[4733]: E1206 06:03:45.692575 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f21ac21-5975-4205-9564-c2cfe24bd9ea" containerName="dnsmasq-dns" Dec 06 06:03:45 crc kubenswrapper[4733]: I1206 06:03:45.692592 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f21ac21-5975-4205-9564-c2cfe24bd9ea" containerName="dnsmasq-dns" Dec 06 06:03:45 crc kubenswrapper[4733]: E1206 06:03:45.692604 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="383128e5-756d-4021-bcd3-023b5fa02849" containerName="init" Dec 06 06:03:45 crc kubenswrapper[4733]: I1206 06:03:45.692611 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="383128e5-756d-4021-bcd3-023b5fa02849" containerName="init" Dec 06 06:03:45 crc kubenswrapper[4733]: E1206 06:03:45.692621 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f21ac21-5975-4205-9564-c2cfe24bd9ea" containerName="init" Dec 06 06:03:45 crc 
kubenswrapper[4733]: I1206 06:03:45.692628 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f21ac21-5975-4205-9564-c2cfe24bd9ea" containerName="init" Dec 06 06:03:45 crc kubenswrapper[4733]: E1206 06:03:45.692636 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="383128e5-756d-4021-bcd3-023b5fa02849" containerName="dnsmasq-dns" Dec 06 06:03:45 crc kubenswrapper[4733]: I1206 06:03:45.692640 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="383128e5-756d-4021-bcd3-023b5fa02849" containerName="dnsmasq-dns" Dec 06 06:03:45 crc kubenswrapper[4733]: I1206 06:03:45.692850 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f21ac21-5975-4205-9564-c2cfe24bd9ea" containerName="dnsmasq-dns" Dec 06 06:03:45 crc kubenswrapper[4733]: I1206 06:03:45.692866 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="383128e5-756d-4021-bcd3-023b5fa02849" containerName="dnsmasq-dns" Dec 06 06:03:45 crc kubenswrapper[4733]: I1206 06:03:45.693439 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kw7zg" Dec 06 06:03:45 crc kubenswrapper[4733]: I1206 06:03:45.695219 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 06 06:03:45 crc kubenswrapper[4733]: I1206 06:03:45.695619 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 06 06:03:45 crc kubenswrapper[4733]: I1206 06:03:45.695763 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 06 06:03:45 crc kubenswrapper[4733]: I1206 06:03:45.695952 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7jxr9" Dec 06 06:03:45 crc kubenswrapper[4733]: I1206 06:03:45.708465 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kw7zg"] Dec 06 06:03:45 crc kubenswrapper[4733]: I1206 06:03:45.808373 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e86f6dc2-6ce2-42b0-b4dd-e583d23f5f7c-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-kw7zg\" (UID: \"e86f6dc2-6ce2-42b0-b4dd-e583d23f5f7c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kw7zg" Dec 06 06:03:45 crc kubenswrapper[4733]: I1206 06:03:45.808474 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e86f6dc2-6ce2-42b0-b4dd-e583d23f5f7c-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-kw7zg\" (UID: \"e86f6dc2-6ce2-42b0-b4dd-e583d23f5f7c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kw7zg" Dec 06 06:03:45 crc kubenswrapper[4733]: I1206 06:03:45.808568 4733 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e86f6dc2-6ce2-42b0-b4dd-e583d23f5f7c-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-kw7zg\" (UID: \"e86f6dc2-6ce2-42b0-b4dd-e583d23f5f7c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kw7zg" Dec 06 06:03:45 crc kubenswrapper[4733]: I1206 06:03:45.808914 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdj8d\" (UniqueName: \"kubernetes.io/projected/e86f6dc2-6ce2-42b0-b4dd-e583d23f5f7c-kube-api-access-gdj8d\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-kw7zg\" (UID: \"e86f6dc2-6ce2-42b0-b4dd-e583d23f5f7c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kw7zg" Dec 06 06:03:45 crc kubenswrapper[4733]: I1206 06:03:45.911102 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e86f6dc2-6ce2-42b0-b4dd-e583d23f5f7c-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-kw7zg\" (UID: \"e86f6dc2-6ce2-42b0-b4dd-e583d23f5f7c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kw7zg" Dec 06 06:03:45 crc kubenswrapper[4733]: I1206 06:03:45.911176 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e86f6dc2-6ce2-42b0-b4dd-e583d23f5f7c-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-kw7zg\" (UID: \"e86f6dc2-6ce2-42b0-b4dd-e583d23f5f7c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kw7zg" Dec 06 06:03:45 crc kubenswrapper[4733]: I1206 06:03:45.911206 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e86f6dc2-6ce2-42b0-b4dd-e583d23f5f7c-ssh-key\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-kw7zg\" (UID: \"e86f6dc2-6ce2-42b0-b4dd-e583d23f5f7c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kw7zg" Dec 06 06:03:45 crc kubenswrapper[4733]: I1206 06:03:45.911284 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdj8d\" (UniqueName: \"kubernetes.io/projected/e86f6dc2-6ce2-42b0-b4dd-e583d23f5f7c-kube-api-access-gdj8d\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-kw7zg\" (UID: \"e86f6dc2-6ce2-42b0-b4dd-e583d23f5f7c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kw7zg" Dec 06 06:03:45 crc kubenswrapper[4733]: I1206 06:03:45.915750 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e86f6dc2-6ce2-42b0-b4dd-e583d23f5f7c-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-kw7zg\" (UID: \"e86f6dc2-6ce2-42b0-b4dd-e583d23f5f7c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kw7zg" Dec 06 06:03:45 crc kubenswrapper[4733]: I1206 06:03:45.915844 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e86f6dc2-6ce2-42b0-b4dd-e583d23f5f7c-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-kw7zg\" (UID: \"e86f6dc2-6ce2-42b0-b4dd-e583d23f5f7c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kw7zg" Dec 06 06:03:45 crc kubenswrapper[4733]: I1206 06:03:45.922405 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e86f6dc2-6ce2-42b0-b4dd-e583d23f5f7c-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-kw7zg\" (UID: \"e86f6dc2-6ce2-42b0-b4dd-e583d23f5f7c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kw7zg" Dec 06 06:03:45 crc kubenswrapper[4733]: I1206 06:03:45.925398 4733 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdj8d\" (UniqueName: \"kubernetes.io/projected/e86f6dc2-6ce2-42b0-b4dd-e583d23f5f7c-kube-api-access-gdj8d\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-kw7zg\" (UID: \"e86f6dc2-6ce2-42b0-b4dd-e583d23f5f7c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kw7zg" Dec 06 06:03:46 crc kubenswrapper[4733]: I1206 06:03:46.078605 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kw7zg" Dec 06 06:03:46 crc kubenswrapper[4733]: I1206 06:03:46.578633 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kw7zg"] Dec 06 06:03:46 crc kubenswrapper[4733]: W1206 06:03:46.586856 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode86f6dc2_6ce2_42b0_b4dd_e583d23f5f7c.slice/crio-28320a9e20d5b5f7fa366b1d182d8a2a35074d6588db741d879a57ac24b5a2a2 WatchSource:0}: Error finding container 28320a9e20d5b5f7fa366b1d182d8a2a35074d6588db741d879a57ac24b5a2a2: Status 404 returned error can't find the container with id 28320a9e20d5b5f7fa366b1d182d8a2a35074d6588db741d879a57ac24b5a2a2 Dec 06 06:03:46 crc kubenswrapper[4733]: I1206 06:03:46.590334 4733 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 06 06:03:46 crc kubenswrapper[4733]: I1206 06:03:46.633826 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0fa056f3-b465-4a24-9eb5-9a5f5932749c","Type":"ContainerStarted","Data":"825a1bf94cb0dc76a1fb230fce048013c3eb36843484c60b5036c20e13ec1b2f"} Dec 06 06:03:46 crc kubenswrapper[4733]: I1206 06:03:46.634054 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 06 06:03:46 crc kubenswrapper[4733]: I1206 
06:03:46.635133 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kw7zg" event={"ID":"e86f6dc2-6ce2-42b0-b4dd-e583d23f5f7c","Type":"ContainerStarted","Data":"28320a9e20d5b5f7fa366b1d182d8a2a35074d6588db741d879a57ac24b5a2a2"} Dec 06 06:03:46 crc kubenswrapper[4733]: I1206 06:03:46.637297 4733 generic.go:334] "Generic (PLEG): container finished" podID="45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3" containerID="fe9c8dc110bb36900d6866fc6c748e83f23f59fb1fdb6a130d705f7ae249423a" exitCode=0 Dec 06 06:03:46 crc kubenswrapper[4733]: I1206 06:03:46.637364 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3","Type":"ContainerDied","Data":"fe9c8dc110bb36900d6866fc6c748e83f23f59fb1fdb6a130d705f7ae249423a"} Dec 06 06:03:46 crc kubenswrapper[4733]: I1206 06:03:46.657961 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=35.657946493 podStartE2EDuration="35.657946493s" podCreationTimestamp="2025-12-06 06:03:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:03:46.652020046 +0000 UTC m=+1210.517231157" watchObservedRunningTime="2025-12-06 06:03:46.657946493 +0000 UTC m=+1210.523157594" Dec 06 06:03:47 crc kubenswrapper[4733]: I1206 06:03:47.659558 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3","Type":"ContainerStarted","Data":"8abcfba7c4b34e494bf3e332fdbdd466748c93ccbacbf54a954456c6fb2d4da3"} Dec 06 06:03:47 crc kubenswrapper[4733]: I1206 06:03:47.660217 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 06 06:03:47 crc kubenswrapper[4733]: I1206 06:03:47.706052 4733 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=35.706032103 podStartE2EDuration="35.706032103s" podCreationTimestamp="2025-12-06 06:03:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:03:47.681560217 +0000 UTC m=+1211.546771328" watchObservedRunningTime="2025-12-06 06:03:47.706032103 +0000 UTC m=+1211.571243213" Dec 06 06:03:58 crc kubenswrapper[4733]: I1206 06:03:58.793128 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kw7zg" event={"ID":"e86f6dc2-6ce2-42b0-b4dd-e583d23f5f7c","Type":"ContainerStarted","Data":"f62c5d2b71c6498f784ce65af7f46ebe1a8f83d2141f999bfbbcb534c7916fc0"} Dec 06 06:03:58 crc kubenswrapper[4733]: I1206 06:03:58.812620 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kw7zg" podStartSLOduration=2.277194627 podStartE2EDuration="13.812607047s" podCreationTimestamp="2025-12-06 06:03:45 +0000 UTC" firstStartedPulling="2025-12-06 06:03:46.590087445 +0000 UTC m=+1210.455298556" lastFinishedPulling="2025-12-06 06:03:58.125499875 +0000 UTC m=+1221.990710976" observedRunningTime="2025-12-06 06:03:58.809591846 +0000 UTC m=+1222.674802957" watchObservedRunningTime="2025-12-06 06:03:58.812607047 +0000 UTC m=+1222.677818158" Dec 06 06:04:01 crc kubenswrapper[4733]: I1206 06:04:01.677528 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 06 06:04:02 crc kubenswrapper[4733]: I1206 06:04:02.936543 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 06 06:04:09 crc kubenswrapper[4733]: I1206 06:04:09.911880 4733 generic.go:334] "Generic (PLEG): container finished" 
podID="e86f6dc2-6ce2-42b0-b4dd-e583d23f5f7c" containerID="f62c5d2b71c6498f784ce65af7f46ebe1a8f83d2141f999bfbbcb534c7916fc0" exitCode=0 Dec 06 06:04:09 crc kubenswrapper[4733]: I1206 06:04:09.911990 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kw7zg" event={"ID":"e86f6dc2-6ce2-42b0-b4dd-e583d23f5f7c","Type":"ContainerDied","Data":"f62c5d2b71c6498f784ce65af7f46ebe1a8f83d2141f999bfbbcb534c7916fc0"} Dec 06 06:04:11 crc kubenswrapper[4733]: I1206 06:04:11.286974 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kw7zg" Dec 06 06:04:11 crc kubenswrapper[4733]: I1206 06:04:11.431497 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e86f6dc2-6ce2-42b0-b4dd-e583d23f5f7c-repo-setup-combined-ca-bundle\") pod \"e86f6dc2-6ce2-42b0-b4dd-e583d23f5f7c\" (UID: \"e86f6dc2-6ce2-42b0-b4dd-e583d23f5f7c\") " Dec 06 06:04:11 crc kubenswrapper[4733]: I1206 06:04:11.431656 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdj8d\" (UniqueName: \"kubernetes.io/projected/e86f6dc2-6ce2-42b0-b4dd-e583d23f5f7c-kube-api-access-gdj8d\") pod \"e86f6dc2-6ce2-42b0-b4dd-e583d23f5f7c\" (UID: \"e86f6dc2-6ce2-42b0-b4dd-e583d23f5f7c\") " Dec 06 06:04:11 crc kubenswrapper[4733]: I1206 06:04:11.432048 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e86f6dc2-6ce2-42b0-b4dd-e583d23f5f7c-inventory\") pod \"e86f6dc2-6ce2-42b0-b4dd-e583d23f5f7c\" (UID: \"e86f6dc2-6ce2-42b0-b4dd-e583d23f5f7c\") " Dec 06 06:04:11 crc kubenswrapper[4733]: I1206 06:04:11.432088 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/e86f6dc2-6ce2-42b0-b4dd-e583d23f5f7c-ssh-key\") pod \"e86f6dc2-6ce2-42b0-b4dd-e583d23f5f7c\" (UID: \"e86f6dc2-6ce2-42b0-b4dd-e583d23f5f7c\") " Dec 06 06:04:11 crc kubenswrapper[4733]: I1206 06:04:11.438461 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e86f6dc2-6ce2-42b0-b4dd-e583d23f5f7c-kube-api-access-gdj8d" (OuterVolumeSpecName: "kube-api-access-gdj8d") pod "e86f6dc2-6ce2-42b0-b4dd-e583d23f5f7c" (UID: "e86f6dc2-6ce2-42b0-b4dd-e583d23f5f7c"). InnerVolumeSpecName "kube-api-access-gdj8d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:04:11 crc kubenswrapper[4733]: I1206 06:04:11.451348 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e86f6dc2-6ce2-42b0-b4dd-e583d23f5f7c-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "e86f6dc2-6ce2-42b0-b4dd-e583d23f5f7c" (UID: "e86f6dc2-6ce2-42b0-b4dd-e583d23f5f7c"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:04:11 crc kubenswrapper[4733]: I1206 06:04:11.457850 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e86f6dc2-6ce2-42b0-b4dd-e583d23f5f7c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e86f6dc2-6ce2-42b0-b4dd-e583d23f5f7c" (UID: "e86f6dc2-6ce2-42b0-b4dd-e583d23f5f7c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:04:11 crc kubenswrapper[4733]: I1206 06:04:11.458878 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e86f6dc2-6ce2-42b0-b4dd-e583d23f5f7c-inventory" (OuterVolumeSpecName: "inventory") pod "e86f6dc2-6ce2-42b0-b4dd-e583d23f5f7c" (UID: "e86f6dc2-6ce2-42b0-b4dd-e583d23f5f7c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:04:11 crc kubenswrapper[4733]: I1206 06:04:11.535258 4733 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e86f6dc2-6ce2-42b0-b4dd-e583d23f5f7c-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 06:04:11 crc kubenswrapper[4733]: I1206 06:04:11.535281 4733 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e86f6dc2-6ce2-42b0-b4dd-e583d23f5f7c-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 06:04:11 crc kubenswrapper[4733]: I1206 06:04:11.535291 4733 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e86f6dc2-6ce2-42b0-b4dd-e583d23f5f7c-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 06:04:11 crc kubenswrapper[4733]: I1206 06:04:11.535318 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gdj8d\" (UniqueName: \"kubernetes.io/projected/e86f6dc2-6ce2-42b0-b4dd-e583d23f5f7c-kube-api-access-gdj8d\") on node \"crc\" DevicePath \"\"" Dec 06 06:04:11 crc kubenswrapper[4733]: I1206 06:04:11.935544 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kw7zg" event={"ID":"e86f6dc2-6ce2-42b0-b4dd-e583d23f5f7c","Type":"ContainerDied","Data":"28320a9e20d5b5f7fa366b1d182d8a2a35074d6588db741d879a57ac24b5a2a2"} Dec 06 06:04:11 crc kubenswrapper[4733]: I1206 06:04:11.935978 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28320a9e20d5b5f7fa366b1d182d8a2a35074d6588db741d879a57ac24b5a2a2" Dec 06 06:04:11 crc kubenswrapper[4733]: I1206 06:04:11.935611 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kw7zg" Dec 06 06:04:11 crc kubenswrapper[4733]: I1206 06:04:11.998422 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-769q2"] Dec 06 06:04:11 crc kubenswrapper[4733]: E1206 06:04:11.998849 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e86f6dc2-6ce2-42b0-b4dd-e583d23f5f7c" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 06 06:04:11 crc kubenswrapper[4733]: I1206 06:04:11.998869 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="e86f6dc2-6ce2-42b0-b4dd-e583d23f5f7c" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 06 06:04:11 crc kubenswrapper[4733]: I1206 06:04:11.999069 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="e86f6dc2-6ce2-42b0-b4dd-e583d23f5f7c" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 06 06:04:11 crc kubenswrapper[4733]: I1206 06:04:11.999780 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-769q2" Dec 06 06:04:12 crc kubenswrapper[4733]: I1206 06:04:12.002067 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 06 06:04:12 crc kubenswrapper[4733]: I1206 06:04:12.002357 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7jxr9" Dec 06 06:04:12 crc kubenswrapper[4733]: I1206 06:04:12.002529 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 06 06:04:12 crc kubenswrapper[4733]: I1206 06:04:12.005094 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 06 06:04:12 crc kubenswrapper[4733]: I1206 06:04:12.013003 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-769q2"] Dec 06 06:04:12 crc kubenswrapper[4733]: I1206 06:04:12.147710 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4pn8\" (UniqueName: \"kubernetes.io/projected/c6d6b59f-d8e8-4f50-9100-a0c789e93a8a-kube-api-access-w4pn8\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-769q2\" (UID: \"c6d6b59f-d8e8-4f50-9100-a0c789e93a8a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-769q2" Dec 06 06:04:12 crc kubenswrapper[4733]: I1206 06:04:12.148022 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c6d6b59f-d8e8-4f50-9100-a0c789e93a8a-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-769q2\" (UID: \"c6d6b59f-d8e8-4f50-9100-a0c789e93a8a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-769q2" Dec 06 06:04:12 crc kubenswrapper[4733]: I1206 06:04:12.148426 4733 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c6d6b59f-d8e8-4f50-9100-a0c789e93a8a-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-769q2\" (UID: \"c6d6b59f-d8e8-4f50-9100-a0c789e93a8a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-769q2" Dec 06 06:04:12 crc kubenswrapper[4733]: I1206 06:04:12.250648 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c6d6b59f-d8e8-4f50-9100-a0c789e93a8a-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-769q2\" (UID: \"c6d6b59f-d8e8-4f50-9100-a0c789e93a8a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-769q2" Dec 06 06:04:12 crc kubenswrapper[4733]: I1206 06:04:12.250799 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c6d6b59f-d8e8-4f50-9100-a0c789e93a8a-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-769q2\" (UID: \"c6d6b59f-d8e8-4f50-9100-a0c789e93a8a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-769q2" Dec 06 06:04:12 crc kubenswrapper[4733]: I1206 06:04:12.250934 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4pn8\" (UniqueName: \"kubernetes.io/projected/c6d6b59f-d8e8-4f50-9100-a0c789e93a8a-kube-api-access-w4pn8\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-769q2\" (UID: \"c6d6b59f-d8e8-4f50-9100-a0c789e93a8a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-769q2" Dec 06 06:04:12 crc kubenswrapper[4733]: I1206 06:04:12.255464 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c6d6b59f-d8e8-4f50-9100-a0c789e93a8a-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-769q2\" (UID: \"c6d6b59f-d8e8-4f50-9100-a0c789e93a8a\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-769q2" Dec 06 06:04:12 crc kubenswrapper[4733]: I1206 06:04:12.256483 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c6d6b59f-d8e8-4f50-9100-a0c789e93a8a-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-769q2\" (UID: \"c6d6b59f-d8e8-4f50-9100-a0c789e93a8a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-769q2" Dec 06 06:04:12 crc kubenswrapper[4733]: I1206 06:04:12.267107 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4pn8\" (UniqueName: \"kubernetes.io/projected/c6d6b59f-d8e8-4f50-9100-a0c789e93a8a-kube-api-access-w4pn8\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-769q2\" (UID: \"c6d6b59f-d8e8-4f50-9100-a0c789e93a8a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-769q2" Dec 06 06:04:12 crc kubenswrapper[4733]: I1206 06:04:12.315894 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-769q2" Dec 06 06:04:12 crc kubenswrapper[4733]: I1206 06:04:12.924738 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-769q2"] Dec 06 06:04:12 crc kubenswrapper[4733]: I1206 06:04:12.950862 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-769q2" event={"ID":"c6d6b59f-d8e8-4f50-9100-a0c789e93a8a","Type":"ContainerStarted","Data":"f173e5418c399749bbd76d4af8b01654296cf12e111b4ebe5eff325fadda073f"} Dec 06 06:04:12 crc kubenswrapper[4733]: I1206 06:04:12.990013 4733 patch_prober.go:28] interesting pod/machine-config-daemon-g7qjx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 06:04:12 crc kubenswrapper[4733]: I1206 06:04:12.990062 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 06:04:13 crc kubenswrapper[4733]: I1206 06:04:13.966115 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-769q2" event={"ID":"c6d6b59f-d8e8-4f50-9100-a0c789e93a8a","Type":"ContainerStarted","Data":"abd7ca8f4254256e605b4dc71be410d63d9407c87721332eb439015b02779ff3"} Dec 06 06:04:13 crc kubenswrapper[4733]: I1206 06:04:13.998532 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-769q2" podStartSLOduration=2.493121451 podStartE2EDuration="2.998482297s" podCreationTimestamp="2025-12-06 
06:04:11 +0000 UTC" firstStartedPulling="2025-12-06 06:04:12.932904338 +0000 UTC m=+1236.798115449" lastFinishedPulling="2025-12-06 06:04:13.438265184 +0000 UTC m=+1237.303476295" observedRunningTime="2025-12-06 06:04:13.985615591 +0000 UTC m=+1237.850826702" watchObservedRunningTime="2025-12-06 06:04:13.998482297 +0000 UTC m=+1237.863693407" Dec 06 06:04:15 crc kubenswrapper[4733]: I1206 06:04:15.985475 4733 generic.go:334] "Generic (PLEG): container finished" podID="c6d6b59f-d8e8-4f50-9100-a0c789e93a8a" containerID="abd7ca8f4254256e605b4dc71be410d63d9407c87721332eb439015b02779ff3" exitCode=0 Dec 06 06:04:15 crc kubenswrapper[4733]: I1206 06:04:15.985614 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-769q2" event={"ID":"c6d6b59f-d8e8-4f50-9100-a0c789e93a8a","Type":"ContainerDied","Data":"abd7ca8f4254256e605b4dc71be410d63d9407c87721332eb439015b02779ff3"} Dec 06 06:04:17 crc kubenswrapper[4733]: I1206 06:04:17.379266 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-769q2" Dec 06 06:04:17 crc kubenswrapper[4733]: I1206 06:04:17.458798 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c6d6b59f-d8e8-4f50-9100-a0c789e93a8a-ssh-key\") pod \"c6d6b59f-d8e8-4f50-9100-a0c789e93a8a\" (UID: \"c6d6b59f-d8e8-4f50-9100-a0c789e93a8a\") " Dec 06 06:04:17 crc kubenswrapper[4733]: I1206 06:04:17.459059 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4pn8\" (UniqueName: \"kubernetes.io/projected/c6d6b59f-d8e8-4f50-9100-a0c789e93a8a-kube-api-access-w4pn8\") pod \"c6d6b59f-d8e8-4f50-9100-a0c789e93a8a\" (UID: \"c6d6b59f-d8e8-4f50-9100-a0c789e93a8a\") " Dec 06 06:04:17 crc kubenswrapper[4733]: I1206 06:04:17.459249 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c6d6b59f-d8e8-4f50-9100-a0c789e93a8a-inventory\") pod \"c6d6b59f-d8e8-4f50-9100-a0c789e93a8a\" (UID: \"c6d6b59f-d8e8-4f50-9100-a0c789e93a8a\") " Dec 06 06:04:17 crc kubenswrapper[4733]: I1206 06:04:17.467513 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6d6b59f-d8e8-4f50-9100-a0c789e93a8a-kube-api-access-w4pn8" (OuterVolumeSpecName: "kube-api-access-w4pn8") pod "c6d6b59f-d8e8-4f50-9100-a0c789e93a8a" (UID: "c6d6b59f-d8e8-4f50-9100-a0c789e93a8a"). InnerVolumeSpecName "kube-api-access-w4pn8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:04:17 crc kubenswrapper[4733]: I1206 06:04:17.489919 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6d6b59f-d8e8-4f50-9100-a0c789e93a8a-inventory" (OuterVolumeSpecName: "inventory") pod "c6d6b59f-d8e8-4f50-9100-a0c789e93a8a" (UID: "c6d6b59f-d8e8-4f50-9100-a0c789e93a8a"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:04:17 crc kubenswrapper[4733]: I1206 06:04:17.492224 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6d6b59f-d8e8-4f50-9100-a0c789e93a8a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c6d6b59f-d8e8-4f50-9100-a0c789e93a8a" (UID: "c6d6b59f-d8e8-4f50-9100-a0c789e93a8a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:04:17 crc kubenswrapper[4733]: I1206 06:04:17.562460 4733 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c6d6b59f-d8e8-4f50-9100-a0c789e93a8a-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 06:04:17 crc kubenswrapper[4733]: I1206 06:04:17.562613 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4pn8\" (UniqueName: \"kubernetes.io/projected/c6d6b59f-d8e8-4f50-9100-a0c789e93a8a-kube-api-access-w4pn8\") on node \"crc\" DevicePath \"\"" Dec 06 06:04:17 crc kubenswrapper[4733]: I1206 06:04:17.562660 4733 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c6d6b59f-d8e8-4f50-9100-a0c789e93a8a-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 06:04:18 crc kubenswrapper[4733]: I1206 06:04:18.006464 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-769q2" event={"ID":"c6d6b59f-d8e8-4f50-9100-a0c789e93a8a","Type":"ContainerDied","Data":"f173e5418c399749bbd76d4af8b01654296cf12e111b4ebe5eff325fadda073f"} Dec 06 06:04:18 crc kubenswrapper[4733]: I1206 06:04:18.006538 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f173e5418c399749bbd76d4af8b01654296cf12e111b4ebe5eff325fadda073f" Dec 06 06:04:18 crc kubenswrapper[4733]: I1206 06:04:18.006572 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-769q2" Dec 06 06:04:18 crc kubenswrapper[4733]: I1206 06:04:18.078430 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j4wkv"] Dec 06 06:04:18 crc kubenswrapper[4733]: E1206 06:04:18.078980 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6d6b59f-d8e8-4f50-9100-a0c789e93a8a" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 06 06:04:18 crc kubenswrapper[4733]: I1206 06:04:18.079002 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6d6b59f-d8e8-4f50-9100-a0c789e93a8a" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 06 06:04:18 crc kubenswrapper[4733]: I1206 06:04:18.079218 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6d6b59f-d8e8-4f50-9100-a0c789e93a8a" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 06 06:04:18 crc kubenswrapper[4733]: I1206 06:04:18.080036 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j4wkv" Dec 06 06:04:18 crc kubenswrapper[4733]: I1206 06:04:18.083606 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 06 06:04:18 crc kubenswrapper[4733]: I1206 06:04:18.083878 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 06 06:04:18 crc kubenswrapper[4733]: I1206 06:04:18.084111 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7jxr9" Dec 06 06:04:18 crc kubenswrapper[4733]: I1206 06:04:18.086555 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 06 06:04:18 crc kubenswrapper[4733]: I1206 06:04:18.087529 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j4wkv"] Dec 06 06:04:18 crc kubenswrapper[4733]: I1206 06:04:18.173070 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/35843bd8-0d3b-485a-b88f-95933d4c559e-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-j4wkv\" (UID: \"35843bd8-0d3b-485a-b88f-95933d4c559e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j4wkv" Dec 06 06:04:18 crc kubenswrapper[4733]: I1206 06:04:18.173598 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35843bd8-0d3b-485a-b88f-95933d4c559e-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-j4wkv\" (UID: \"35843bd8-0d3b-485a-b88f-95933d4c559e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j4wkv" Dec 06 06:04:18 crc kubenswrapper[4733]: I1206 06:04:18.173677 4733 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27gng\" (UniqueName: \"kubernetes.io/projected/35843bd8-0d3b-485a-b88f-95933d4c559e-kube-api-access-27gng\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-j4wkv\" (UID: \"35843bd8-0d3b-485a-b88f-95933d4c559e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j4wkv" Dec 06 06:04:18 crc kubenswrapper[4733]: I1206 06:04:18.173883 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35843bd8-0d3b-485a-b88f-95933d4c559e-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-j4wkv\" (UID: \"35843bd8-0d3b-485a-b88f-95933d4c559e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j4wkv" Dec 06 06:04:18 crc kubenswrapper[4733]: I1206 06:04:18.275621 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/35843bd8-0d3b-485a-b88f-95933d4c559e-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-j4wkv\" (UID: \"35843bd8-0d3b-485a-b88f-95933d4c559e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j4wkv" Dec 06 06:04:18 crc kubenswrapper[4733]: I1206 06:04:18.276730 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35843bd8-0d3b-485a-b88f-95933d4c559e-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-j4wkv\" (UID: \"35843bd8-0d3b-485a-b88f-95933d4c559e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j4wkv" Dec 06 06:04:18 crc kubenswrapper[4733]: I1206 06:04:18.276798 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27gng\" (UniqueName: \"kubernetes.io/projected/35843bd8-0d3b-485a-b88f-95933d4c559e-kube-api-access-27gng\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-j4wkv\" (UID: \"35843bd8-0d3b-485a-b88f-95933d4c559e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j4wkv" Dec 06 06:04:18 crc kubenswrapper[4733]: I1206 06:04:18.276944 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35843bd8-0d3b-485a-b88f-95933d4c559e-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-j4wkv\" (UID: \"35843bd8-0d3b-485a-b88f-95933d4c559e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j4wkv" Dec 06 06:04:18 crc kubenswrapper[4733]: I1206 06:04:18.283020 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/35843bd8-0d3b-485a-b88f-95933d4c559e-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-j4wkv\" (UID: \"35843bd8-0d3b-485a-b88f-95933d4c559e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j4wkv" Dec 06 06:04:18 crc kubenswrapper[4733]: I1206 06:04:18.283282 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35843bd8-0d3b-485a-b88f-95933d4c559e-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-j4wkv\" (UID: \"35843bd8-0d3b-485a-b88f-95933d4c559e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j4wkv" Dec 06 06:04:18 crc kubenswrapper[4733]: I1206 06:04:18.284609 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35843bd8-0d3b-485a-b88f-95933d4c559e-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-j4wkv\" (UID: \"35843bd8-0d3b-485a-b88f-95933d4c559e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j4wkv" Dec 06 06:04:18 crc kubenswrapper[4733]: I1206 06:04:18.293388 4733 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27gng\" (UniqueName: \"kubernetes.io/projected/35843bd8-0d3b-485a-b88f-95933d4c559e-kube-api-access-27gng\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-j4wkv\" (UID: \"35843bd8-0d3b-485a-b88f-95933d4c559e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j4wkv" Dec 06 06:04:18 crc kubenswrapper[4733]: I1206 06:04:18.397119 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j4wkv" Dec 06 06:04:18 crc kubenswrapper[4733]: I1206 06:04:18.880748 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j4wkv"] Dec 06 06:04:19 crc kubenswrapper[4733]: I1206 06:04:19.018812 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j4wkv" event={"ID":"35843bd8-0d3b-485a-b88f-95933d4c559e","Type":"ContainerStarted","Data":"866d3a2a8e7b41082227ab54b738d6f13c0957611c9d50a9728bb0b13c514201"} Dec 06 06:04:20 crc kubenswrapper[4733]: I1206 06:04:20.042485 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j4wkv" event={"ID":"35843bd8-0d3b-485a-b88f-95933d4c559e","Type":"ContainerStarted","Data":"6afbabc455d499a46f342ef03d760be379b3ad896e2a5eb540852829f5afd7fc"} Dec 06 06:04:20 crc kubenswrapper[4733]: I1206 06:04:20.063974 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j4wkv" podStartSLOduration=1.5657336609999999 podStartE2EDuration="2.063953077s" podCreationTimestamp="2025-12-06 06:04:18 +0000 UTC" firstStartedPulling="2025-12-06 06:04:18.887510618 +0000 UTC m=+1242.752721729" lastFinishedPulling="2025-12-06 06:04:19.385730034 +0000 UTC m=+1243.250941145" observedRunningTime="2025-12-06 06:04:20.056733962 +0000 UTC 
m=+1243.921945073" watchObservedRunningTime="2025-12-06 06:04:20.063953077 +0000 UTC m=+1243.929164189" Dec 06 06:04:42 crc kubenswrapper[4733]: I1206 06:04:42.989157 4733 patch_prober.go:28] interesting pod/machine-config-daemon-g7qjx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 06:04:42 crc kubenswrapper[4733]: I1206 06:04:42.989697 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 06:04:42 crc kubenswrapper[4733]: I1206 06:04:42.989737 4733 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" Dec 06 06:04:42 crc kubenswrapper[4733]: I1206 06:04:42.990487 4733 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8b9c1cf2e876683db3ffd82323d6fa24cf8a792c4bcd35bbf88bc00105165298"} pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 06:04:42 crc kubenswrapper[4733]: I1206 06:04:42.990547 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerName="machine-config-daemon" containerID="cri-o://8b9c1cf2e876683db3ffd82323d6fa24cf8a792c4bcd35bbf88bc00105165298" gracePeriod=600 Dec 06 06:04:43 crc kubenswrapper[4733]: I1206 06:04:43.283403 4733 generic.go:334] "Generic (PLEG): container 
finished" podID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerID="8b9c1cf2e876683db3ffd82323d6fa24cf8a792c4bcd35bbf88bc00105165298" exitCode=0 Dec 06 06:04:43 crc kubenswrapper[4733]: I1206 06:04:43.283487 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" event={"ID":"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448","Type":"ContainerDied","Data":"8b9c1cf2e876683db3ffd82323d6fa24cf8a792c4bcd35bbf88bc00105165298"} Dec 06 06:04:43 crc kubenswrapper[4733]: I1206 06:04:43.283852 4733 scope.go:117] "RemoveContainer" containerID="6cf0b6c52f78a1f3c9cd0937561802a5aad13c9f84f0305358100261c2849c9f" Dec 06 06:04:44 crc kubenswrapper[4733]: I1206 06:04:44.295264 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" event={"ID":"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448","Type":"ContainerStarted","Data":"e047d5177f0f7fa0184a2c2e021a17064100f39c60892814195f10cb7a9620d9"} Dec 06 06:05:48 crc kubenswrapper[4733]: I1206 06:05:48.408855 4733 scope.go:117] "RemoveContainer" containerID="9fa56c693dc430d0748d3ef5f09bf1591d19ecf2d60884842978547bd6d8fe4d" Dec 06 06:05:48 crc kubenswrapper[4733]: I1206 06:05:48.434469 4733 scope.go:117] "RemoveContainer" containerID="302ef6fc16dcc0333d1e457e53a8362b71103d268db7749bef2db7459679525d" Dec 06 06:05:48 crc kubenswrapper[4733]: I1206 06:05:48.471247 4733 scope.go:117] "RemoveContainer" containerID="1d2f948c405e31df4b5efe4581472878fcb2ebb353a87bde24b07372abe5927e" Dec 06 06:07:12 crc kubenswrapper[4733]: I1206 06:07:12.989165 4733 patch_prober.go:28] interesting pod/machine-config-daemon-g7qjx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 06:07:12 crc kubenswrapper[4733]: I1206 06:07:12.991005 4733 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 06:07:26 crc kubenswrapper[4733]: I1206 06:07:26.849467 4733 generic.go:334] "Generic (PLEG): container finished" podID="35843bd8-0d3b-485a-b88f-95933d4c559e" containerID="6afbabc455d499a46f342ef03d760be379b3ad896e2a5eb540852829f5afd7fc" exitCode=0 Dec 06 06:07:26 crc kubenswrapper[4733]: I1206 06:07:26.849541 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j4wkv" event={"ID":"35843bd8-0d3b-485a-b88f-95933d4c559e","Type":"ContainerDied","Data":"6afbabc455d499a46f342ef03d760be379b3ad896e2a5eb540852829f5afd7fc"} Dec 06 06:07:28 crc kubenswrapper[4733]: I1206 06:07:28.224713 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j4wkv" Dec 06 06:07:28 crc kubenswrapper[4733]: I1206 06:07:28.379535 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27gng\" (UniqueName: \"kubernetes.io/projected/35843bd8-0d3b-485a-b88f-95933d4c559e-kube-api-access-27gng\") pod \"35843bd8-0d3b-485a-b88f-95933d4c559e\" (UID: \"35843bd8-0d3b-485a-b88f-95933d4c559e\") " Dec 06 06:07:28 crc kubenswrapper[4733]: I1206 06:07:28.379599 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/35843bd8-0d3b-485a-b88f-95933d4c559e-ssh-key\") pod \"35843bd8-0d3b-485a-b88f-95933d4c559e\" (UID: \"35843bd8-0d3b-485a-b88f-95933d4c559e\") " Dec 06 06:07:28 crc kubenswrapper[4733]: I1206 06:07:28.379681 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35843bd8-0d3b-485a-b88f-95933d4c559e-inventory\") pod \"35843bd8-0d3b-485a-b88f-95933d4c559e\" (UID: \"35843bd8-0d3b-485a-b88f-95933d4c559e\") " Dec 06 06:07:28 crc kubenswrapper[4733]: I1206 06:07:28.379787 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35843bd8-0d3b-485a-b88f-95933d4c559e-bootstrap-combined-ca-bundle\") pod \"35843bd8-0d3b-485a-b88f-95933d4c559e\" (UID: \"35843bd8-0d3b-485a-b88f-95933d4c559e\") " Dec 06 06:07:28 crc kubenswrapper[4733]: I1206 06:07:28.385914 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35843bd8-0d3b-485a-b88f-95933d4c559e-kube-api-access-27gng" (OuterVolumeSpecName: "kube-api-access-27gng") pod "35843bd8-0d3b-485a-b88f-95933d4c559e" (UID: "35843bd8-0d3b-485a-b88f-95933d4c559e"). InnerVolumeSpecName "kube-api-access-27gng". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:07:28 crc kubenswrapper[4733]: I1206 06:07:28.387042 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35843bd8-0d3b-485a-b88f-95933d4c559e-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "35843bd8-0d3b-485a-b88f-95933d4c559e" (UID: "35843bd8-0d3b-485a-b88f-95933d4c559e"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:07:28 crc kubenswrapper[4733]: I1206 06:07:28.407160 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35843bd8-0d3b-485a-b88f-95933d4c559e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "35843bd8-0d3b-485a-b88f-95933d4c559e" (UID: "35843bd8-0d3b-485a-b88f-95933d4c559e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:07:28 crc kubenswrapper[4733]: I1206 06:07:28.409907 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35843bd8-0d3b-485a-b88f-95933d4c559e-inventory" (OuterVolumeSpecName: "inventory") pod "35843bd8-0d3b-485a-b88f-95933d4c559e" (UID: "35843bd8-0d3b-485a-b88f-95933d4c559e"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:07:28 crc kubenswrapper[4733]: I1206 06:07:28.482145 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27gng\" (UniqueName: \"kubernetes.io/projected/35843bd8-0d3b-485a-b88f-95933d4c559e-kube-api-access-27gng\") on node \"crc\" DevicePath \"\"" Dec 06 06:07:28 crc kubenswrapper[4733]: I1206 06:07:28.482182 4733 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/35843bd8-0d3b-485a-b88f-95933d4c559e-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 06:07:28 crc kubenswrapper[4733]: I1206 06:07:28.482195 4733 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35843bd8-0d3b-485a-b88f-95933d4c559e-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 06:07:28 crc kubenswrapper[4733]: I1206 06:07:28.482208 4733 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35843bd8-0d3b-485a-b88f-95933d4c559e-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 06:07:28 crc kubenswrapper[4733]: I1206 06:07:28.872991 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j4wkv" event={"ID":"35843bd8-0d3b-485a-b88f-95933d4c559e","Type":"ContainerDied","Data":"866d3a2a8e7b41082227ab54b738d6f13c0957611c9d50a9728bb0b13c514201"} Dec 06 06:07:28 crc kubenswrapper[4733]: I1206 06:07:28.873058 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="866d3a2a8e7b41082227ab54b738d6f13c0957611c9d50a9728bb0b13c514201" Dec 06 06:07:28 crc kubenswrapper[4733]: I1206 06:07:28.873485 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j4wkv" Dec 06 06:07:28 crc kubenswrapper[4733]: I1206 06:07:28.944889 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-t7mgk"] Dec 06 06:07:28 crc kubenswrapper[4733]: E1206 06:07:28.945448 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35843bd8-0d3b-485a-b88f-95933d4c559e" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 06 06:07:28 crc kubenswrapper[4733]: I1206 06:07:28.945471 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="35843bd8-0d3b-485a-b88f-95933d4c559e" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 06 06:07:28 crc kubenswrapper[4733]: I1206 06:07:28.945728 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="35843bd8-0d3b-485a-b88f-95933d4c559e" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 06 06:07:28 crc kubenswrapper[4733]: I1206 06:07:28.946547 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-t7mgk" Dec 06 06:07:28 crc kubenswrapper[4733]: I1206 06:07:28.949597 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 06 06:07:28 crc kubenswrapper[4733]: I1206 06:07:28.949894 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 06 06:07:28 crc kubenswrapper[4733]: I1206 06:07:28.950379 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7jxr9" Dec 06 06:07:28 crc kubenswrapper[4733]: I1206 06:07:28.950419 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 06 06:07:28 crc kubenswrapper[4733]: I1206 06:07:28.963739 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-t7mgk"] Dec 06 06:07:28 crc kubenswrapper[4733]: I1206 06:07:28.993445 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/833ce9dd-3791-4da1-9f16-fb8db6d4c205-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-t7mgk\" (UID: \"833ce9dd-3791-4da1-9f16-fb8db6d4c205\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-t7mgk" Dec 06 06:07:28 crc kubenswrapper[4733]: I1206 06:07:28.993724 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plg46\" (UniqueName: \"kubernetes.io/projected/833ce9dd-3791-4da1-9f16-fb8db6d4c205-kube-api-access-plg46\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-t7mgk\" (UID: \"833ce9dd-3791-4da1-9f16-fb8db6d4c205\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-t7mgk" Dec 06 06:07:28 crc kubenswrapper[4733]: I1206 
06:07:28.993797 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/833ce9dd-3791-4da1-9f16-fb8db6d4c205-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-t7mgk\" (UID: \"833ce9dd-3791-4da1-9f16-fb8db6d4c205\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-t7mgk" Dec 06 06:07:29 crc kubenswrapper[4733]: I1206 06:07:29.095068 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plg46\" (UniqueName: \"kubernetes.io/projected/833ce9dd-3791-4da1-9f16-fb8db6d4c205-kube-api-access-plg46\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-t7mgk\" (UID: \"833ce9dd-3791-4da1-9f16-fb8db6d4c205\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-t7mgk" Dec 06 06:07:29 crc kubenswrapper[4733]: I1206 06:07:29.095139 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/833ce9dd-3791-4da1-9f16-fb8db6d4c205-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-t7mgk\" (UID: \"833ce9dd-3791-4da1-9f16-fb8db6d4c205\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-t7mgk" Dec 06 06:07:29 crc kubenswrapper[4733]: I1206 06:07:29.095226 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/833ce9dd-3791-4da1-9f16-fb8db6d4c205-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-t7mgk\" (UID: \"833ce9dd-3791-4da1-9f16-fb8db6d4c205\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-t7mgk" Dec 06 06:07:29 crc kubenswrapper[4733]: I1206 06:07:29.102318 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/833ce9dd-3791-4da1-9f16-fb8db6d4c205-inventory\") pod 
\"download-cache-edpm-deployment-openstack-edpm-ipam-t7mgk\" (UID: \"833ce9dd-3791-4da1-9f16-fb8db6d4c205\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-t7mgk" Dec 06 06:07:29 crc kubenswrapper[4733]: I1206 06:07:29.104534 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/833ce9dd-3791-4da1-9f16-fb8db6d4c205-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-t7mgk\" (UID: \"833ce9dd-3791-4da1-9f16-fb8db6d4c205\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-t7mgk" Dec 06 06:07:29 crc kubenswrapper[4733]: I1206 06:07:29.116726 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plg46\" (UniqueName: \"kubernetes.io/projected/833ce9dd-3791-4da1-9f16-fb8db6d4c205-kube-api-access-plg46\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-t7mgk\" (UID: \"833ce9dd-3791-4da1-9f16-fb8db6d4c205\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-t7mgk" Dec 06 06:07:29 crc kubenswrapper[4733]: I1206 06:07:29.262351 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-t7mgk" Dec 06 06:07:29 crc kubenswrapper[4733]: I1206 06:07:29.734204 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-t7mgk"] Dec 06 06:07:29 crc kubenswrapper[4733]: I1206 06:07:29.884909 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-t7mgk" event={"ID":"833ce9dd-3791-4da1-9f16-fb8db6d4c205","Type":"ContainerStarted","Data":"16081a58243206ba6a96b010751b6b0d7771a24c8e51eff32123511bd60b4821"} Dec 06 06:07:30 crc kubenswrapper[4733]: I1206 06:07:30.898395 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-t7mgk" event={"ID":"833ce9dd-3791-4da1-9f16-fb8db6d4c205","Type":"ContainerStarted","Data":"4e4984f24eec946ab22cc61278c48ebcb9636ddc7265373c10b7cde7d0f41b5a"} Dec 06 06:07:30 crc kubenswrapper[4733]: I1206 06:07:30.928551 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-t7mgk" podStartSLOduration=2.38905405 podStartE2EDuration="2.928520732s" podCreationTimestamp="2025-12-06 06:07:28 +0000 UTC" firstStartedPulling="2025-12-06 06:07:29.741695259 +0000 UTC m=+1433.606906369" lastFinishedPulling="2025-12-06 06:07:30.28116194 +0000 UTC m=+1434.146373051" observedRunningTime="2025-12-06 06:07:30.918357642 +0000 UTC m=+1434.783568753" watchObservedRunningTime="2025-12-06 06:07:30.928520732 +0000 UTC m=+1434.793731843" Dec 06 06:07:42 crc kubenswrapper[4733]: I1206 06:07:42.989103 4733 patch_prober.go:28] interesting pod/machine-config-daemon-g7qjx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 06:07:42 
crc kubenswrapper[4733]: I1206 06:07:42.989810 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 06:08:12 crc kubenswrapper[4733]: I1206 06:08:12.989063 4733 patch_prober.go:28] interesting pod/machine-config-daemon-g7qjx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 06:08:12 crc kubenswrapper[4733]: I1206 06:08:12.989804 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 06:08:12 crc kubenswrapper[4733]: I1206 06:08:12.989864 4733 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" Dec 06 06:08:12 crc kubenswrapper[4733]: I1206 06:08:12.990895 4733 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e047d5177f0f7fa0184a2c2e021a17064100f39c60892814195f10cb7a9620d9"} pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 06:08:12 crc kubenswrapper[4733]: I1206 06:08:12.990971 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" 
podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerName="machine-config-daemon" containerID="cri-o://e047d5177f0f7fa0184a2c2e021a17064100f39c60892814195f10cb7a9620d9" gracePeriod=600 Dec 06 06:08:13 crc kubenswrapper[4733]: I1206 06:08:13.325981 4733 generic.go:334] "Generic (PLEG): container finished" podID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerID="e047d5177f0f7fa0184a2c2e021a17064100f39c60892814195f10cb7a9620d9" exitCode=0 Dec 06 06:08:13 crc kubenswrapper[4733]: I1206 06:08:13.326044 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" event={"ID":"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448","Type":"ContainerDied","Data":"e047d5177f0f7fa0184a2c2e021a17064100f39c60892814195f10cb7a9620d9"} Dec 06 06:08:13 crc kubenswrapper[4733]: I1206 06:08:13.326278 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" event={"ID":"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448","Type":"ContainerStarted","Data":"95a5206d8977bdf896771b7495437ec92f1082c61e752b99e7ba75dda3bd2a35"} Dec 06 06:08:13 crc kubenswrapper[4733]: I1206 06:08:13.326300 4733 scope.go:117] "RemoveContainer" containerID="8b9c1cf2e876683db3ffd82323d6fa24cf8a792c4bcd35bbf88bc00105165298" Dec 06 06:08:43 crc kubenswrapper[4733]: I1206 06:08:43.041017 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-g4cdr"] Dec 06 06:08:43 crc kubenswrapper[4733]: I1206 06:08:43.050761 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-7xlbm"] Dec 06 06:08:43 crc kubenswrapper[4733]: I1206 06:08:43.064564 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-153e-account-create-update-lxx5v"] Dec 06 06:08:43 crc kubenswrapper[4733]: I1206 06:08:43.069549 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-9d84-account-create-update-zfm9s"] Dec 06 06:08:43 crc 
kubenswrapper[4733]: I1206 06:08:43.074367 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-77pt9"] Dec 06 06:08:43 crc kubenswrapper[4733]: I1206 06:08:43.079121 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-cd13-account-create-update-g68t4"] Dec 06 06:08:43 crc kubenswrapper[4733]: I1206 06:08:43.083848 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-g4cdr"] Dec 06 06:08:43 crc kubenswrapper[4733]: I1206 06:08:43.088494 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-cd13-account-create-update-g68t4"] Dec 06 06:08:43 crc kubenswrapper[4733]: I1206 06:08:43.093151 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-7xlbm"] Dec 06 06:08:43 crc kubenswrapper[4733]: I1206 06:08:43.097778 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-9d84-account-create-update-zfm9s"] Dec 06 06:08:43 crc kubenswrapper[4733]: I1206 06:08:43.102379 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-77pt9"] Dec 06 06:08:43 crc kubenswrapper[4733]: I1206 06:08:43.107339 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-153e-account-create-update-lxx5v"] Dec 06 06:08:44 crc kubenswrapper[4733]: I1206 06:08:44.497280 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0701e468-3c5a-4812-af28-36d85baf6756" path="/var/lib/kubelet/pods/0701e468-3c5a-4812-af28-36d85baf6756/volumes" Dec 06 06:08:44 crc kubenswrapper[4733]: I1206 06:08:44.498189 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16b6231d-2829-46b8-b8cf-ce76dc4ad424" path="/var/lib/kubelet/pods/16b6231d-2829-46b8-b8cf-ce76dc4ad424/volumes" Dec 06 06:08:44 crc kubenswrapper[4733]: I1206 06:08:44.498881 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68d41a55-121b-462f-8c49-a014cbcd5cd5" 
path="/var/lib/kubelet/pods/68d41a55-121b-462f-8c49-a014cbcd5cd5/volumes" Dec 06 06:08:44 crc kubenswrapper[4733]: I1206 06:08:44.499467 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91584c47-2491-4388-9ce4-e76d4ef92afd" path="/var/lib/kubelet/pods/91584c47-2491-4388-9ce4-e76d4ef92afd/volumes" Dec 06 06:08:44 crc kubenswrapper[4733]: I1206 06:08:44.500535 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2f6ce07-95b4-4dcc-8cc1-8847d0c70f9d" path="/var/lib/kubelet/pods/b2f6ce07-95b4-4dcc-8cc1-8847d0c70f9d/volumes" Dec 06 06:08:44 crc kubenswrapper[4733]: I1206 06:08:44.501107 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd259bcc-2452-4b18-9019-072e545719bf" path="/var/lib/kubelet/pods/fd259bcc-2452-4b18-9019-072e545719bf/volumes" Dec 06 06:08:48 crc kubenswrapper[4733]: I1206 06:08:48.610481 4733 scope.go:117] "RemoveContainer" containerID="1a1d9a7f0deb240eeeddb3c50498ef2202f3e7eba15a8889da39b3f5f0e7caf1" Dec 06 06:08:48 crc kubenswrapper[4733]: I1206 06:08:48.649985 4733 scope.go:117] "RemoveContainer" containerID="0ed8e17045aa28eae3f7b4f5e8eb3378c0358f36a4330b15166204d65a506bb4" Dec 06 06:08:48 crc kubenswrapper[4733]: I1206 06:08:48.691631 4733 scope.go:117] "RemoveContainer" containerID="8ce10c54a57e033ee52d5cd20070d37386e094dfadad2b56902a77110616ed37" Dec 06 06:08:48 crc kubenswrapper[4733]: I1206 06:08:48.710599 4733 scope.go:117] "RemoveContainer" containerID="3631646b12d0e5b83aca92579f0c2e946bd2012199d2d213583dcfa1bd32f6c5" Dec 06 06:08:48 crc kubenswrapper[4733]: I1206 06:08:48.738783 4733 scope.go:117] "RemoveContainer" containerID="876b8c19ec859d03cc4c3f857df5ac051f8f38f1f471c4bc1a45c894a6014705" Dec 06 06:08:48 crc kubenswrapper[4733]: I1206 06:08:48.770724 4733 scope.go:117] "RemoveContainer" containerID="3d6523b3c212b9ad17e44a8a34695fdc22830391265037fe6baf77256a0a27a3" Dec 06 06:09:14 crc kubenswrapper[4733]: I1206 06:09:14.046352 4733 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/barbican-db-create-8hnm2"] Dec 06 06:09:14 crc kubenswrapper[4733]: I1206 06:09:14.057590 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-z9w82"] Dec 06 06:09:14 crc kubenswrapper[4733]: I1206 06:09:14.072520 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-f762-account-create-update-phdmt"] Dec 06 06:09:14 crc kubenswrapper[4733]: I1206 06:09:14.078570 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-781a-account-create-update-9g84w"] Dec 06 06:09:14 crc kubenswrapper[4733]: I1206 06:09:14.084012 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-7gpvs"] Dec 06 06:09:14 crc kubenswrapper[4733]: I1206 06:09:14.088951 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-73db-account-create-update-ft7zq"] Dec 06 06:09:14 crc kubenswrapper[4733]: I1206 06:09:14.093996 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-781a-account-create-update-9g84w"] Dec 06 06:09:14 crc kubenswrapper[4733]: I1206 06:09:14.098998 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-z9w82"] Dec 06 06:09:14 crc kubenswrapper[4733]: I1206 06:09:14.103942 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-f762-account-create-update-phdmt"] Dec 06 06:09:14 crc kubenswrapper[4733]: I1206 06:09:14.108780 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-7gpvs"] Dec 06 06:09:14 crc kubenswrapper[4733]: I1206 06:09:14.113936 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-73db-account-create-update-ft7zq"] Dec 06 06:09:14 crc kubenswrapper[4733]: I1206 06:09:14.119398 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-8hnm2"] Dec 06 06:09:14 crc kubenswrapper[4733]: I1206 06:09:14.497182 4733 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c64b364-21c9-4fd9-a392-18b9ea6661fb" path="/var/lib/kubelet/pods/0c64b364-21c9-4fd9-a392-18b9ea6661fb/volumes" Dec 06 06:09:14 crc kubenswrapper[4733]: I1206 06:09:14.497881 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1aaa5a5a-871e-4022-9fea-20d5541424bf" path="/var/lib/kubelet/pods/1aaa5a5a-871e-4022-9fea-20d5541424bf/volumes" Dec 06 06:09:14 crc kubenswrapper[4733]: I1206 06:09:14.498471 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31a0a2a8-5536-47b7-9f7c-2eef7e453214" path="/var/lib/kubelet/pods/31a0a2a8-5536-47b7-9f7c-2eef7e453214/volumes" Dec 06 06:09:14 crc kubenswrapper[4733]: I1206 06:09:14.499087 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8894ddef-6b07-452a-83fd-5d6383f280c2" path="/var/lib/kubelet/pods/8894ddef-6b07-452a-83fd-5d6383f280c2/volumes" Dec 06 06:09:14 crc kubenswrapper[4733]: I1206 06:09:14.500145 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d928009b-3db5-492c-8d4e-375ca54b6f8b" path="/var/lib/kubelet/pods/d928009b-3db5-492c-8d4e-375ca54b6f8b/volumes" Dec 06 06:09:14 crc kubenswrapper[4733]: I1206 06:09:14.500721 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dab6d968-0d66-4b87-898b-d17fb0741af2" path="/var/lib/kubelet/pods/dab6d968-0d66-4b87-898b-d17fb0741af2/volumes" Dec 06 06:09:23 crc kubenswrapper[4733]: I1206 06:09:23.033643 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-vrwfc"] Dec 06 06:09:23 crc kubenswrapper[4733]: I1206 06:09:23.040856 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-vrwfc"] Dec 06 06:09:24 crc kubenswrapper[4733]: I1206 06:09:24.495079 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1390ae4e-536b-4861-8591-fd8656976c12" path="/var/lib/kubelet/pods/1390ae4e-536b-4861-8591-fd8656976c12/volumes" 
Dec 06 06:09:25 crc kubenswrapper[4733]: I1206 06:09:25.657908 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zfbp6"] Dec 06 06:09:25 crc kubenswrapper[4733]: I1206 06:09:25.676834 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zfbp6"] Dec 06 06:09:25 crc kubenswrapper[4733]: I1206 06:09:25.679744 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zfbp6" Dec 06 06:09:25 crc kubenswrapper[4733]: I1206 06:09:25.728704 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fe10064-23b6-4b35-bcff-1f41e1776921-catalog-content\") pod \"redhat-marketplace-zfbp6\" (UID: \"6fe10064-23b6-4b35-bcff-1f41e1776921\") " pod="openshift-marketplace/redhat-marketplace-zfbp6" Dec 06 06:09:25 crc kubenswrapper[4733]: I1206 06:09:25.728909 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ssv2\" (UniqueName: \"kubernetes.io/projected/6fe10064-23b6-4b35-bcff-1f41e1776921-kube-api-access-8ssv2\") pod \"redhat-marketplace-zfbp6\" (UID: \"6fe10064-23b6-4b35-bcff-1f41e1776921\") " pod="openshift-marketplace/redhat-marketplace-zfbp6" Dec 06 06:09:25 crc kubenswrapper[4733]: I1206 06:09:25.729127 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fe10064-23b6-4b35-bcff-1f41e1776921-utilities\") pod \"redhat-marketplace-zfbp6\" (UID: \"6fe10064-23b6-4b35-bcff-1f41e1776921\") " pod="openshift-marketplace/redhat-marketplace-zfbp6" Dec 06 06:09:25 crc kubenswrapper[4733]: I1206 06:09:25.830620 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ssv2\" (UniqueName: 
\"kubernetes.io/projected/6fe10064-23b6-4b35-bcff-1f41e1776921-kube-api-access-8ssv2\") pod \"redhat-marketplace-zfbp6\" (UID: \"6fe10064-23b6-4b35-bcff-1f41e1776921\") " pod="openshift-marketplace/redhat-marketplace-zfbp6" Dec 06 06:09:25 crc kubenswrapper[4733]: I1206 06:09:25.830757 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fe10064-23b6-4b35-bcff-1f41e1776921-utilities\") pod \"redhat-marketplace-zfbp6\" (UID: \"6fe10064-23b6-4b35-bcff-1f41e1776921\") " pod="openshift-marketplace/redhat-marketplace-zfbp6" Dec 06 06:09:25 crc kubenswrapper[4733]: I1206 06:09:25.830947 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fe10064-23b6-4b35-bcff-1f41e1776921-catalog-content\") pod \"redhat-marketplace-zfbp6\" (UID: \"6fe10064-23b6-4b35-bcff-1f41e1776921\") " pod="openshift-marketplace/redhat-marketplace-zfbp6" Dec 06 06:09:25 crc kubenswrapper[4733]: I1206 06:09:25.831323 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fe10064-23b6-4b35-bcff-1f41e1776921-utilities\") pod \"redhat-marketplace-zfbp6\" (UID: \"6fe10064-23b6-4b35-bcff-1f41e1776921\") " pod="openshift-marketplace/redhat-marketplace-zfbp6" Dec 06 06:09:25 crc kubenswrapper[4733]: I1206 06:09:25.831391 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fe10064-23b6-4b35-bcff-1f41e1776921-catalog-content\") pod \"redhat-marketplace-zfbp6\" (UID: \"6fe10064-23b6-4b35-bcff-1f41e1776921\") " pod="openshift-marketplace/redhat-marketplace-zfbp6" Dec 06 06:09:25 crc kubenswrapper[4733]: I1206 06:09:25.850402 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ssv2\" (UniqueName: 
\"kubernetes.io/projected/6fe10064-23b6-4b35-bcff-1f41e1776921-kube-api-access-8ssv2\") pod \"redhat-marketplace-zfbp6\" (UID: \"6fe10064-23b6-4b35-bcff-1f41e1776921\") " pod="openshift-marketplace/redhat-marketplace-zfbp6" Dec 06 06:09:26 crc kubenswrapper[4733]: I1206 06:09:26.003827 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zfbp6" Dec 06 06:09:26 crc kubenswrapper[4733]: I1206 06:09:26.450678 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zfbp6"] Dec 06 06:09:27 crc kubenswrapper[4733]: I1206 06:09:27.078868 4733 generic.go:334] "Generic (PLEG): container finished" podID="6fe10064-23b6-4b35-bcff-1f41e1776921" containerID="a3bf30ec725519dd8e6c4324e0adacf74a1ff496aa1f760d7d49b22affc52b53" exitCode=0 Dec 06 06:09:27 crc kubenswrapper[4733]: I1206 06:09:27.078966 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zfbp6" event={"ID":"6fe10064-23b6-4b35-bcff-1f41e1776921","Type":"ContainerDied","Data":"a3bf30ec725519dd8e6c4324e0adacf74a1ff496aa1f760d7d49b22affc52b53"} Dec 06 06:09:27 crc kubenswrapper[4733]: I1206 06:09:27.079242 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zfbp6" event={"ID":"6fe10064-23b6-4b35-bcff-1f41e1776921","Type":"ContainerStarted","Data":"2bd9150547e5fa427f40b1c109c6261cef08a17502ba89332a007db01c1267b6"} Dec 06 06:09:27 crc kubenswrapper[4733]: I1206 06:09:27.082182 4733 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 06 06:09:28 crc kubenswrapper[4733]: I1206 06:09:28.088409 4733 generic.go:334] "Generic (PLEG): container finished" podID="6fe10064-23b6-4b35-bcff-1f41e1776921" containerID="0bdcf28d9c7953a72e1e59a275c959d22e53cc02871be04f27fb64397f9f4634" exitCode=0 Dec 06 06:09:28 crc kubenswrapper[4733]: I1206 06:09:28.088472 4733 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zfbp6" event={"ID":"6fe10064-23b6-4b35-bcff-1f41e1776921","Type":"ContainerDied","Data":"0bdcf28d9c7953a72e1e59a275c959d22e53cc02871be04f27fb64397f9f4634"} Dec 06 06:09:29 crc kubenswrapper[4733]: I1206 06:09:29.097225 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zfbp6" event={"ID":"6fe10064-23b6-4b35-bcff-1f41e1776921","Type":"ContainerStarted","Data":"24037252abcd56ff4c390959eefce25c6a2ce39310fe4cb31c2759c6ec8dd0f6"} Dec 06 06:09:29 crc kubenswrapper[4733]: I1206 06:09:29.113132 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zfbp6" podStartSLOduration=2.60745586 podStartE2EDuration="4.113115499s" podCreationTimestamp="2025-12-06 06:09:25 +0000 UTC" firstStartedPulling="2025-12-06 06:09:27.081949336 +0000 UTC m=+1550.947160448" lastFinishedPulling="2025-12-06 06:09:28.587608976 +0000 UTC m=+1552.452820087" observedRunningTime="2025-12-06 06:09:29.109688114 +0000 UTC m=+1552.974899225" watchObservedRunningTime="2025-12-06 06:09:29.113115499 +0000 UTC m=+1552.978326610" Dec 06 06:09:36 crc kubenswrapper[4733]: I1206 06:09:36.003923 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zfbp6" Dec 06 06:09:36 crc kubenswrapper[4733]: I1206 06:09:36.004650 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zfbp6" Dec 06 06:09:36 crc kubenswrapper[4733]: I1206 06:09:36.044826 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zfbp6" Dec 06 06:09:36 crc kubenswrapper[4733]: I1206 06:09:36.221912 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zfbp6" Dec 06 06:09:36 crc kubenswrapper[4733]: I1206 
06:09:36.283492 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zfbp6"] Dec 06 06:09:37 crc kubenswrapper[4733]: I1206 06:09:37.038565 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-zz9tl"] Dec 06 06:09:37 crc kubenswrapper[4733]: I1206 06:09:37.048145 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-zz9tl"] Dec 06 06:09:38 crc kubenswrapper[4733]: I1206 06:09:38.194079 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zfbp6" podUID="6fe10064-23b6-4b35-bcff-1f41e1776921" containerName="registry-server" containerID="cri-o://24037252abcd56ff4c390959eefce25c6a2ce39310fe4cb31c2759c6ec8dd0f6" gracePeriod=2 Dec 06 06:09:38 crc kubenswrapper[4733]: I1206 06:09:38.499767 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc8e93b6-7230-41f1-98f5-18b252d0d724" path="/var/lib/kubelet/pods/bc8e93b6-7230-41f1-98f5-18b252d0d724/volumes" Dec 06 06:09:38 crc kubenswrapper[4733]: I1206 06:09:38.627218 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zfbp6" Dec 06 06:09:38 crc kubenswrapper[4733]: I1206 06:09:38.681612 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fe10064-23b6-4b35-bcff-1f41e1776921-utilities\") pod \"6fe10064-23b6-4b35-bcff-1f41e1776921\" (UID: \"6fe10064-23b6-4b35-bcff-1f41e1776921\") " Dec 06 06:09:38 crc kubenswrapper[4733]: I1206 06:09:38.681697 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fe10064-23b6-4b35-bcff-1f41e1776921-catalog-content\") pod \"6fe10064-23b6-4b35-bcff-1f41e1776921\" (UID: \"6fe10064-23b6-4b35-bcff-1f41e1776921\") " Dec 06 06:09:38 crc kubenswrapper[4733]: I1206 06:09:38.681811 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ssv2\" (UniqueName: \"kubernetes.io/projected/6fe10064-23b6-4b35-bcff-1f41e1776921-kube-api-access-8ssv2\") pod \"6fe10064-23b6-4b35-bcff-1f41e1776921\" (UID: \"6fe10064-23b6-4b35-bcff-1f41e1776921\") " Dec 06 06:09:38 crc kubenswrapper[4733]: I1206 06:09:38.682807 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fe10064-23b6-4b35-bcff-1f41e1776921-utilities" (OuterVolumeSpecName: "utilities") pod "6fe10064-23b6-4b35-bcff-1f41e1776921" (UID: "6fe10064-23b6-4b35-bcff-1f41e1776921"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:09:38 crc kubenswrapper[4733]: I1206 06:09:38.687727 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fe10064-23b6-4b35-bcff-1f41e1776921-kube-api-access-8ssv2" (OuterVolumeSpecName: "kube-api-access-8ssv2") pod "6fe10064-23b6-4b35-bcff-1f41e1776921" (UID: "6fe10064-23b6-4b35-bcff-1f41e1776921"). InnerVolumeSpecName "kube-api-access-8ssv2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:09:38 crc kubenswrapper[4733]: I1206 06:09:38.696145 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fe10064-23b6-4b35-bcff-1f41e1776921-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6fe10064-23b6-4b35-bcff-1f41e1776921" (UID: "6fe10064-23b6-4b35-bcff-1f41e1776921"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:09:38 crc kubenswrapper[4733]: I1206 06:09:38.785918 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ssv2\" (UniqueName: \"kubernetes.io/projected/6fe10064-23b6-4b35-bcff-1f41e1776921-kube-api-access-8ssv2\") on node \"crc\" DevicePath \"\"" Dec 06 06:09:38 crc kubenswrapper[4733]: I1206 06:09:38.785952 4733 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fe10064-23b6-4b35-bcff-1f41e1776921-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 06:09:38 crc kubenswrapper[4733]: I1206 06:09:38.785963 4733 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fe10064-23b6-4b35-bcff-1f41e1776921-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 06:09:39 crc kubenswrapper[4733]: I1206 06:09:39.031717 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-cqh6m"] Dec 06 06:09:39 crc kubenswrapper[4733]: I1206 06:09:39.039538 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-cqh6m"] Dec 06 06:09:39 crc kubenswrapper[4733]: I1206 06:09:39.206288 4733 generic.go:334] "Generic (PLEG): container finished" podID="6fe10064-23b6-4b35-bcff-1f41e1776921" containerID="24037252abcd56ff4c390959eefce25c6a2ce39310fe4cb31c2759c6ec8dd0f6" exitCode=0 Dec 06 06:09:39 crc kubenswrapper[4733]: I1206 06:09:39.206342 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-zfbp6" event={"ID":"6fe10064-23b6-4b35-bcff-1f41e1776921","Type":"ContainerDied","Data":"24037252abcd56ff4c390959eefce25c6a2ce39310fe4cb31c2759c6ec8dd0f6"} Dec 06 06:09:39 crc kubenswrapper[4733]: I1206 06:09:39.206376 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zfbp6" Dec 06 06:09:39 crc kubenswrapper[4733]: I1206 06:09:39.206984 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zfbp6" event={"ID":"6fe10064-23b6-4b35-bcff-1f41e1776921","Type":"ContainerDied","Data":"2bd9150547e5fa427f40b1c109c6261cef08a17502ba89332a007db01c1267b6"} Dec 06 06:09:39 crc kubenswrapper[4733]: I1206 06:09:39.207031 4733 scope.go:117] "RemoveContainer" containerID="24037252abcd56ff4c390959eefce25c6a2ce39310fe4cb31c2759c6ec8dd0f6" Dec 06 06:09:39 crc kubenswrapper[4733]: I1206 06:09:39.224524 4733 scope.go:117] "RemoveContainer" containerID="0bdcf28d9c7953a72e1e59a275c959d22e53cc02871be04f27fb64397f9f4634" Dec 06 06:09:39 crc kubenswrapper[4733]: I1206 06:09:39.241683 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zfbp6"] Dec 06 06:09:39 crc kubenswrapper[4733]: I1206 06:09:39.258449 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zfbp6"] Dec 06 06:09:39 crc kubenswrapper[4733]: I1206 06:09:39.270882 4733 scope.go:117] "RemoveContainer" containerID="a3bf30ec725519dd8e6c4324e0adacf74a1ff496aa1f760d7d49b22affc52b53" Dec 06 06:09:39 crc kubenswrapper[4733]: I1206 06:09:39.293621 4733 scope.go:117] "RemoveContainer" containerID="24037252abcd56ff4c390959eefce25c6a2ce39310fe4cb31c2759c6ec8dd0f6" Dec 06 06:09:39 crc kubenswrapper[4733]: E1206 06:09:39.294158 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"24037252abcd56ff4c390959eefce25c6a2ce39310fe4cb31c2759c6ec8dd0f6\": container with ID starting with 24037252abcd56ff4c390959eefce25c6a2ce39310fe4cb31c2759c6ec8dd0f6 not found: ID does not exist" containerID="24037252abcd56ff4c390959eefce25c6a2ce39310fe4cb31c2759c6ec8dd0f6" Dec 06 06:09:39 crc kubenswrapper[4733]: I1206 06:09:39.294223 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24037252abcd56ff4c390959eefce25c6a2ce39310fe4cb31c2759c6ec8dd0f6"} err="failed to get container status \"24037252abcd56ff4c390959eefce25c6a2ce39310fe4cb31c2759c6ec8dd0f6\": rpc error: code = NotFound desc = could not find container \"24037252abcd56ff4c390959eefce25c6a2ce39310fe4cb31c2759c6ec8dd0f6\": container with ID starting with 24037252abcd56ff4c390959eefce25c6a2ce39310fe4cb31c2759c6ec8dd0f6 not found: ID does not exist" Dec 06 06:09:39 crc kubenswrapper[4733]: I1206 06:09:39.294264 4733 scope.go:117] "RemoveContainer" containerID="0bdcf28d9c7953a72e1e59a275c959d22e53cc02871be04f27fb64397f9f4634" Dec 06 06:09:39 crc kubenswrapper[4733]: E1206 06:09:39.294886 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bdcf28d9c7953a72e1e59a275c959d22e53cc02871be04f27fb64397f9f4634\": container with ID starting with 0bdcf28d9c7953a72e1e59a275c959d22e53cc02871be04f27fb64397f9f4634 not found: ID does not exist" containerID="0bdcf28d9c7953a72e1e59a275c959d22e53cc02871be04f27fb64397f9f4634" Dec 06 06:09:39 crc kubenswrapper[4733]: I1206 06:09:39.294933 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bdcf28d9c7953a72e1e59a275c959d22e53cc02871be04f27fb64397f9f4634"} err="failed to get container status \"0bdcf28d9c7953a72e1e59a275c959d22e53cc02871be04f27fb64397f9f4634\": rpc error: code = NotFound desc = could not find container \"0bdcf28d9c7953a72e1e59a275c959d22e53cc02871be04f27fb64397f9f4634\": container with ID 
starting with 0bdcf28d9c7953a72e1e59a275c959d22e53cc02871be04f27fb64397f9f4634 not found: ID does not exist" Dec 06 06:09:39 crc kubenswrapper[4733]: I1206 06:09:39.294966 4733 scope.go:117] "RemoveContainer" containerID="a3bf30ec725519dd8e6c4324e0adacf74a1ff496aa1f760d7d49b22affc52b53" Dec 06 06:09:39 crc kubenswrapper[4733]: E1206 06:09:39.295486 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3bf30ec725519dd8e6c4324e0adacf74a1ff496aa1f760d7d49b22affc52b53\": container with ID starting with a3bf30ec725519dd8e6c4324e0adacf74a1ff496aa1f760d7d49b22affc52b53 not found: ID does not exist" containerID="a3bf30ec725519dd8e6c4324e0adacf74a1ff496aa1f760d7d49b22affc52b53" Dec 06 06:09:39 crc kubenswrapper[4733]: I1206 06:09:39.295512 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3bf30ec725519dd8e6c4324e0adacf74a1ff496aa1f760d7d49b22affc52b53"} err="failed to get container status \"a3bf30ec725519dd8e6c4324e0adacf74a1ff496aa1f760d7d49b22affc52b53\": rpc error: code = NotFound desc = could not find container \"a3bf30ec725519dd8e6c4324e0adacf74a1ff496aa1f760d7d49b22affc52b53\": container with ID starting with a3bf30ec725519dd8e6c4324e0adacf74a1ff496aa1f760d7d49b22affc52b53 not found: ID does not exist" Dec 06 06:09:40 crc kubenswrapper[4733]: I1206 06:09:40.497224 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fe10064-23b6-4b35-bcff-1f41e1776921" path="/var/lib/kubelet/pods/6fe10064-23b6-4b35-bcff-1f41e1776921/volumes" Dec 06 06:09:40 crc kubenswrapper[4733]: I1206 06:09:40.498490 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94621b3f-341c-4c06-8530-91d11c1ad8dc" path="/var/lib/kubelet/pods/94621b3f-341c-4c06-8530-91d11c1ad8dc/volumes" Dec 06 06:09:44 crc kubenswrapper[4733]: I1206 06:09:44.169405 4733 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-cv6cs"] Dec 06 06:09:44 crc kubenswrapper[4733]: E1206 06:09:44.170337 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fe10064-23b6-4b35-bcff-1f41e1776921" containerName="extract-content" Dec 06 06:09:44 crc kubenswrapper[4733]: I1206 06:09:44.170352 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fe10064-23b6-4b35-bcff-1f41e1776921" containerName="extract-content" Dec 06 06:09:44 crc kubenswrapper[4733]: E1206 06:09:44.170390 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fe10064-23b6-4b35-bcff-1f41e1776921" containerName="extract-utilities" Dec 06 06:09:44 crc kubenswrapper[4733]: I1206 06:09:44.170396 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fe10064-23b6-4b35-bcff-1f41e1776921" containerName="extract-utilities" Dec 06 06:09:44 crc kubenswrapper[4733]: E1206 06:09:44.170405 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fe10064-23b6-4b35-bcff-1f41e1776921" containerName="registry-server" Dec 06 06:09:44 crc kubenswrapper[4733]: I1206 06:09:44.170410 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fe10064-23b6-4b35-bcff-1f41e1776921" containerName="registry-server" Dec 06 06:09:44 crc kubenswrapper[4733]: I1206 06:09:44.170643 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fe10064-23b6-4b35-bcff-1f41e1776921" containerName="registry-server" Dec 06 06:09:44 crc kubenswrapper[4733]: I1206 06:09:44.171988 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cv6cs" Dec 06 06:09:44 crc kubenswrapper[4733]: I1206 06:09:44.179570 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cv6cs"] Dec 06 06:09:44 crc kubenswrapper[4733]: I1206 06:09:44.199968 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65e92cbb-17e3-4f63-a6e2-237c0482e87b-catalog-content\") pod \"certified-operators-cv6cs\" (UID: \"65e92cbb-17e3-4f63-a6e2-237c0482e87b\") " pod="openshift-marketplace/certified-operators-cv6cs" Dec 06 06:09:44 crc kubenswrapper[4733]: I1206 06:09:44.200040 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65e92cbb-17e3-4f63-a6e2-237c0482e87b-utilities\") pod \"certified-operators-cv6cs\" (UID: \"65e92cbb-17e3-4f63-a6e2-237c0482e87b\") " pod="openshift-marketplace/certified-operators-cv6cs" Dec 06 06:09:44 crc kubenswrapper[4733]: I1206 06:09:44.200152 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8497p\" (UniqueName: \"kubernetes.io/projected/65e92cbb-17e3-4f63-a6e2-237c0482e87b-kube-api-access-8497p\") pod \"certified-operators-cv6cs\" (UID: \"65e92cbb-17e3-4f63-a6e2-237c0482e87b\") " pod="openshift-marketplace/certified-operators-cv6cs" Dec 06 06:09:44 crc kubenswrapper[4733]: I1206 06:09:44.301949 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65e92cbb-17e3-4f63-a6e2-237c0482e87b-catalog-content\") pod \"certified-operators-cv6cs\" (UID: \"65e92cbb-17e3-4f63-a6e2-237c0482e87b\") " pod="openshift-marketplace/certified-operators-cv6cs" Dec 06 06:09:44 crc kubenswrapper[4733]: I1206 06:09:44.302061 4733 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65e92cbb-17e3-4f63-a6e2-237c0482e87b-utilities\") pod \"certified-operators-cv6cs\" (UID: \"65e92cbb-17e3-4f63-a6e2-237c0482e87b\") " pod="openshift-marketplace/certified-operators-cv6cs" Dec 06 06:09:44 crc kubenswrapper[4733]: I1206 06:09:44.302192 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8497p\" (UniqueName: \"kubernetes.io/projected/65e92cbb-17e3-4f63-a6e2-237c0482e87b-kube-api-access-8497p\") pod \"certified-operators-cv6cs\" (UID: \"65e92cbb-17e3-4f63-a6e2-237c0482e87b\") " pod="openshift-marketplace/certified-operators-cv6cs" Dec 06 06:09:44 crc kubenswrapper[4733]: I1206 06:09:44.302461 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65e92cbb-17e3-4f63-a6e2-237c0482e87b-catalog-content\") pod \"certified-operators-cv6cs\" (UID: \"65e92cbb-17e3-4f63-a6e2-237c0482e87b\") " pod="openshift-marketplace/certified-operators-cv6cs" Dec 06 06:09:44 crc kubenswrapper[4733]: I1206 06:09:44.302536 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65e92cbb-17e3-4f63-a6e2-237c0482e87b-utilities\") pod \"certified-operators-cv6cs\" (UID: \"65e92cbb-17e3-4f63-a6e2-237c0482e87b\") " pod="openshift-marketplace/certified-operators-cv6cs" Dec 06 06:09:44 crc kubenswrapper[4733]: I1206 06:09:44.319457 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8497p\" (UniqueName: \"kubernetes.io/projected/65e92cbb-17e3-4f63-a6e2-237c0482e87b-kube-api-access-8497p\") pod \"certified-operators-cv6cs\" (UID: \"65e92cbb-17e3-4f63-a6e2-237c0482e87b\") " pod="openshift-marketplace/certified-operators-cv6cs" Dec 06 06:09:44 crc kubenswrapper[4733]: I1206 06:09:44.491224 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cv6cs" Dec 06 06:09:44 crc kubenswrapper[4733]: I1206 06:09:44.935733 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cv6cs"] Dec 06 06:09:45 crc kubenswrapper[4733]: I1206 06:09:45.273185 4733 generic.go:334] "Generic (PLEG): container finished" podID="65e92cbb-17e3-4f63-a6e2-237c0482e87b" containerID="7b800de36d903151cbbfe3cc8283c1301ad98f9f52a4f30fec8f5cd7a1674ff8" exitCode=0 Dec 06 06:09:45 crc kubenswrapper[4733]: I1206 06:09:45.273288 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cv6cs" event={"ID":"65e92cbb-17e3-4f63-a6e2-237c0482e87b","Type":"ContainerDied","Data":"7b800de36d903151cbbfe3cc8283c1301ad98f9f52a4f30fec8f5cd7a1674ff8"} Dec 06 06:09:45 crc kubenswrapper[4733]: I1206 06:09:45.273634 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cv6cs" event={"ID":"65e92cbb-17e3-4f63-a6e2-237c0482e87b","Type":"ContainerStarted","Data":"8645fc30068c19937ea7e6ad12745d9919890c8ab4c728e68530086ae7ff3689"} Dec 06 06:09:46 crc kubenswrapper[4733]: I1206 06:09:46.287910 4733 generic.go:334] "Generic (PLEG): container finished" podID="65e92cbb-17e3-4f63-a6e2-237c0482e87b" containerID="fb13a07d188e2b059a4cb710c009c8c069db289646a815e25755686df605a7ff" exitCode=0 Dec 06 06:09:46 crc kubenswrapper[4733]: I1206 06:09:46.287978 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cv6cs" event={"ID":"65e92cbb-17e3-4f63-a6e2-237c0482e87b","Type":"ContainerDied","Data":"fb13a07d188e2b059a4cb710c009c8c069db289646a815e25755686df605a7ff"} Dec 06 06:09:47 crc kubenswrapper[4733]: I1206 06:09:47.302076 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cv6cs" 
event={"ID":"65e92cbb-17e3-4f63-a6e2-237c0482e87b","Type":"ContainerStarted","Data":"7841530e9d7999c714cc57cee7f70f4142910234c767fa3f09fa27a661da1d81"} Dec 06 06:09:47 crc kubenswrapper[4733]: I1206 06:09:47.331099 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-cv6cs" podStartSLOduration=1.835806867 podStartE2EDuration="3.33107673s" podCreationTimestamp="2025-12-06 06:09:44 +0000 UTC" firstStartedPulling="2025-12-06 06:09:45.275014908 +0000 UTC m=+1569.140226018" lastFinishedPulling="2025-12-06 06:09:46.77028477 +0000 UTC m=+1570.635495881" observedRunningTime="2025-12-06 06:09:47.324267768 +0000 UTC m=+1571.189478879" watchObservedRunningTime="2025-12-06 06:09:47.33107673 +0000 UTC m=+1571.196287841" Dec 06 06:09:48 crc kubenswrapper[4733]: I1206 06:09:48.340766 4733 generic.go:334] "Generic (PLEG): container finished" podID="833ce9dd-3791-4da1-9f16-fb8db6d4c205" containerID="4e4984f24eec946ab22cc61278c48ebcb9636ddc7265373c10b7cde7d0f41b5a" exitCode=0 Dec 06 06:09:48 crc kubenswrapper[4733]: I1206 06:09:48.340882 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-t7mgk" event={"ID":"833ce9dd-3791-4da1-9f16-fb8db6d4c205","Type":"ContainerDied","Data":"4e4984f24eec946ab22cc61278c48ebcb9636ddc7265373c10b7cde7d0f41b5a"} Dec 06 06:09:48 crc kubenswrapper[4733]: I1206 06:09:48.903597 4733 scope.go:117] "RemoveContainer" containerID="4afeeabadadf2184b6fa6d39c3633532a3b3159105f408e610e36b8c04a6e386" Dec 06 06:09:48 crc kubenswrapper[4733]: I1206 06:09:48.930191 4733 scope.go:117] "RemoveContainer" containerID="2ce800bc38e2b250f951b941ae6274be8ddb46bea923a9310929d86a39df6ed7" Dec 06 06:09:48 crc kubenswrapper[4733]: I1206 06:09:48.979275 4733 scope.go:117] "RemoveContainer" containerID="f5bf65b4769e24ee50da36f59cce71691a86fb0235331637b7918c385a4b3d03" Dec 06 06:09:49 crc kubenswrapper[4733]: I1206 06:09:49.012684 4733 scope.go:117] 
"RemoveContainer" containerID="8054c6b414f7564652019519e29f98b783b0de4560cf4c7e536de2c7c65e204c" Dec 06 06:09:49 crc kubenswrapper[4733]: I1206 06:09:49.059337 4733 scope.go:117] "RemoveContainer" containerID="1892bac900c092ddf75c756cd581f5c719f1c7dfff71b720507688405fdd4c4d" Dec 06 06:09:49 crc kubenswrapper[4733]: I1206 06:09:49.080838 4733 scope.go:117] "RemoveContainer" containerID="5750aba6e8db5431cdd23b1d4e950f8f205015ea4f37fbeff6cd344a29a4bae1" Dec 06 06:09:49 crc kubenswrapper[4733]: I1206 06:09:49.129398 4733 scope.go:117] "RemoveContainer" containerID="a1e14f2c214fdf14361a7ec6ee8f1a2fa6578de72e8cb11775b194710d8b0ca5" Dec 06 06:09:49 crc kubenswrapper[4733]: I1206 06:09:49.176205 4733 scope.go:117] "RemoveContainer" containerID="19206309fec04412ba761c6c207a159b6dc569bf853150b4cec2ce4759bb33da" Dec 06 06:09:49 crc kubenswrapper[4733]: I1206 06:09:49.193497 4733 scope.go:117] "RemoveContainer" containerID="fd8358502a2ecea7390a3030de99a87d178869de8721351327526783c42c28ad" Dec 06 06:09:49 crc kubenswrapper[4733]: I1206 06:09:49.633244 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-t7mgk" Dec 06 06:09:49 crc kubenswrapper[4733]: I1206 06:09:49.826844 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/833ce9dd-3791-4da1-9f16-fb8db6d4c205-inventory\") pod \"833ce9dd-3791-4da1-9f16-fb8db6d4c205\" (UID: \"833ce9dd-3791-4da1-9f16-fb8db6d4c205\") " Dec 06 06:09:49 crc kubenswrapper[4733]: I1206 06:09:49.826886 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/833ce9dd-3791-4da1-9f16-fb8db6d4c205-ssh-key\") pod \"833ce9dd-3791-4da1-9f16-fb8db6d4c205\" (UID: \"833ce9dd-3791-4da1-9f16-fb8db6d4c205\") " Dec 06 06:09:49 crc kubenswrapper[4733]: I1206 06:09:49.826916 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plg46\" (UniqueName: \"kubernetes.io/projected/833ce9dd-3791-4da1-9f16-fb8db6d4c205-kube-api-access-plg46\") pod \"833ce9dd-3791-4da1-9f16-fb8db6d4c205\" (UID: \"833ce9dd-3791-4da1-9f16-fb8db6d4c205\") " Dec 06 06:09:49 crc kubenswrapper[4733]: I1206 06:09:49.833427 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/833ce9dd-3791-4da1-9f16-fb8db6d4c205-kube-api-access-plg46" (OuterVolumeSpecName: "kube-api-access-plg46") pod "833ce9dd-3791-4da1-9f16-fb8db6d4c205" (UID: "833ce9dd-3791-4da1-9f16-fb8db6d4c205"). InnerVolumeSpecName "kube-api-access-plg46". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:09:49 crc kubenswrapper[4733]: I1206 06:09:49.854639 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/833ce9dd-3791-4da1-9f16-fb8db6d4c205-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "833ce9dd-3791-4da1-9f16-fb8db6d4c205" (UID: "833ce9dd-3791-4da1-9f16-fb8db6d4c205"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:09:49 crc kubenswrapper[4733]: I1206 06:09:49.856077 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/833ce9dd-3791-4da1-9f16-fb8db6d4c205-inventory" (OuterVolumeSpecName: "inventory") pod "833ce9dd-3791-4da1-9f16-fb8db6d4c205" (UID: "833ce9dd-3791-4da1-9f16-fb8db6d4c205"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:09:49 crc kubenswrapper[4733]: I1206 06:09:49.930167 4733 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/833ce9dd-3791-4da1-9f16-fb8db6d4c205-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 06:09:49 crc kubenswrapper[4733]: I1206 06:09:49.930206 4733 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/833ce9dd-3791-4da1-9f16-fb8db6d4c205-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 06:09:49 crc kubenswrapper[4733]: I1206 06:09:49.930218 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plg46\" (UniqueName: \"kubernetes.io/projected/833ce9dd-3791-4da1-9f16-fb8db6d4c205-kube-api-access-plg46\") on node \"crc\" DevicePath \"\"" Dec 06 06:09:50 crc kubenswrapper[4733]: I1206 06:09:50.379979 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-t7mgk" event={"ID":"833ce9dd-3791-4da1-9f16-fb8db6d4c205","Type":"ContainerDied","Data":"16081a58243206ba6a96b010751b6b0d7771a24c8e51eff32123511bd60b4821"} Dec 06 06:09:50 crc kubenswrapper[4733]: I1206 06:09:50.380240 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16081a58243206ba6a96b010751b6b0d7771a24c8e51eff32123511bd60b4821" Dec 06 06:09:50 crc kubenswrapper[4733]: I1206 06:09:50.380064 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-t7mgk" Dec 06 06:09:50 crc kubenswrapper[4733]: I1206 06:09:50.436456 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nndq6"] Dec 06 06:09:50 crc kubenswrapper[4733]: E1206 06:09:50.437028 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="833ce9dd-3791-4da1-9f16-fb8db6d4c205" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 06 06:09:50 crc kubenswrapper[4733]: I1206 06:09:50.437058 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="833ce9dd-3791-4da1-9f16-fb8db6d4c205" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 06 06:09:50 crc kubenswrapper[4733]: I1206 06:09:50.437377 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="833ce9dd-3791-4da1-9f16-fb8db6d4c205" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 06 06:09:50 crc kubenswrapper[4733]: I1206 06:09:50.438270 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nndq6" Dec 06 06:09:50 crc kubenswrapper[4733]: I1206 06:09:50.441173 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 06 06:09:50 crc kubenswrapper[4733]: I1206 06:09:50.442091 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7jxr9" Dec 06 06:09:50 crc kubenswrapper[4733]: I1206 06:09:50.444736 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 06 06:09:50 crc kubenswrapper[4733]: I1206 06:09:50.447524 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 06 06:09:50 crc kubenswrapper[4733]: I1206 06:09:50.457156 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nndq6"] Dec 06 06:09:50 crc kubenswrapper[4733]: I1206 06:09:50.543641 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqsvw\" (UniqueName: \"kubernetes.io/projected/f455cdaa-f9af-41b7-8bb3-379d347251ef-kube-api-access-mqsvw\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-nndq6\" (UID: \"f455cdaa-f9af-41b7-8bb3-379d347251ef\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nndq6" Dec 06 06:09:50 crc kubenswrapper[4733]: I1206 06:09:50.543722 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f455cdaa-f9af-41b7-8bb3-379d347251ef-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-nndq6\" (UID: \"f455cdaa-f9af-41b7-8bb3-379d347251ef\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nndq6" Dec 06 06:09:50 crc kubenswrapper[4733]: I1206 
06:09:50.544025 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f455cdaa-f9af-41b7-8bb3-379d347251ef-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-nndq6\" (UID: \"f455cdaa-f9af-41b7-8bb3-379d347251ef\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nndq6" Dec 06 06:09:50 crc kubenswrapper[4733]: I1206 06:09:50.646104 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f455cdaa-f9af-41b7-8bb3-379d347251ef-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-nndq6\" (UID: \"f455cdaa-f9af-41b7-8bb3-379d347251ef\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nndq6" Dec 06 06:09:50 crc kubenswrapper[4733]: I1206 06:09:50.646221 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f455cdaa-f9af-41b7-8bb3-379d347251ef-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-nndq6\" (UID: \"f455cdaa-f9af-41b7-8bb3-379d347251ef\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nndq6" Dec 06 06:09:50 crc kubenswrapper[4733]: I1206 06:09:50.646395 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqsvw\" (UniqueName: \"kubernetes.io/projected/f455cdaa-f9af-41b7-8bb3-379d347251ef-kube-api-access-mqsvw\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-nndq6\" (UID: \"f455cdaa-f9af-41b7-8bb3-379d347251ef\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nndq6" Dec 06 06:09:50 crc kubenswrapper[4733]: I1206 06:09:50.650606 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f455cdaa-f9af-41b7-8bb3-379d347251ef-ssh-key\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-nndq6\" (UID: \"f455cdaa-f9af-41b7-8bb3-379d347251ef\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nndq6" Dec 06 06:09:50 crc kubenswrapper[4733]: I1206 06:09:50.653048 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f455cdaa-f9af-41b7-8bb3-379d347251ef-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-nndq6\" (UID: \"f455cdaa-f9af-41b7-8bb3-379d347251ef\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nndq6" Dec 06 06:09:50 crc kubenswrapper[4733]: I1206 06:09:50.662503 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqsvw\" (UniqueName: \"kubernetes.io/projected/f455cdaa-f9af-41b7-8bb3-379d347251ef-kube-api-access-mqsvw\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-nndq6\" (UID: \"f455cdaa-f9af-41b7-8bb3-379d347251ef\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nndq6" Dec 06 06:09:50 crc kubenswrapper[4733]: I1206 06:09:50.759455 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nndq6" Dec 06 06:09:51 crc kubenswrapper[4733]: I1206 06:09:51.248034 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nndq6"] Dec 06 06:09:51 crc kubenswrapper[4733]: I1206 06:09:51.392739 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nndq6" event={"ID":"f455cdaa-f9af-41b7-8bb3-379d347251ef","Type":"ContainerStarted","Data":"96a16e61e40d01480cd84e94d0b522a559363128ffe78aefb779b607ec4ae71c"} Dec 06 06:09:52 crc kubenswrapper[4733]: I1206 06:09:52.402535 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nndq6" event={"ID":"f455cdaa-f9af-41b7-8bb3-379d347251ef","Type":"ContainerStarted","Data":"3ebb0625368d47f9b085ddd365e3a70fc09f349a7f3b27dd985db8400bba22ec"} Dec 06 06:09:52 crc kubenswrapper[4733]: I1206 06:09:52.428162 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nndq6" podStartSLOduration=1.9615413739999998 podStartE2EDuration="2.428140794s" podCreationTimestamp="2025-12-06 06:09:50 +0000 UTC" firstStartedPulling="2025-12-06 06:09:51.253455618 +0000 UTC m=+1575.118666729" lastFinishedPulling="2025-12-06 06:09:51.720055038 +0000 UTC m=+1575.585266149" observedRunningTime="2025-12-06 06:09:52.419444442 +0000 UTC m=+1576.284655553" watchObservedRunningTime="2025-12-06 06:09:52.428140794 +0000 UTC m=+1576.293351905" Dec 06 06:09:53 crc kubenswrapper[4733]: I1206 06:09:53.039623 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-vfkdm"] Dec 06 06:09:53 crc kubenswrapper[4733]: I1206 06:09:53.051830 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-8ffc7"] Dec 06 06:09:53 crc 
kubenswrapper[4733]: I1206 06:09:53.058361 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-vfkdm"] Dec 06 06:09:53 crc kubenswrapper[4733]: I1206 06:09:53.066052 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-8ffc7"] Dec 06 06:09:53 crc kubenswrapper[4733]: I1206 06:09:53.071105 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-n8dgk"] Dec 06 06:09:53 crc kubenswrapper[4733]: I1206 06:09:53.076251 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-n8dgk"] Dec 06 06:09:54 crc kubenswrapper[4733]: I1206 06:09:54.494575 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="363efbb6-18f2-440b-bffd-f64dee6a3af7" path="/var/lib/kubelet/pods/363efbb6-18f2-440b-bffd-f64dee6a3af7/volumes" Dec 06 06:09:54 crc kubenswrapper[4733]: I1206 06:09:54.495600 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c" path="/var/lib/kubelet/pods/3c36ac6b-3ce1-48ac-a97a-0b2c7a5e988c/volumes" Dec 06 06:09:54 crc kubenswrapper[4733]: I1206 06:09:54.496363 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="707d8771-4a40-42e7-b4bd-c2a9090126f0" path="/var/lib/kubelet/pods/707d8771-4a40-42e7-b4bd-c2a9090126f0/volumes" Dec 06 06:09:54 crc kubenswrapper[4733]: I1206 06:09:54.496912 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-cv6cs" Dec 06 06:09:54 crc kubenswrapper[4733]: I1206 06:09:54.496950 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-cv6cs" Dec 06 06:09:54 crc kubenswrapper[4733]: I1206 06:09:54.540465 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-cv6cs" Dec 06 06:09:55 crc kubenswrapper[4733]: I1206 06:09:55.031246 4733 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-4twth"] Dec 06 06:09:55 crc kubenswrapper[4733]: I1206 06:09:55.036677 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-4twth"] Dec 06 06:09:55 crc kubenswrapper[4733]: I1206 06:09:55.467753 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cv6cs" Dec 06 06:09:55 crc kubenswrapper[4733]: I1206 06:09:55.525331 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cv6cs"] Dec 06 06:09:56 crc kubenswrapper[4733]: I1206 06:09:56.551759 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a886728-ea9b-485c-844d-964614315b0d" path="/var/lib/kubelet/pods/6a886728-ea9b-485c-844d-964614315b0d/volumes" Dec 06 06:09:57 crc kubenswrapper[4733]: I1206 06:09:57.451841 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-cv6cs" podUID="65e92cbb-17e3-4f63-a6e2-237c0482e87b" containerName="registry-server" containerID="cri-o://7841530e9d7999c714cc57cee7f70f4142910234c767fa3f09fa27a661da1d81" gracePeriod=2 Dec 06 06:09:57 crc kubenswrapper[4733]: I1206 06:09:57.866048 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cv6cs" Dec 06 06:09:58 crc kubenswrapper[4733]: I1206 06:09:58.018436 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8497p\" (UniqueName: \"kubernetes.io/projected/65e92cbb-17e3-4f63-a6e2-237c0482e87b-kube-api-access-8497p\") pod \"65e92cbb-17e3-4f63-a6e2-237c0482e87b\" (UID: \"65e92cbb-17e3-4f63-a6e2-237c0482e87b\") " Dec 06 06:09:58 crc kubenswrapper[4733]: I1206 06:09:58.018615 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65e92cbb-17e3-4f63-a6e2-237c0482e87b-utilities\") pod \"65e92cbb-17e3-4f63-a6e2-237c0482e87b\" (UID: \"65e92cbb-17e3-4f63-a6e2-237c0482e87b\") " Dec 06 06:09:58 crc kubenswrapper[4733]: I1206 06:09:58.018841 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65e92cbb-17e3-4f63-a6e2-237c0482e87b-catalog-content\") pod \"65e92cbb-17e3-4f63-a6e2-237c0482e87b\" (UID: \"65e92cbb-17e3-4f63-a6e2-237c0482e87b\") " Dec 06 06:09:58 crc kubenswrapper[4733]: I1206 06:09:58.024095 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65e92cbb-17e3-4f63-a6e2-237c0482e87b-kube-api-access-8497p" (OuterVolumeSpecName: "kube-api-access-8497p") pod "65e92cbb-17e3-4f63-a6e2-237c0482e87b" (UID: "65e92cbb-17e3-4f63-a6e2-237c0482e87b"). InnerVolumeSpecName "kube-api-access-8497p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:09:58 crc kubenswrapper[4733]: I1206 06:09:58.025399 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65e92cbb-17e3-4f63-a6e2-237c0482e87b-utilities" (OuterVolumeSpecName: "utilities") pod "65e92cbb-17e3-4f63-a6e2-237c0482e87b" (UID: "65e92cbb-17e3-4f63-a6e2-237c0482e87b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:09:58 crc kubenswrapper[4733]: I1206 06:09:58.055050 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65e92cbb-17e3-4f63-a6e2-237c0482e87b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "65e92cbb-17e3-4f63-a6e2-237c0482e87b" (UID: "65e92cbb-17e3-4f63-a6e2-237c0482e87b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:09:58 crc kubenswrapper[4733]: I1206 06:09:58.121821 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8497p\" (UniqueName: \"kubernetes.io/projected/65e92cbb-17e3-4f63-a6e2-237c0482e87b-kube-api-access-8497p\") on node \"crc\" DevicePath \"\"" Dec 06 06:09:58 crc kubenswrapper[4733]: I1206 06:09:58.121939 4733 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65e92cbb-17e3-4f63-a6e2-237c0482e87b-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 06:09:58 crc kubenswrapper[4733]: I1206 06:09:58.122000 4733 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65e92cbb-17e3-4f63-a6e2-237c0482e87b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 06:09:58 crc kubenswrapper[4733]: I1206 06:09:58.463658 4733 generic.go:334] "Generic (PLEG): container finished" podID="65e92cbb-17e3-4f63-a6e2-237c0482e87b" containerID="7841530e9d7999c714cc57cee7f70f4142910234c767fa3f09fa27a661da1d81" exitCode=0 Dec 06 06:09:58 crc kubenswrapper[4733]: I1206 06:09:58.463757 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cv6cs" Dec 06 06:09:58 crc kubenswrapper[4733]: I1206 06:09:58.463788 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cv6cs" event={"ID":"65e92cbb-17e3-4f63-a6e2-237c0482e87b","Type":"ContainerDied","Data":"7841530e9d7999c714cc57cee7f70f4142910234c767fa3f09fa27a661da1d81"} Dec 06 06:09:58 crc kubenswrapper[4733]: I1206 06:09:58.464730 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cv6cs" event={"ID":"65e92cbb-17e3-4f63-a6e2-237c0482e87b","Type":"ContainerDied","Data":"8645fc30068c19937ea7e6ad12745d9919890c8ab4c728e68530086ae7ff3689"} Dec 06 06:09:58 crc kubenswrapper[4733]: I1206 06:09:58.464796 4733 scope.go:117] "RemoveContainer" containerID="7841530e9d7999c714cc57cee7f70f4142910234c767fa3f09fa27a661da1d81" Dec 06 06:09:58 crc kubenswrapper[4733]: I1206 06:09:58.489694 4733 scope.go:117] "RemoveContainer" containerID="fb13a07d188e2b059a4cb710c009c8c069db289646a815e25755686df605a7ff" Dec 06 06:09:58 crc kubenswrapper[4733]: I1206 06:09:58.508806 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cv6cs"] Dec 06 06:09:58 crc kubenswrapper[4733]: I1206 06:09:58.515831 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-cv6cs"] Dec 06 06:09:58 crc kubenswrapper[4733]: I1206 06:09:58.527248 4733 scope.go:117] "RemoveContainer" containerID="7b800de36d903151cbbfe3cc8283c1301ad98f9f52a4f30fec8f5cd7a1674ff8" Dec 06 06:09:58 crc kubenswrapper[4733]: I1206 06:09:58.565936 4733 scope.go:117] "RemoveContainer" containerID="7841530e9d7999c714cc57cee7f70f4142910234c767fa3f09fa27a661da1d81" Dec 06 06:09:58 crc kubenswrapper[4733]: E1206 06:09:58.566975 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"7841530e9d7999c714cc57cee7f70f4142910234c767fa3f09fa27a661da1d81\": container with ID starting with 7841530e9d7999c714cc57cee7f70f4142910234c767fa3f09fa27a661da1d81 not found: ID does not exist" containerID="7841530e9d7999c714cc57cee7f70f4142910234c767fa3f09fa27a661da1d81" Dec 06 06:09:58 crc kubenswrapper[4733]: I1206 06:09:58.567006 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7841530e9d7999c714cc57cee7f70f4142910234c767fa3f09fa27a661da1d81"} err="failed to get container status \"7841530e9d7999c714cc57cee7f70f4142910234c767fa3f09fa27a661da1d81\": rpc error: code = NotFound desc = could not find container \"7841530e9d7999c714cc57cee7f70f4142910234c767fa3f09fa27a661da1d81\": container with ID starting with 7841530e9d7999c714cc57cee7f70f4142910234c767fa3f09fa27a661da1d81 not found: ID does not exist" Dec 06 06:09:58 crc kubenswrapper[4733]: I1206 06:09:58.567028 4733 scope.go:117] "RemoveContainer" containerID="fb13a07d188e2b059a4cb710c009c8c069db289646a815e25755686df605a7ff" Dec 06 06:09:58 crc kubenswrapper[4733]: E1206 06:09:58.567335 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb13a07d188e2b059a4cb710c009c8c069db289646a815e25755686df605a7ff\": container with ID starting with fb13a07d188e2b059a4cb710c009c8c069db289646a815e25755686df605a7ff not found: ID does not exist" containerID="fb13a07d188e2b059a4cb710c009c8c069db289646a815e25755686df605a7ff" Dec 06 06:09:58 crc kubenswrapper[4733]: I1206 06:09:58.567353 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb13a07d188e2b059a4cb710c009c8c069db289646a815e25755686df605a7ff"} err="failed to get container status \"fb13a07d188e2b059a4cb710c009c8c069db289646a815e25755686df605a7ff\": rpc error: code = NotFound desc = could not find container \"fb13a07d188e2b059a4cb710c009c8c069db289646a815e25755686df605a7ff\": container with ID 
starting with fb13a07d188e2b059a4cb710c009c8c069db289646a815e25755686df605a7ff not found: ID does not exist" Dec 06 06:09:58 crc kubenswrapper[4733]: I1206 06:09:58.567367 4733 scope.go:117] "RemoveContainer" containerID="7b800de36d903151cbbfe3cc8283c1301ad98f9f52a4f30fec8f5cd7a1674ff8" Dec 06 06:09:58 crc kubenswrapper[4733]: E1206 06:09:58.567995 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b800de36d903151cbbfe3cc8283c1301ad98f9f52a4f30fec8f5cd7a1674ff8\": container with ID starting with 7b800de36d903151cbbfe3cc8283c1301ad98f9f52a4f30fec8f5cd7a1674ff8 not found: ID does not exist" containerID="7b800de36d903151cbbfe3cc8283c1301ad98f9f52a4f30fec8f5cd7a1674ff8" Dec 06 06:09:58 crc kubenswrapper[4733]: I1206 06:09:58.568016 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b800de36d903151cbbfe3cc8283c1301ad98f9f52a4f30fec8f5cd7a1674ff8"} err="failed to get container status \"7b800de36d903151cbbfe3cc8283c1301ad98f9f52a4f30fec8f5cd7a1674ff8\": rpc error: code = NotFound desc = could not find container \"7b800de36d903151cbbfe3cc8283c1301ad98f9f52a4f30fec8f5cd7a1674ff8\": container with ID starting with 7b800de36d903151cbbfe3cc8283c1301ad98f9f52a4f30fec8f5cd7a1674ff8 not found: ID does not exist" Dec 06 06:10:00 crc kubenswrapper[4733]: I1206 06:10:00.494291 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65e92cbb-17e3-4f63-a6e2-237c0482e87b" path="/var/lib/kubelet/pods/65e92cbb-17e3-4f63-a6e2-237c0482e87b/volumes" Dec 06 06:10:39 crc kubenswrapper[4733]: I1206 06:10:39.459370 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4cr8r"] Dec 06 06:10:39 crc kubenswrapper[4733]: E1206 06:10:39.460214 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65e92cbb-17e3-4f63-a6e2-237c0482e87b" containerName="extract-content" Dec 06 06:10:39 crc 
kubenswrapper[4733]: I1206 06:10:39.460229 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="65e92cbb-17e3-4f63-a6e2-237c0482e87b" containerName="extract-content" Dec 06 06:10:39 crc kubenswrapper[4733]: E1206 06:10:39.460240 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65e92cbb-17e3-4f63-a6e2-237c0482e87b" containerName="registry-server" Dec 06 06:10:39 crc kubenswrapper[4733]: I1206 06:10:39.460247 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="65e92cbb-17e3-4f63-a6e2-237c0482e87b" containerName="registry-server" Dec 06 06:10:39 crc kubenswrapper[4733]: E1206 06:10:39.460265 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65e92cbb-17e3-4f63-a6e2-237c0482e87b" containerName="extract-utilities" Dec 06 06:10:39 crc kubenswrapper[4733]: I1206 06:10:39.460272 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="65e92cbb-17e3-4f63-a6e2-237c0482e87b" containerName="extract-utilities" Dec 06 06:10:39 crc kubenswrapper[4733]: I1206 06:10:39.460451 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="65e92cbb-17e3-4f63-a6e2-237c0482e87b" containerName="registry-server" Dec 06 06:10:39 crc kubenswrapper[4733]: I1206 06:10:39.461609 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4cr8r" Dec 06 06:10:39 crc kubenswrapper[4733]: I1206 06:10:39.478450 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4cr8r"] Dec 06 06:10:39 crc kubenswrapper[4733]: I1206 06:10:39.560364 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1e0202d-72c4-4b30-ae27-ed0477613d09-utilities\") pod \"redhat-operators-4cr8r\" (UID: \"a1e0202d-72c4-4b30-ae27-ed0477613d09\") " pod="openshift-marketplace/redhat-operators-4cr8r" Dec 06 06:10:39 crc kubenswrapper[4733]: I1206 06:10:39.560453 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1e0202d-72c4-4b30-ae27-ed0477613d09-catalog-content\") pod \"redhat-operators-4cr8r\" (UID: \"a1e0202d-72c4-4b30-ae27-ed0477613d09\") " pod="openshift-marketplace/redhat-operators-4cr8r" Dec 06 06:10:39 crc kubenswrapper[4733]: I1206 06:10:39.560724 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nnrx\" (UniqueName: \"kubernetes.io/projected/a1e0202d-72c4-4b30-ae27-ed0477613d09-kube-api-access-8nnrx\") pod \"redhat-operators-4cr8r\" (UID: \"a1e0202d-72c4-4b30-ae27-ed0477613d09\") " pod="openshift-marketplace/redhat-operators-4cr8r" Dec 06 06:10:39 crc kubenswrapper[4733]: I1206 06:10:39.662761 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1e0202d-72c4-4b30-ae27-ed0477613d09-utilities\") pod \"redhat-operators-4cr8r\" (UID: \"a1e0202d-72c4-4b30-ae27-ed0477613d09\") " pod="openshift-marketplace/redhat-operators-4cr8r" Dec 06 06:10:39 crc kubenswrapper[4733]: I1206 06:10:39.663136 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1e0202d-72c4-4b30-ae27-ed0477613d09-catalog-content\") pod \"redhat-operators-4cr8r\" (UID: \"a1e0202d-72c4-4b30-ae27-ed0477613d09\") " pod="openshift-marketplace/redhat-operators-4cr8r" Dec 06 06:10:39 crc kubenswrapper[4733]: I1206 06:10:39.663183 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1e0202d-72c4-4b30-ae27-ed0477613d09-utilities\") pod \"redhat-operators-4cr8r\" (UID: \"a1e0202d-72c4-4b30-ae27-ed0477613d09\") " pod="openshift-marketplace/redhat-operators-4cr8r" Dec 06 06:10:39 crc kubenswrapper[4733]: I1206 06:10:39.663372 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nnrx\" (UniqueName: \"kubernetes.io/projected/a1e0202d-72c4-4b30-ae27-ed0477613d09-kube-api-access-8nnrx\") pod \"redhat-operators-4cr8r\" (UID: \"a1e0202d-72c4-4b30-ae27-ed0477613d09\") " pod="openshift-marketplace/redhat-operators-4cr8r" Dec 06 06:10:39 crc kubenswrapper[4733]: I1206 06:10:39.663861 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1e0202d-72c4-4b30-ae27-ed0477613d09-catalog-content\") pod \"redhat-operators-4cr8r\" (UID: \"a1e0202d-72c4-4b30-ae27-ed0477613d09\") " pod="openshift-marketplace/redhat-operators-4cr8r" Dec 06 06:10:39 crc kubenswrapper[4733]: I1206 06:10:39.681414 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nnrx\" (UniqueName: \"kubernetes.io/projected/a1e0202d-72c4-4b30-ae27-ed0477613d09-kube-api-access-8nnrx\") pod \"redhat-operators-4cr8r\" (UID: \"a1e0202d-72c4-4b30-ae27-ed0477613d09\") " pod="openshift-marketplace/redhat-operators-4cr8r" Dec 06 06:10:39 crc kubenswrapper[4733]: I1206 06:10:39.793582 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4cr8r" Dec 06 06:10:40 crc kubenswrapper[4733]: I1206 06:10:40.199017 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4cr8r"] Dec 06 06:10:40 crc kubenswrapper[4733]: I1206 06:10:40.906023 4733 generic.go:334] "Generic (PLEG): container finished" podID="a1e0202d-72c4-4b30-ae27-ed0477613d09" containerID="1910904dd27641995100807b014f1e9c8f09ce0f945e0ab817fcf4a29955d363" exitCode=0 Dec 06 06:10:40 crc kubenswrapper[4733]: I1206 06:10:40.906124 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4cr8r" event={"ID":"a1e0202d-72c4-4b30-ae27-ed0477613d09","Type":"ContainerDied","Data":"1910904dd27641995100807b014f1e9c8f09ce0f945e0ab817fcf4a29955d363"} Dec 06 06:10:40 crc kubenswrapper[4733]: I1206 06:10:40.907147 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4cr8r" event={"ID":"a1e0202d-72c4-4b30-ae27-ed0477613d09","Type":"ContainerStarted","Data":"608a0c584b0aa3511c31450ffc65954ebdc0722b0db5b826bc38414aaefd3bc1"} Dec 06 06:10:41 crc kubenswrapper[4733]: I1206 06:10:41.921506 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4cr8r" event={"ID":"a1e0202d-72c4-4b30-ae27-ed0477613d09","Type":"ContainerStarted","Data":"29534f21d69f67a702308ea1c06401059d808c8c58771b531fb0c95fcd60cf10"} Dec 06 06:10:42 crc kubenswrapper[4733]: I1206 06:10:42.989470 4733 patch_prober.go:28] interesting pod/machine-config-daemon-g7qjx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 06:10:42 crc kubenswrapper[4733]: I1206 06:10:42.989900 4733 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 06:10:43 crc kubenswrapper[4733]: I1206 06:10:43.944284 4733 generic.go:334] "Generic (PLEG): container finished" podID="a1e0202d-72c4-4b30-ae27-ed0477613d09" containerID="29534f21d69f67a702308ea1c06401059d808c8c58771b531fb0c95fcd60cf10" exitCode=0 Dec 06 06:10:43 crc kubenswrapper[4733]: I1206 06:10:43.944350 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4cr8r" event={"ID":"a1e0202d-72c4-4b30-ae27-ed0477613d09","Type":"ContainerDied","Data":"29534f21d69f67a702308ea1c06401059d808c8c58771b531fb0c95fcd60cf10"} Dec 06 06:10:44 crc kubenswrapper[4733]: I1206 06:10:44.959208 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4cr8r" event={"ID":"a1e0202d-72c4-4b30-ae27-ed0477613d09","Type":"ContainerStarted","Data":"ba9bfa5ffeb5ea2eb9e43bd28b292bbf64582acce5dcb9f5bc19be3d95611291"} Dec 06 06:10:44 crc kubenswrapper[4733]: I1206 06:10:44.982361 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4cr8r" podStartSLOduration=2.428847377 podStartE2EDuration="5.982342678s" podCreationTimestamp="2025-12-06 06:10:39 +0000 UTC" firstStartedPulling="2025-12-06 06:10:40.9081454 +0000 UTC m=+1624.773356511" lastFinishedPulling="2025-12-06 06:10:44.461640701 +0000 UTC m=+1628.326851812" observedRunningTime="2025-12-06 06:10:44.977417627 +0000 UTC m=+1628.842628738" watchObservedRunningTime="2025-12-06 06:10:44.982342678 +0000 UTC m=+1628.847553789" Dec 06 06:10:45 crc kubenswrapper[4733]: I1206 06:10:45.971017 4733 generic.go:334] "Generic (PLEG): container finished" podID="f455cdaa-f9af-41b7-8bb3-379d347251ef" 
containerID="3ebb0625368d47f9b085ddd365e3a70fc09f349a7f3b27dd985db8400bba22ec" exitCode=0 Dec 06 06:10:45 crc kubenswrapper[4733]: I1206 06:10:45.971103 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nndq6" event={"ID":"f455cdaa-f9af-41b7-8bb3-379d347251ef","Type":"ContainerDied","Data":"3ebb0625368d47f9b085ddd365e3a70fc09f349a7f3b27dd985db8400bba22ec"} Dec 06 06:10:47 crc kubenswrapper[4733]: I1206 06:10:47.332766 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nndq6" Dec 06 06:10:47 crc kubenswrapper[4733]: I1206 06:10:47.407080 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqsvw\" (UniqueName: \"kubernetes.io/projected/f455cdaa-f9af-41b7-8bb3-379d347251ef-kube-api-access-mqsvw\") pod \"f455cdaa-f9af-41b7-8bb3-379d347251ef\" (UID: \"f455cdaa-f9af-41b7-8bb3-379d347251ef\") " Dec 06 06:10:47 crc kubenswrapper[4733]: I1206 06:10:47.407326 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f455cdaa-f9af-41b7-8bb3-379d347251ef-inventory\") pod \"f455cdaa-f9af-41b7-8bb3-379d347251ef\" (UID: \"f455cdaa-f9af-41b7-8bb3-379d347251ef\") " Dec 06 06:10:47 crc kubenswrapper[4733]: I1206 06:10:47.407569 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f455cdaa-f9af-41b7-8bb3-379d347251ef-ssh-key\") pod \"f455cdaa-f9af-41b7-8bb3-379d347251ef\" (UID: \"f455cdaa-f9af-41b7-8bb3-379d347251ef\") " Dec 06 06:10:47 crc kubenswrapper[4733]: I1206 06:10:47.415692 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f455cdaa-f9af-41b7-8bb3-379d347251ef-kube-api-access-mqsvw" (OuterVolumeSpecName: "kube-api-access-mqsvw") pod 
"f455cdaa-f9af-41b7-8bb3-379d347251ef" (UID: "f455cdaa-f9af-41b7-8bb3-379d347251ef"). InnerVolumeSpecName "kube-api-access-mqsvw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:10:47 crc kubenswrapper[4733]: I1206 06:10:47.430634 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f455cdaa-f9af-41b7-8bb3-379d347251ef-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f455cdaa-f9af-41b7-8bb3-379d347251ef" (UID: "f455cdaa-f9af-41b7-8bb3-379d347251ef"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:10:47 crc kubenswrapper[4733]: I1206 06:10:47.431572 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f455cdaa-f9af-41b7-8bb3-379d347251ef-inventory" (OuterVolumeSpecName: "inventory") pod "f455cdaa-f9af-41b7-8bb3-379d347251ef" (UID: "f455cdaa-f9af-41b7-8bb3-379d347251ef"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:10:47 crc kubenswrapper[4733]: I1206 06:10:47.510459 4733 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f455cdaa-f9af-41b7-8bb3-379d347251ef-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 06:10:47 crc kubenswrapper[4733]: I1206 06:10:47.510496 4733 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f455cdaa-f9af-41b7-8bb3-379d347251ef-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 06:10:47 crc kubenswrapper[4733]: I1206 06:10:47.510506 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqsvw\" (UniqueName: \"kubernetes.io/projected/f455cdaa-f9af-41b7-8bb3-379d347251ef-kube-api-access-mqsvw\") on node \"crc\" DevicePath \"\"" Dec 06 06:10:47 crc kubenswrapper[4733]: I1206 06:10:47.996009 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nndq6" event={"ID":"f455cdaa-f9af-41b7-8bb3-379d347251ef","Type":"ContainerDied","Data":"96a16e61e40d01480cd84e94d0b522a559363128ffe78aefb779b607ec4ae71c"} Dec 06 06:10:47 crc kubenswrapper[4733]: I1206 06:10:47.996068 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96a16e61e40d01480cd84e94d0b522a559363128ffe78aefb779b607ec4ae71c" Dec 06 06:10:47 crc kubenswrapper[4733]: I1206 06:10:47.996087 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nndq6" Dec 06 06:10:48 crc kubenswrapper[4733]: I1206 06:10:48.063730 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pw8sd"] Dec 06 06:10:48 crc kubenswrapper[4733]: E1206 06:10:48.064177 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f455cdaa-f9af-41b7-8bb3-379d347251ef" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 06 06:10:48 crc kubenswrapper[4733]: I1206 06:10:48.064199 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="f455cdaa-f9af-41b7-8bb3-379d347251ef" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 06 06:10:48 crc kubenswrapper[4733]: I1206 06:10:48.064685 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="f455cdaa-f9af-41b7-8bb3-379d347251ef" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 06 06:10:48 crc kubenswrapper[4733]: I1206 06:10:48.067617 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pw8sd" Dec 06 06:10:48 crc kubenswrapper[4733]: I1206 06:10:48.069738 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7jxr9" Dec 06 06:10:48 crc kubenswrapper[4733]: I1206 06:10:48.070054 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 06 06:10:48 crc kubenswrapper[4733]: I1206 06:10:48.070196 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 06 06:10:48 crc kubenswrapper[4733]: I1206 06:10:48.071098 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 06 06:10:48 crc kubenswrapper[4733]: I1206 06:10:48.072672 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pw8sd"] Dec 06 06:10:48 crc kubenswrapper[4733]: I1206 06:10:48.123233 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/27d67df7-7cb0-4c5b-ba49-00d9285e1e11-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-pw8sd\" (UID: \"27d67df7-7cb0-4c5b-ba49-00d9285e1e11\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pw8sd" Dec 06 06:10:48 crc kubenswrapper[4733]: I1206 06:10:48.123285 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/27d67df7-7cb0-4c5b-ba49-00d9285e1e11-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-pw8sd\" (UID: \"27d67df7-7cb0-4c5b-ba49-00d9285e1e11\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pw8sd" Dec 06 06:10:48 crc kubenswrapper[4733]: I1206 06:10:48.123632 4733 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbxgz\" (UniqueName: \"kubernetes.io/projected/27d67df7-7cb0-4c5b-ba49-00d9285e1e11-kube-api-access-qbxgz\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-pw8sd\" (UID: \"27d67df7-7cb0-4c5b-ba49-00d9285e1e11\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pw8sd" Dec 06 06:10:48 crc kubenswrapper[4733]: I1206 06:10:48.224904 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbxgz\" (UniqueName: \"kubernetes.io/projected/27d67df7-7cb0-4c5b-ba49-00d9285e1e11-kube-api-access-qbxgz\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-pw8sd\" (UID: \"27d67df7-7cb0-4c5b-ba49-00d9285e1e11\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pw8sd" Dec 06 06:10:48 crc kubenswrapper[4733]: I1206 06:10:48.225215 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/27d67df7-7cb0-4c5b-ba49-00d9285e1e11-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-pw8sd\" (UID: \"27d67df7-7cb0-4c5b-ba49-00d9285e1e11\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pw8sd" Dec 06 06:10:48 crc kubenswrapper[4733]: I1206 06:10:48.225241 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/27d67df7-7cb0-4c5b-ba49-00d9285e1e11-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-pw8sd\" (UID: \"27d67df7-7cb0-4c5b-ba49-00d9285e1e11\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pw8sd" Dec 06 06:10:48 crc kubenswrapper[4733]: I1206 06:10:48.228500 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/27d67df7-7cb0-4c5b-ba49-00d9285e1e11-ssh-key\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-pw8sd\" (UID: \"27d67df7-7cb0-4c5b-ba49-00d9285e1e11\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pw8sd" Dec 06 06:10:48 crc kubenswrapper[4733]: I1206 06:10:48.229281 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/27d67df7-7cb0-4c5b-ba49-00d9285e1e11-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-pw8sd\" (UID: \"27d67df7-7cb0-4c5b-ba49-00d9285e1e11\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pw8sd" Dec 06 06:10:48 crc kubenswrapper[4733]: I1206 06:10:48.241496 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbxgz\" (UniqueName: \"kubernetes.io/projected/27d67df7-7cb0-4c5b-ba49-00d9285e1e11-kube-api-access-qbxgz\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-pw8sd\" (UID: \"27d67df7-7cb0-4c5b-ba49-00d9285e1e11\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pw8sd" Dec 06 06:10:48 crc kubenswrapper[4733]: I1206 06:10:48.388103 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pw8sd" Dec 06 06:10:48 crc kubenswrapper[4733]: I1206 06:10:48.919364 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pw8sd"] Dec 06 06:10:49 crc kubenswrapper[4733]: I1206 06:10:49.007444 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pw8sd" event={"ID":"27d67df7-7cb0-4c5b-ba49-00d9285e1e11","Type":"ContainerStarted","Data":"2ca3a27d1a739105bea292341e7884965ec67e406d6b9fc28564c709a5c90a25"} Dec 06 06:10:49 crc kubenswrapper[4733]: I1206 06:10:49.389526 4733 scope.go:117] "RemoveContainer" containerID="5507ac46044992ff14b7680c2d1b081f93275189c934a95c96b480bc08b4e154" Dec 06 06:10:49 crc kubenswrapper[4733]: I1206 06:10:49.439789 4733 scope.go:117] "RemoveContainer" containerID="2d7bf11943f17abd016cd025b1a22748e5ae393cd0598a7b1659f8b3a6fa6f3c" Dec 06 06:10:49 crc kubenswrapper[4733]: I1206 06:10:49.464656 4733 scope.go:117] "RemoveContainer" containerID="d1f5a6ed6b68bb0115a0cd1a5a5f9ab57234e6da84cef20890527aef598c0df9" Dec 06 06:10:49 crc kubenswrapper[4733]: I1206 06:10:49.555415 4733 scope.go:117] "RemoveContainer" containerID="f3c5a5a99dadaafae87ae1fdd011992e264df1cfa144803ecf8b4458d2785aad" Dec 06 06:10:49 crc kubenswrapper[4733]: I1206 06:10:49.794017 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4cr8r" Dec 06 06:10:49 crc kubenswrapper[4733]: I1206 06:10:49.794082 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4cr8r" Dec 06 06:10:49 crc kubenswrapper[4733]: I1206 06:10:49.837034 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4cr8r" Dec 06 06:10:50 crc kubenswrapper[4733]: I1206 06:10:50.020559 4733 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pw8sd" event={"ID":"27d67df7-7cb0-4c5b-ba49-00d9285e1e11","Type":"ContainerStarted","Data":"afb6c7c2e07c4fcab5ba52c28ed143713b198b03046b48a373ccb105b749d653"} Dec 06 06:10:50 crc kubenswrapper[4733]: I1206 06:10:50.043992 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pw8sd" podStartSLOduration=1.515836525 podStartE2EDuration="2.043974755s" podCreationTimestamp="2025-12-06 06:10:48 +0000 UTC" firstStartedPulling="2025-12-06 06:10:48.924056825 +0000 UTC m=+1632.789267936" lastFinishedPulling="2025-12-06 06:10:49.452195055 +0000 UTC m=+1633.317406166" observedRunningTime="2025-12-06 06:10:50.033287751 +0000 UTC m=+1633.898498862" watchObservedRunningTime="2025-12-06 06:10:50.043974755 +0000 UTC m=+1633.909185866" Dec 06 06:10:50 crc kubenswrapper[4733]: I1206 06:10:50.066340 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4cr8r" Dec 06 06:10:50 crc kubenswrapper[4733]: I1206 06:10:50.112607 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4cr8r"] Dec 06 06:10:52 crc kubenswrapper[4733]: I1206 06:10:52.040225 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4cr8r" podUID="a1e0202d-72c4-4b30-ae27-ed0477613d09" containerName="registry-server" containerID="cri-o://ba9bfa5ffeb5ea2eb9e43bd28b292bbf64582acce5dcb9f5bc19be3d95611291" gracePeriod=2 Dec 06 06:10:52 crc kubenswrapper[4733]: I1206 06:10:52.538180 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4cr8r" Dec 06 06:10:52 crc kubenswrapper[4733]: I1206 06:10:52.711835 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1e0202d-72c4-4b30-ae27-ed0477613d09-utilities\") pod \"a1e0202d-72c4-4b30-ae27-ed0477613d09\" (UID: \"a1e0202d-72c4-4b30-ae27-ed0477613d09\") " Dec 06 06:10:52 crc kubenswrapper[4733]: I1206 06:10:52.712142 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1e0202d-72c4-4b30-ae27-ed0477613d09-catalog-content\") pod \"a1e0202d-72c4-4b30-ae27-ed0477613d09\" (UID: \"a1e0202d-72c4-4b30-ae27-ed0477613d09\") " Dec 06 06:10:52 crc kubenswrapper[4733]: I1206 06:10:52.712316 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nnrx\" (UniqueName: \"kubernetes.io/projected/a1e0202d-72c4-4b30-ae27-ed0477613d09-kube-api-access-8nnrx\") pod \"a1e0202d-72c4-4b30-ae27-ed0477613d09\" (UID: \"a1e0202d-72c4-4b30-ae27-ed0477613d09\") " Dec 06 06:10:52 crc kubenswrapper[4733]: I1206 06:10:52.713287 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1e0202d-72c4-4b30-ae27-ed0477613d09-utilities" (OuterVolumeSpecName: "utilities") pod "a1e0202d-72c4-4b30-ae27-ed0477613d09" (UID: "a1e0202d-72c4-4b30-ae27-ed0477613d09"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:10:52 crc kubenswrapper[4733]: I1206 06:10:52.714620 4733 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1e0202d-72c4-4b30-ae27-ed0477613d09-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 06:10:52 crc kubenswrapper[4733]: I1206 06:10:52.718018 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1e0202d-72c4-4b30-ae27-ed0477613d09-kube-api-access-8nnrx" (OuterVolumeSpecName: "kube-api-access-8nnrx") pod "a1e0202d-72c4-4b30-ae27-ed0477613d09" (UID: "a1e0202d-72c4-4b30-ae27-ed0477613d09"). InnerVolumeSpecName "kube-api-access-8nnrx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:10:52 crc kubenswrapper[4733]: I1206 06:10:52.800522 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1e0202d-72c4-4b30-ae27-ed0477613d09-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a1e0202d-72c4-4b30-ae27-ed0477613d09" (UID: "a1e0202d-72c4-4b30-ae27-ed0477613d09"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:10:52 crc kubenswrapper[4733]: I1206 06:10:52.817252 4733 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1e0202d-72c4-4b30-ae27-ed0477613d09-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 06:10:52 crc kubenswrapper[4733]: I1206 06:10:52.817284 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nnrx\" (UniqueName: \"kubernetes.io/projected/a1e0202d-72c4-4b30-ae27-ed0477613d09-kube-api-access-8nnrx\") on node \"crc\" DevicePath \"\"" Dec 06 06:10:53 crc kubenswrapper[4733]: I1206 06:10:53.050592 4733 generic.go:334] "Generic (PLEG): container finished" podID="a1e0202d-72c4-4b30-ae27-ed0477613d09" containerID="ba9bfa5ffeb5ea2eb9e43bd28b292bbf64582acce5dcb9f5bc19be3d95611291" exitCode=0 Dec 06 06:10:53 crc kubenswrapper[4733]: I1206 06:10:53.050663 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4cr8r" event={"ID":"a1e0202d-72c4-4b30-ae27-ed0477613d09","Type":"ContainerDied","Data":"ba9bfa5ffeb5ea2eb9e43bd28b292bbf64582acce5dcb9f5bc19be3d95611291"} Dec 06 06:10:53 crc kubenswrapper[4733]: I1206 06:10:53.051931 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4cr8r" event={"ID":"a1e0202d-72c4-4b30-ae27-ed0477613d09","Type":"ContainerDied","Data":"608a0c584b0aa3511c31450ffc65954ebdc0722b0db5b826bc38414aaefd3bc1"} Dec 06 06:10:53 crc kubenswrapper[4733]: I1206 06:10:53.050704 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4cr8r" Dec 06 06:10:53 crc kubenswrapper[4733]: I1206 06:10:53.051983 4733 scope.go:117] "RemoveContainer" containerID="ba9bfa5ffeb5ea2eb9e43bd28b292bbf64582acce5dcb9f5bc19be3d95611291" Dec 06 06:10:53 crc kubenswrapper[4733]: I1206 06:10:53.083131 4733 scope.go:117] "RemoveContainer" containerID="29534f21d69f67a702308ea1c06401059d808c8c58771b531fb0c95fcd60cf10" Dec 06 06:10:53 crc kubenswrapper[4733]: I1206 06:10:53.113170 4733 scope.go:117] "RemoveContainer" containerID="1910904dd27641995100807b014f1e9c8f09ce0f945e0ab817fcf4a29955d363" Dec 06 06:10:53 crc kubenswrapper[4733]: I1206 06:10:53.115856 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4cr8r"] Dec 06 06:10:53 crc kubenswrapper[4733]: I1206 06:10:53.122327 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4cr8r"] Dec 06 06:10:53 crc kubenswrapper[4733]: I1206 06:10:53.142279 4733 scope.go:117] "RemoveContainer" containerID="ba9bfa5ffeb5ea2eb9e43bd28b292bbf64582acce5dcb9f5bc19be3d95611291" Dec 06 06:10:53 crc kubenswrapper[4733]: E1206 06:10:53.143343 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba9bfa5ffeb5ea2eb9e43bd28b292bbf64582acce5dcb9f5bc19be3d95611291\": container with ID starting with ba9bfa5ffeb5ea2eb9e43bd28b292bbf64582acce5dcb9f5bc19be3d95611291 not found: ID does not exist" containerID="ba9bfa5ffeb5ea2eb9e43bd28b292bbf64582acce5dcb9f5bc19be3d95611291" Dec 06 06:10:53 crc kubenswrapper[4733]: I1206 06:10:53.143389 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba9bfa5ffeb5ea2eb9e43bd28b292bbf64582acce5dcb9f5bc19be3d95611291"} err="failed to get container status \"ba9bfa5ffeb5ea2eb9e43bd28b292bbf64582acce5dcb9f5bc19be3d95611291\": rpc error: code = NotFound desc = could not find container 
\"ba9bfa5ffeb5ea2eb9e43bd28b292bbf64582acce5dcb9f5bc19be3d95611291\": container with ID starting with ba9bfa5ffeb5ea2eb9e43bd28b292bbf64582acce5dcb9f5bc19be3d95611291 not found: ID does not exist" Dec 06 06:10:53 crc kubenswrapper[4733]: I1206 06:10:53.143420 4733 scope.go:117] "RemoveContainer" containerID="29534f21d69f67a702308ea1c06401059d808c8c58771b531fb0c95fcd60cf10" Dec 06 06:10:53 crc kubenswrapper[4733]: E1206 06:10:53.143884 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29534f21d69f67a702308ea1c06401059d808c8c58771b531fb0c95fcd60cf10\": container with ID starting with 29534f21d69f67a702308ea1c06401059d808c8c58771b531fb0c95fcd60cf10 not found: ID does not exist" containerID="29534f21d69f67a702308ea1c06401059d808c8c58771b531fb0c95fcd60cf10" Dec 06 06:10:53 crc kubenswrapper[4733]: I1206 06:10:53.143919 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29534f21d69f67a702308ea1c06401059d808c8c58771b531fb0c95fcd60cf10"} err="failed to get container status \"29534f21d69f67a702308ea1c06401059d808c8c58771b531fb0c95fcd60cf10\": rpc error: code = NotFound desc = could not find container \"29534f21d69f67a702308ea1c06401059d808c8c58771b531fb0c95fcd60cf10\": container with ID starting with 29534f21d69f67a702308ea1c06401059d808c8c58771b531fb0c95fcd60cf10 not found: ID does not exist" Dec 06 06:10:53 crc kubenswrapper[4733]: I1206 06:10:53.143951 4733 scope.go:117] "RemoveContainer" containerID="1910904dd27641995100807b014f1e9c8f09ce0f945e0ab817fcf4a29955d363" Dec 06 06:10:53 crc kubenswrapper[4733]: E1206 06:10:53.144320 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1910904dd27641995100807b014f1e9c8f09ce0f945e0ab817fcf4a29955d363\": container with ID starting with 1910904dd27641995100807b014f1e9c8f09ce0f945e0ab817fcf4a29955d363 not found: ID does not exist" 
containerID="1910904dd27641995100807b014f1e9c8f09ce0f945e0ab817fcf4a29955d363" Dec 06 06:10:53 crc kubenswrapper[4733]: I1206 06:10:53.144355 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1910904dd27641995100807b014f1e9c8f09ce0f945e0ab817fcf4a29955d363"} err="failed to get container status \"1910904dd27641995100807b014f1e9c8f09ce0f945e0ab817fcf4a29955d363\": rpc error: code = NotFound desc = could not find container \"1910904dd27641995100807b014f1e9c8f09ce0f945e0ab817fcf4a29955d363\": container with ID starting with 1910904dd27641995100807b014f1e9c8f09ce0f945e0ab817fcf4a29955d363 not found: ID does not exist" Dec 06 06:10:53 crc kubenswrapper[4733]: E1206 06:10:53.200046 4733 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1e0202d_72c4_4b30_ae27_ed0477613d09.slice/crio-608a0c584b0aa3511c31450ffc65954ebdc0722b0db5b826bc38414aaefd3bc1\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1e0202d_72c4_4b30_ae27_ed0477613d09.slice\": RecentStats: unable to find data in memory cache]" Dec 06 06:10:54 crc kubenswrapper[4733]: I1206 06:10:54.065133 4733 generic.go:334] "Generic (PLEG): container finished" podID="27d67df7-7cb0-4c5b-ba49-00d9285e1e11" containerID="afb6c7c2e07c4fcab5ba52c28ed143713b198b03046b48a373ccb105b749d653" exitCode=0 Dec 06 06:10:54 crc kubenswrapper[4733]: I1206 06:10:54.065232 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pw8sd" event={"ID":"27d67df7-7cb0-4c5b-ba49-00d9285e1e11","Type":"ContainerDied","Data":"afb6c7c2e07c4fcab5ba52c28ed143713b198b03046b48a373ccb105b749d653"} Dec 06 06:10:54 crc kubenswrapper[4733]: I1206 06:10:54.494439 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="a1e0202d-72c4-4b30-ae27-ed0477613d09" path="/var/lib/kubelet/pods/a1e0202d-72c4-4b30-ae27-ed0477613d09/volumes" Dec 06 06:10:55 crc kubenswrapper[4733]: I1206 06:10:55.413094 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pw8sd" Dec 06 06:10:55 crc kubenswrapper[4733]: I1206 06:10:55.572098 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/27d67df7-7cb0-4c5b-ba49-00d9285e1e11-inventory\") pod \"27d67df7-7cb0-4c5b-ba49-00d9285e1e11\" (UID: \"27d67df7-7cb0-4c5b-ba49-00d9285e1e11\") " Dec 06 06:10:55 crc kubenswrapper[4733]: I1206 06:10:55.572173 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/27d67df7-7cb0-4c5b-ba49-00d9285e1e11-ssh-key\") pod \"27d67df7-7cb0-4c5b-ba49-00d9285e1e11\" (UID: \"27d67df7-7cb0-4c5b-ba49-00d9285e1e11\") " Dec 06 06:10:55 crc kubenswrapper[4733]: I1206 06:10:55.572392 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbxgz\" (UniqueName: \"kubernetes.io/projected/27d67df7-7cb0-4c5b-ba49-00d9285e1e11-kube-api-access-qbxgz\") pod \"27d67df7-7cb0-4c5b-ba49-00d9285e1e11\" (UID: \"27d67df7-7cb0-4c5b-ba49-00d9285e1e11\") " Dec 06 06:10:55 crc kubenswrapper[4733]: I1206 06:10:55.584702 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27d67df7-7cb0-4c5b-ba49-00d9285e1e11-kube-api-access-qbxgz" (OuterVolumeSpecName: "kube-api-access-qbxgz") pod "27d67df7-7cb0-4c5b-ba49-00d9285e1e11" (UID: "27d67df7-7cb0-4c5b-ba49-00d9285e1e11"). InnerVolumeSpecName "kube-api-access-qbxgz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:10:55 crc kubenswrapper[4733]: I1206 06:10:55.601250 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27d67df7-7cb0-4c5b-ba49-00d9285e1e11-inventory" (OuterVolumeSpecName: "inventory") pod "27d67df7-7cb0-4c5b-ba49-00d9285e1e11" (UID: "27d67df7-7cb0-4c5b-ba49-00d9285e1e11"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:10:55 crc kubenswrapper[4733]: I1206 06:10:55.601930 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27d67df7-7cb0-4c5b-ba49-00d9285e1e11-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "27d67df7-7cb0-4c5b-ba49-00d9285e1e11" (UID: "27d67df7-7cb0-4c5b-ba49-00d9285e1e11"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:10:55 crc kubenswrapper[4733]: I1206 06:10:55.676018 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qbxgz\" (UniqueName: \"kubernetes.io/projected/27d67df7-7cb0-4c5b-ba49-00d9285e1e11-kube-api-access-qbxgz\") on node \"crc\" DevicePath \"\"" Dec 06 06:10:55 crc kubenswrapper[4733]: I1206 06:10:55.676228 4733 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/27d67df7-7cb0-4c5b-ba49-00d9285e1e11-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 06:10:55 crc kubenswrapper[4733]: I1206 06:10:55.676238 4733 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/27d67df7-7cb0-4c5b-ba49-00d9285e1e11-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 06:10:56 crc kubenswrapper[4733]: I1206 06:10:56.088145 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pw8sd" 
event={"ID":"27d67df7-7cb0-4c5b-ba49-00d9285e1e11","Type":"ContainerDied","Data":"2ca3a27d1a739105bea292341e7884965ec67e406d6b9fc28564c709a5c90a25"} Dec 06 06:10:56 crc kubenswrapper[4733]: I1206 06:10:56.088203 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pw8sd" Dec 06 06:10:56 crc kubenswrapper[4733]: I1206 06:10:56.088207 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ca3a27d1a739105bea292341e7884965ec67e406d6b9fc28564c709a5c90a25" Dec 06 06:10:56 crc kubenswrapper[4733]: I1206 06:10:56.146215 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-kk2mg"] Dec 06 06:10:56 crc kubenswrapper[4733]: E1206 06:10:56.146635 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1e0202d-72c4-4b30-ae27-ed0477613d09" containerName="extract-content" Dec 06 06:10:56 crc kubenswrapper[4733]: I1206 06:10:56.146654 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1e0202d-72c4-4b30-ae27-ed0477613d09" containerName="extract-content" Dec 06 06:10:56 crc kubenswrapper[4733]: E1206 06:10:56.146679 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1e0202d-72c4-4b30-ae27-ed0477613d09" containerName="registry-server" Dec 06 06:10:56 crc kubenswrapper[4733]: I1206 06:10:56.146686 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1e0202d-72c4-4b30-ae27-ed0477613d09" containerName="registry-server" Dec 06 06:10:56 crc kubenswrapper[4733]: E1206 06:10:56.146698 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1e0202d-72c4-4b30-ae27-ed0477613d09" containerName="extract-utilities" Dec 06 06:10:56 crc kubenswrapper[4733]: I1206 06:10:56.146704 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1e0202d-72c4-4b30-ae27-ed0477613d09" containerName="extract-utilities" Dec 06 06:10:56 crc 
kubenswrapper[4733]: E1206 06:10:56.146721 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27d67df7-7cb0-4c5b-ba49-00d9285e1e11" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 06 06:10:56 crc kubenswrapper[4733]: I1206 06:10:56.146728 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="27d67df7-7cb0-4c5b-ba49-00d9285e1e11" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 06 06:10:56 crc kubenswrapper[4733]: I1206 06:10:56.146902 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="27d67df7-7cb0-4c5b-ba49-00d9285e1e11" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 06 06:10:56 crc kubenswrapper[4733]: I1206 06:10:56.146922 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1e0202d-72c4-4b30-ae27-ed0477613d09" containerName="registry-server" Dec 06 06:10:56 crc kubenswrapper[4733]: I1206 06:10:56.159039 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kk2mg" Dec 06 06:10:56 crc kubenswrapper[4733]: I1206 06:10:56.164821 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 06 06:10:56 crc kubenswrapper[4733]: I1206 06:10:56.165562 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7jxr9" Dec 06 06:10:56 crc kubenswrapper[4733]: I1206 06:10:56.165791 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 06 06:10:56 crc kubenswrapper[4733]: I1206 06:10:56.165791 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 06 06:10:56 crc kubenswrapper[4733]: I1206 06:10:56.181891 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-kk2mg"] Dec 06 06:10:56 crc kubenswrapper[4733]: I1206 06:10:56.300910 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d0b060af-fc34-4b9f-ad66-0ebcd23e5146-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kk2mg\" (UID: \"d0b060af-fc34-4b9f-ad66-0ebcd23e5146\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kk2mg" Dec 06 06:10:56 crc kubenswrapper[4733]: I1206 06:10:56.301092 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d0b060af-fc34-4b9f-ad66-0ebcd23e5146-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kk2mg\" (UID: \"d0b060af-fc34-4b9f-ad66-0ebcd23e5146\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kk2mg" Dec 06 06:10:56 crc kubenswrapper[4733]: I1206 06:10:56.301127 4733 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m78qk\" (UniqueName: \"kubernetes.io/projected/d0b060af-fc34-4b9f-ad66-0ebcd23e5146-kube-api-access-m78qk\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kk2mg\" (UID: \"d0b060af-fc34-4b9f-ad66-0ebcd23e5146\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kk2mg" Dec 06 06:10:56 crc kubenswrapper[4733]: I1206 06:10:56.402904 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d0b060af-fc34-4b9f-ad66-0ebcd23e5146-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kk2mg\" (UID: \"d0b060af-fc34-4b9f-ad66-0ebcd23e5146\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kk2mg" Dec 06 06:10:56 crc kubenswrapper[4733]: I1206 06:10:56.403242 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m78qk\" (UniqueName: \"kubernetes.io/projected/d0b060af-fc34-4b9f-ad66-0ebcd23e5146-kube-api-access-m78qk\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kk2mg\" (UID: \"d0b060af-fc34-4b9f-ad66-0ebcd23e5146\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kk2mg" Dec 06 06:10:56 crc kubenswrapper[4733]: I1206 06:10:56.403544 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d0b060af-fc34-4b9f-ad66-0ebcd23e5146-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kk2mg\" (UID: \"d0b060af-fc34-4b9f-ad66-0ebcd23e5146\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kk2mg" Dec 06 06:10:56 crc kubenswrapper[4733]: I1206 06:10:56.407354 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d0b060af-fc34-4b9f-ad66-0ebcd23e5146-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kk2mg\" (UID: 
\"d0b060af-fc34-4b9f-ad66-0ebcd23e5146\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kk2mg" Dec 06 06:10:56 crc kubenswrapper[4733]: I1206 06:10:56.407370 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d0b060af-fc34-4b9f-ad66-0ebcd23e5146-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kk2mg\" (UID: \"d0b060af-fc34-4b9f-ad66-0ebcd23e5146\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kk2mg" Dec 06 06:10:56 crc kubenswrapper[4733]: I1206 06:10:56.419815 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m78qk\" (UniqueName: \"kubernetes.io/projected/d0b060af-fc34-4b9f-ad66-0ebcd23e5146-kube-api-access-m78qk\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kk2mg\" (UID: \"d0b060af-fc34-4b9f-ad66-0ebcd23e5146\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kk2mg" Dec 06 06:10:56 crc kubenswrapper[4733]: I1206 06:10:56.478518 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kk2mg" Dec 06 06:10:56 crc kubenswrapper[4733]: I1206 06:10:56.942315 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-kk2mg"] Dec 06 06:10:57 crc kubenswrapper[4733]: I1206 06:10:57.100465 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kk2mg" event={"ID":"d0b060af-fc34-4b9f-ad66-0ebcd23e5146","Type":"ContainerStarted","Data":"d803acc5835cc6e477420cc36f9be1c3cd3f23ce397a9b6048f52d926d6dd45e"} Dec 06 06:10:58 crc kubenswrapper[4733]: I1206 06:10:58.111012 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kk2mg" event={"ID":"d0b060af-fc34-4b9f-ad66-0ebcd23e5146","Type":"ContainerStarted","Data":"5ecebd4f5e2d02c3ddc0bd33a64448e4065825241a9df6741cdfcf028abdc5c3"} Dec 06 06:10:58 crc kubenswrapper[4733]: I1206 06:10:58.127559 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kk2mg" podStartSLOduration=1.6571819890000001 podStartE2EDuration="2.1275418s" podCreationTimestamp="2025-12-06 06:10:56 +0000 UTC" firstStartedPulling="2025-12-06 06:10:56.945428758 +0000 UTC m=+1640.810639869" lastFinishedPulling="2025-12-06 06:10:57.41578858 +0000 UTC m=+1641.280999680" observedRunningTime="2025-12-06 06:10:58.12651496 +0000 UTC m=+1641.991726070" watchObservedRunningTime="2025-12-06 06:10:58.1275418 +0000 UTC m=+1641.992752911" Dec 06 06:11:00 crc kubenswrapper[4733]: I1206 06:11:00.040834 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-7733-account-create-update-tvk6f"] Dec 06 06:11:00 crc kubenswrapper[4733]: I1206 06:11:00.051050 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-7733-account-create-update-tvk6f"] Dec 06 06:11:00 crc 
kubenswrapper[4733]: I1206 06:11:00.555622 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b88b7105-3d6e-44a7-965a-1fa56297537b" path="/var/lib/kubelet/pods/b88b7105-3d6e-44a7-965a-1fa56297537b/volumes" Dec 06 06:11:01 crc kubenswrapper[4733]: I1206 06:11:01.030576 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-pz8w7"] Dec 06 06:11:01 crc kubenswrapper[4733]: I1206 06:11:01.038898 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-0451-account-create-update-kldhp"] Dec 06 06:11:01 crc kubenswrapper[4733]: I1206 06:11:01.048051 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-pz8w7"] Dec 06 06:11:01 crc kubenswrapper[4733]: I1206 06:11:01.059578 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-csq6r"] Dec 06 06:11:01 crc kubenswrapper[4733]: I1206 06:11:01.067662 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-0451-account-create-update-kldhp"] Dec 06 06:11:01 crc kubenswrapper[4733]: I1206 06:11:01.074795 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-hn5h8"] Dec 06 06:11:01 crc kubenswrapper[4733]: I1206 06:11:01.079779 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-csq6r"] Dec 06 06:11:01 crc kubenswrapper[4733]: I1206 06:11:01.084599 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-hn5h8"] Dec 06 06:11:01 crc kubenswrapper[4733]: I1206 06:11:01.089428 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-be38-account-create-update-gbx8w"] Dec 06 06:11:01 crc kubenswrapper[4733]: I1206 06:11:01.094041 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-be38-account-create-update-gbx8w"] Dec 06 06:11:02 crc kubenswrapper[4733]: I1206 06:11:02.496954 4733 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="328eb573-b5aa-494f-bc1d-588c43edcb9f" path="/var/lib/kubelet/pods/328eb573-b5aa-494f-bc1d-588c43edcb9f/volumes" Dec 06 06:11:02 crc kubenswrapper[4733]: I1206 06:11:02.497616 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44d743aa-a7d8-4b31-a1f1-35ad3b97b8e8" path="/var/lib/kubelet/pods/44d743aa-a7d8-4b31-a1f1-35ad3b97b8e8/volumes" Dec 06 06:11:02 crc kubenswrapper[4733]: I1206 06:11:02.498477 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf31d052-0c6b-4d46-8d72-8c7e247aad00" path="/var/lib/kubelet/pods/bf31d052-0c6b-4d46-8d72-8c7e247aad00/volumes" Dec 06 06:11:02 crc kubenswrapper[4733]: I1206 06:11:02.499226 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e651b5d7-8b07-42b1-9189-af0896ea5a16" path="/var/lib/kubelet/pods/e651b5d7-8b07-42b1-9189-af0896ea5a16/volumes" Dec 06 06:11:02 crc kubenswrapper[4733]: I1206 06:11:02.499999 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee2e9091-1bb1-4930-bff8-35c3cced4104" path="/var/lib/kubelet/pods/ee2e9091-1bb1-4930-bff8-35c3cced4104/volumes" Dec 06 06:11:12 crc kubenswrapper[4733]: I1206 06:11:12.989099 4733 patch_prober.go:28] interesting pod/machine-config-daemon-g7qjx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 06:11:12 crc kubenswrapper[4733]: I1206 06:11:12.989758 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 06:11:23 crc kubenswrapper[4733]: E1206 06:11:23.872871 4733 
cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0b060af_fc34_4b9f_ad66_0ebcd23e5146.slice/crio-5ecebd4f5e2d02c3ddc0bd33a64448e4065825241a9df6741cdfcf028abdc5c3.scope\": RecentStats: unable to find data in memory cache]" Dec 06 06:11:24 crc kubenswrapper[4733]: I1206 06:11:24.380824 4733 generic.go:334] "Generic (PLEG): container finished" podID="d0b060af-fc34-4b9f-ad66-0ebcd23e5146" containerID="5ecebd4f5e2d02c3ddc0bd33a64448e4065825241a9df6741cdfcf028abdc5c3" exitCode=0 Dec 06 06:11:24 crc kubenswrapper[4733]: I1206 06:11:24.380898 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kk2mg" event={"ID":"d0b060af-fc34-4b9f-ad66-0ebcd23e5146","Type":"ContainerDied","Data":"5ecebd4f5e2d02c3ddc0bd33a64448e4065825241a9df6741cdfcf028abdc5c3"} Dec 06 06:11:25 crc kubenswrapper[4733]: I1206 06:11:25.746924 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kk2mg" Dec 06 06:11:25 crc kubenswrapper[4733]: I1206 06:11:25.919204 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m78qk\" (UniqueName: \"kubernetes.io/projected/d0b060af-fc34-4b9f-ad66-0ebcd23e5146-kube-api-access-m78qk\") pod \"d0b060af-fc34-4b9f-ad66-0ebcd23e5146\" (UID: \"d0b060af-fc34-4b9f-ad66-0ebcd23e5146\") " Dec 06 06:11:25 crc kubenswrapper[4733]: I1206 06:11:25.919688 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d0b060af-fc34-4b9f-ad66-0ebcd23e5146-inventory\") pod \"d0b060af-fc34-4b9f-ad66-0ebcd23e5146\" (UID: \"d0b060af-fc34-4b9f-ad66-0ebcd23e5146\") " Dec 06 06:11:25 crc kubenswrapper[4733]: I1206 06:11:25.919765 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d0b060af-fc34-4b9f-ad66-0ebcd23e5146-ssh-key\") pod \"d0b060af-fc34-4b9f-ad66-0ebcd23e5146\" (UID: \"d0b060af-fc34-4b9f-ad66-0ebcd23e5146\") " Dec 06 06:11:25 crc kubenswrapper[4733]: I1206 06:11:25.925932 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0b060af-fc34-4b9f-ad66-0ebcd23e5146-kube-api-access-m78qk" (OuterVolumeSpecName: "kube-api-access-m78qk") pod "d0b060af-fc34-4b9f-ad66-0ebcd23e5146" (UID: "d0b060af-fc34-4b9f-ad66-0ebcd23e5146"). InnerVolumeSpecName "kube-api-access-m78qk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:11:25 crc kubenswrapper[4733]: I1206 06:11:25.947134 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0b060af-fc34-4b9f-ad66-0ebcd23e5146-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d0b060af-fc34-4b9f-ad66-0ebcd23e5146" (UID: "d0b060af-fc34-4b9f-ad66-0ebcd23e5146"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:11:25 crc kubenswrapper[4733]: I1206 06:11:25.948042 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0b060af-fc34-4b9f-ad66-0ebcd23e5146-inventory" (OuterVolumeSpecName: "inventory") pod "d0b060af-fc34-4b9f-ad66-0ebcd23e5146" (UID: "d0b060af-fc34-4b9f-ad66-0ebcd23e5146"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:11:26 crc kubenswrapper[4733]: I1206 06:11:26.022881 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m78qk\" (UniqueName: \"kubernetes.io/projected/d0b060af-fc34-4b9f-ad66-0ebcd23e5146-kube-api-access-m78qk\") on node \"crc\" DevicePath \"\"" Dec 06 06:11:26 crc kubenswrapper[4733]: I1206 06:11:26.022919 4733 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d0b060af-fc34-4b9f-ad66-0ebcd23e5146-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 06:11:26 crc kubenswrapper[4733]: I1206 06:11:26.022931 4733 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d0b060af-fc34-4b9f-ad66-0ebcd23e5146-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 06:11:26 crc kubenswrapper[4733]: I1206 06:11:26.404686 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kk2mg" event={"ID":"d0b060af-fc34-4b9f-ad66-0ebcd23e5146","Type":"ContainerDied","Data":"d803acc5835cc6e477420cc36f9be1c3cd3f23ce397a9b6048f52d926d6dd45e"} Dec 06 06:11:26 crc kubenswrapper[4733]: I1206 06:11:26.404741 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d803acc5835cc6e477420cc36f9be1c3cd3f23ce397a9b6048f52d926d6dd45e" Dec 06 06:11:26 crc kubenswrapper[4733]: I1206 06:11:26.404808 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kk2mg" Dec 06 06:11:26 crc kubenswrapper[4733]: I1206 06:11:26.497018 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w268k"] Dec 06 06:11:26 crc kubenswrapper[4733]: E1206 06:11:26.497528 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0b060af-fc34-4b9f-ad66-0ebcd23e5146" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 06 06:11:26 crc kubenswrapper[4733]: I1206 06:11:26.497554 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0b060af-fc34-4b9f-ad66-0ebcd23e5146" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 06 06:11:26 crc kubenswrapper[4733]: I1206 06:11:26.497869 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0b060af-fc34-4b9f-ad66-0ebcd23e5146" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 06 06:11:26 crc kubenswrapper[4733]: I1206 06:11:26.501186 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w268k" Dec 06 06:11:26 crc kubenswrapper[4733]: I1206 06:11:26.505607 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 06 06:11:26 crc kubenswrapper[4733]: I1206 06:11:26.505813 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7jxr9" Dec 06 06:11:26 crc kubenswrapper[4733]: I1206 06:11:26.505939 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 06 06:11:26 crc kubenswrapper[4733]: I1206 06:11:26.507134 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 06 06:11:26 crc kubenswrapper[4733]: I1206 06:11:26.514944 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w268k"] Dec 06 06:11:26 crc kubenswrapper[4733]: I1206 06:11:26.530576 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt5xq\" (UniqueName: \"kubernetes.io/projected/2de963da-76cb-41fe-9761-8eb801b393a9-kube-api-access-pt5xq\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-w268k\" (UID: \"2de963da-76cb-41fe-9761-8eb801b393a9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w268k" Dec 06 06:11:26 crc kubenswrapper[4733]: I1206 06:11:26.530722 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2de963da-76cb-41fe-9761-8eb801b393a9-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-w268k\" (UID: \"2de963da-76cb-41fe-9761-8eb801b393a9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w268k" Dec 06 06:11:26 crc kubenswrapper[4733]: I1206 06:11:26.530755 4733 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2de963da-76cb-41fe-9761-8eb801b393a9-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-w268k\" (UID: \"2de963da-76cb-41fe-9761-8eb801b393a9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w268k" Dec 06 06:11:26 crc kubenswrapper[4733]: I1206 06:11:26.633560 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2de963da-76cb-41fe-9761-8eb801b393a9-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-w268k\" (UID: \"2de963da-76cb-41fe-9761-8eb801b393a9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w268k" Dec 06 06:11:26 crc kubenswrapper[4733]: I1206 06:11:26.633627 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2de963da-76cb-41fe-9761-8eb801b393a9-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-w268k\" (UID: \"2de963da-76cb-41fe-9761-8eb801b393a9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w268k" Dec 06 06:11:26 crc kubenswrapper[4733]: I1206 06:11:26.633751 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pt5xq\" (UniqueName: \"kubernetes.io/projected/2de963da-76cb-41fe-9761-8eb801b393a9-kube-api-access-pt5xq\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-w268k\" (UID: \"2de963da-76cb-41fe-9761-8eb801b393a9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w268k" Dec 06 06:11:26 crc kubenswrapper[4733]: I1206 06:11:26.638646 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2de963da-76cb-41fe-9761-8eb801b393a9-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-w268k\" (UID: 
\"2de963da-76cb-41fe-9761-8eb801b393a9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w268k" Dec 06 06:11:26 crc kubenswrapper[4733]: I1206 06:11:26.639614 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2de963da-76cb-41fe-9761-8eb801b393a9-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-w268k\" (UID: \"2de963da-76cb-41fe-9761-8eb801b393a9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w268k" Dec 06 06:11:26 crc kubenswrapper[4733]: I1206 06:11:26.650164 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt5xq\" (UniqueName: \"kubernetes.io/projected/2de963da-76cb-41fe-9761-8eb801b393a9-kube-api-access-pt5xq\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-w268k\" (UID: \"2de963da-76cb-41fe-9761-8eb801b393a9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w268k" Dec 06 06:11:26 crc kubenswrapper[4733]: I1206 06:11:26.828388 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w268k" Dec 06 06:11:27 crc kubenswrapper[4733]: I1206 06:11:27.043878 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-gn769"] Dec 06 06:11:27 crc kubenswrapper[4733]: I1206 06:11:27.051578 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-gn769"] Dec 06 06:11:27 crc kubenswrapper[4733]: I1206 06:11:27.325883 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w268k"] Dec 06 06:11:27 crc kubenswrapper[4733]: I1206 06:11:27.416905 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w268k" event={"ID":"2de963da-76cb-41fe-9761-8eb801b393a9","Type":"ContainerStarted","Data":"440f8e7eca67f17a56f7f2eb2618b94e232a6db45215c4555929eb81e68fef3e"} Dec 06 06:11:28 crc kubenswrapper[4733]: I1206 06:11:28.426843 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w268k" event={"ID":"2de963da-76cb-41fe-9761-8eb801b393a9","Type":"ContainerStarted","Data":"abbd65afd127dfb42044eddb11db92d4b1dc223b37d715631060c18541aa2dcd"} Dec 06 06:11:28 crc kubenswrapper[4733]: I1206 06:11:28.443864 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w268k" podStartSLOduration=1.9027730040000002 podStartE2EDuration="2.443771172s" podCreationTimestamp="2025-12-06 06:11:26 +0000 UTC" firstStartedPulling="2025-12-06 06:11:27.339000453 +0000 UTC m=+1671.204211565" lastFinishedPulling="2025-12-06 06:11:27.879998622 +0000 UTC m=+1671.745209733" observedRunningTime="2025-12-06 06:11:28.44259918 +0000 UTC m=+1672.307810290" watchObservedRunningTime="2025-12-06 06:11:28.443771172 +0000 UTC m=+1672.308982284" Dec 06 06:11:28 crc 
kubenswrapper[4733]: I1206 06:11:28.494397 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8505186d-6d93-4df5-8f82-4b7747029d79" path="/var/lib/kubelet/pods/8505186d-6d93-4df5-8f82-4b7747029d79/volumes" Dec 06 06:11:42 crc kubenswrapper[4733]: I1206 06:11:42.041191 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-22n6m"] Dec 06 06:11:42 crc kubenswrapper[4733]: I1206 06:11:42.063183 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-22n6m"] Dec 06 06:11:42 crc kubenswrapper[4733]: I1206 06:11:42.497630 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b31ce7f-2712-4b95-bc6a-c52f3e104e12" path="/var/lib/kubelet/pods/3b31ce7f-2712-4b95-bc6a-c52f3e104e12/volumes" Dec 06 06:11:42 crc kubenswrapper[4733]: I1206 06:11:42.989179 4733 patch_prober.go:28] interesting pod/machine-config-daemon-g7qjx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 06:11:42 crc kubenswrapper[4733]: I1206 06:11:42.989247 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 06:11:42 crc kubenswrapper[4733]: I1206 06:11:42.989297 4733 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" Dec 06 06:11:42 crc kubenswrapper[4733]: I1206 06:11:42.990069 4733 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"95a5206d8977bdf896771b7495437ec92f1082c61e752b99e7ba75dda3bd2a35"} pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 06:11:42 crc kubenswrapper[4733]: I1206 06:11:42.990134 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerName="machine-config-daemon" containerID="cri-o://95a5206d8977bdf896771b7495437ec92f1082c61e752b99e7ba75dda3bd2a35" gracePeriod=600 Dec 06 06:11:43 crc kubenswrapper[4733]: E1206 06:11:43.107868 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:11:43 crc kubenswrapper[4733]: I1206 06:11:43.585349 4733 generic.go:334] "Generic (PLEG): container finished" podID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerID="95a5206d8977bdf896771b7495437ec92f1082c61e752b99e7ba75dda3bd2a35" exitCode=0 Dec 06 06:11:43 crc kubenswrapper[4733]: I1206 06:11:43.585423 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" event={"ID":"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448","Type":"ContainerDied","Data":"95a5206d8977bdf896771b7495437ec92f1082c61e752b99e7ba75dda3bd2a35"} Dec 06 06:11:43 crc kubenswrapper[4733]: I1206 06:11:43.585472 4733 scope.go:117] "RemoveContainer" containerID="e047d5177f0f7fa0184a2c2e021a17064100f39c60892814195f10cb7a9620d9" Dec 06 06:11:43 crc kubenswrapper[4733]: I1206 06:11:43.586238 4733 
scope.go:117] "RemoveContainer" containerID="95a5206d8977bdf896771b7495437ec92f1082c61e752b99e7ba75dda3bd2a35" Dec 06 06:11:43 crc kubenswrapper[4733]: E1206 06:11:43.586667 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:11:44 crc kubenswrapper[4733]: I1206 06:11:44.032327 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-t7j5b"] Dec 06 06:11:44 crc kubenswrapper[4733]: I1206 06:11:44.044288 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-t7j5b"] Dec 06 06:11:44 crc kubenswrapper[4733]: I1206 06:11:44.495008 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3ff2c7d-6450-448f-8824-36d2b8ea0710" path="/var/lib/kubelet/pods/f3ff2c7d-6450-448f-8824-36d2b8ea0710/volumes" Dec 06 06:11:49 crc kubenswrapper[4733]: I1206 06:11:49.694136 4733 scope.go:117] "RemoveContainer" containerID="55126a0002463037c30d301c1e006f0c3f307c99c7c328c4d063526c14287153" Dec 06 06:11:49 crc kubenswrapper[4733]: I1206 06:11:49.714944 4733 scope.go:117] "RemoveContainer" containerID="b4d70349bed9c7c1e08cebcf962e114e20b1616ced83cf951b92d4a8bcedfe26" Dec 06 06:11:49 crc kubenswrapper[4733]: I1206 06:11:49.765885 4733 scope.go:117] "RemoveContainer" containerID="ec5863bb23282b5b1623a103e6b2063fa3ecdb799c7f528c47cef4548d78a72b" Dec 06 06:11:49 crc kubenswrapper[4733]: I1206 06:11:49.788091 4733 scope.go:117] "RemoveContainer" containerID="936611e4791a5c143add94e144fc299803bc0585f65fd4a155c0cf5a56e13e78" Dec 06 06:11:49 crc kubenswrapper[4733]: I1206 06:11:49.817687 4733 scope.go:117] "RemoveContainer" 
containerID="7d105fb40273c51290de37d9c39ef018dedc6bb34abbbb1a6b76800e15ad13c1" Dec 06 06:11:49 crc kubenswrapper[4733]: I1206 06:11:49.840455 4733 scope.go:117] "RemoveContainer" containerID="3bff34d0c82b79f2d616285fd496a585025729765a2518a41b386d095c548d8f" Dec 06 06:11:49 crc kubenswrapper[4733]: I1206 06:11:49.883117 4733 scope.go:117] "RemoveContainer" containerID="499e589efa01ec0ad5894246125f238587c9654498a502d3346649c989f42fb5" Dec 06 06:11:49 crc kubenswrapper[4733]: I1206 06:11:49.905744 4733 scope.go:117] "RemoveContainer" containerID="bb806fa724e81e863e82b0d9a73d29bde6697f64a38350cb9df031d384a0183e" Dec 06 06:11:49 crc kubenswrapper[4733]: I1206 06:11:49.954121 4733 scope.go:117] "RemoveContainer" containerID="1d6385669e87ea345500fce141ed8552836bf3b4451aeed09496748a392cb62d" Dec 06 06:11:55 crc kubenswrapper[4733]: I1206 06:11:55.485464 4733 scope.go:117] "RemoveContainer" containerID="95a5206d8977bdf896771b7495437ec92f1082c61e752b99e7ba75dda3bd2a35" Dec 06 06:11:55 crc kubenswrapper[4733]: E1206 06:11:55.486110 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:12:03 crc kubenswrapper[4733]: I1206 06:12:03.780164 4733 generic.go:334] "Generic (PLEG): container finished" podID="2de963da-76cb-41fe-9761-8eb801b393a9" containerID="abbd65afd127dfb42044eddb11db92d4b1dc223b37d715631060c18541aa2dcd" exitCode=0 Dec 06 06:12:03 crc kubenswrapper[4733]: I1206 06:12:03.780254 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w268k" 
event={"ID":"2de963da-76cb-41fe-9761-8eb801b393a9","Type":"ContainerDied","Data":"abbd65afd127dfb42044eddb11db92d4b1dc223b37d715631060c18541aa2dcd"} Dec 06 06:12:05 crc kubenswrapper[4733]: I1206 06:12:05.113765 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w268k" Dec 06 06:12:05 crc kubenswrapper[4733]: I1206 06:12:05.152752 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2de963da-76cb-41fe-9761-8eb801b393a9-inventory\") pod \"2de963da-76cb-41fe-9761-8eb801b393a9\" (UID: \"2de963da-76cb-41fe-9761-8eb801b393a9\") " Dec 06 06:12:05 crc kubenswrapper[4733]: I1206 06:12:05.152829 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2de963da-76cb-41fe-9761-8eb801b393a9-ssh-key\") pod \"2de963da-76cb-41fe-9761-8eb801b393a9\" (UID: \"2de963da-76cb-41fe-9761-8eb801b393a9\") " Dec 06 06:12:05 crc kubenswrapper[4733]: I1206 06:12:05.152905 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pt5xq\" (UniqueName: \"kubernetes.io/projected/2de963da-76cb-41fe-9761-8eb801b393a9-kube-api-access-pt5xq\") pod \"2de963da-76cb-41fe-9761-8eb801b393a9\" (UID: \"2de963da-76cb-41fe-9761-8eb801b393a9\") " Dec 06 06:12:05 crc kubenswrapper[4733]: I1206 06:12:05.160437 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2de963da-76cb-41fe-9761-8eb801b393a9-kube-api-access-pt5xq" (OuterVolumeSpecName: "kube-api-access-pt5xq") pod "2de963da-76cb-41fe-9761-8eb801b393a9" (UID: "2de963da-76cb-41fe-9761-8eb801b393a9"). InnerVolumeSpecName "kube-api-access-pt5xq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:12:05 crc kubenswrapper[4733]: I1206 06:12:05.179396 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2de963da-76cb-41fe-9761-8eb801b393a9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2de963da-76cb-41fe-9761-8eb801b393a9" (UID: "2de963da-76cb-41fe-9761-8eb801b393a9"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:12:05 crc kubenswrapper[4733]: I1206 06:12:05.181102 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2de963da-76cb-41fe-9761-8eb801b393a9-inventory" (OuterVolumeSpecName: "inventory") pod "2de963da-76cb-41fe-9761-8eb801b393a9" (UID: "2de963da-76cb-41fe-9761-8eb801b393a9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:12:05 crc kubenswrapper[4733]: I1206 06:12:05.256242 4733 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2de963da-76cb-41fe-9761-8eb801b393a9-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 06:12:05 crc kubenswrapper[4733]: I1206 06:12:05.256284 4733 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2de963da-76cb-41fe-9761-8eb801b393a9-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 06:12:05 crc kubenswrapper[4733]: I1206 06:12:05.256294 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pt5xq\" (UniqueName: \"kubernetes.io/projected/2de963da-76cb-41fe-9761-8eb801b393a9-kube-api-access-pt5xq\") on node \"crc\" DevicePath \"\"" Dec 06 06:12:05 crc kubenswrapper[4733]: I1206 06:12:05.806202 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w268k" 
event={"ID":"2de963da-76cb-41fe-9761-8eb801b393a9","Type":"ContainerDied","Data":"440f8e7eca67f17a56f7f2eb2618b94e232a6db45215c4555929eb81e68fef3e"} Dec 06 06:12:05 crc kubenswrapper[4733]: I1206 06:12:05.806521 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="440f8e7eca67f17a56f7f2eb2618b94e232a6db45215c4555929eb81e68fef3e" Dec 06 06:12:05 crc kubenswrapper[4733]: I1206 06:12:05.806274 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w268k" Dec 06 06:12:05 crc kubenswrapper[4733]: I1206 06:12:05.878527 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-54h4l"] Dec 06 06:12:05 crc kubenswrapper[4733]: E1206 06:12:05.879196 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2de963da-76cb-41fe-9761-8eb801b393a9" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 06 06:12:05 crc kubenswrapper[4733]: I1206 06:12:05.879227 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="2de963da-76cb-41fe-9761-8eb801b393a9" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 06 06:12:05 crc kubenswrapper[4733]: I1206 06:12:05.879541 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="2de963da-76cb-41fe-9761-8eb801b393a9" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 06 06:12:05 crc kubenswrapper[4733]: I1206 06:12:05.880578 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-54h4l" Dec 06 06:12:05 crc kubenswrapper[4733]: I1206 06:12:05.884707 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 06 06:12:05 crc kubenswrapper[4733]: I1206 06:12:05.884933 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 06 06:12:05 crc kubenswrapper[4733]: I1206 06:12:05.885048 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7jxr9" Dec 06 06:12:05 crc kubenswrapper[4733]: I1206 06:12:05.885979 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 06 06:12:05 crc kubenswrapper[4733]: I1206 06:12:05.899596 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-54h4l"] Dec 06 06:12:05 crc kubenswrapper[4733]: I1206 06:12:05.971039 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/a3c3e208-9936-4b7d-b7f4-73683f20fc47-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-54h4l\" (UID: \"a3c3e208-9936-4b7d-b7f4-73683f20fc47\") " pod="openstack/ssh-known-hosts-edpm-deployment-54h4l" Dec 06 06:12:05 crc kubenswrapper[4733]: I1206 06:12:05.971599 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nn2q\" (UniqueName: \"kubernetes.io/projected/a3c3e208-9936-4b7d-b7f4-73683f20fc47-kube-api-access-6nn2q\") pod \"ssh-known-hosts-edpm-deployment-54h4l\" (UID: \"a3c3e208-9936-4b7d-b7f4-73683f20fc47\") " pod="openstack/ssh-known-hosts-edpm-deployment-54h4l" Dec 06 06:12:05 crc kubenswrapper[4733]: I1206 06:12:05.971825 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a3c3e208-9936-4b7d-b7f4-73683f20fc47-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-54h4l\" (UID: \"a3c3e208-9936-4b7d-b7f4-73683f20fc47\") " pod="openstack/ssh-known-hosts-edpm-deployment-54h4l" Dec 06 06:12:06 crc kubenswrapper[4733]: I1206 06:12:06.073326 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/a3c3e208-9936-4b7d-b7f4-73683f20fc47-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-54h4l\" (UID: \"a3c3e208-9936-4b7d-b7f4-73683f20fc47\") " pod="openstack/ssh-known-hosts-edpm-deployment-54h4l" Dec 06 06:12:06 crc kubenswrapper[4733]: I1206 06:12:06.073601 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nn2q\" (UniqueName: \"kubernetes.io/projected/a3c3e208-9936-4b7d-b7f4-73683f20fc47-kube-api-access-6nn2q\") pod \"ssh-known-hosts-edpm-deployment-54h4l\" (UID: \"a3c3e208-9936-4b7d-b7f4-73683f20fc47\") " pod="openstack/ssh-known-hosts-edpm-deployment-54h4l" Dec 06 06:12:06 crc kubenswrapper[4733]: I1206 06:12:06.073728 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a3c3e208-9936-4b7d-b7f4-73683f20fc47-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-54h4l\" (UID: \"a3c3e208-9936-4b7d-b7f4-73683f20fc47\") " pod="openstack/ssh-known-hosts-edpm-deployment-54h4l" Dec 06 06:12:06 crc kubenswrapper[4733]: I1206 06:12:06.078995 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a3c3e208-9936-4b7d-b7f4-73683f20fc47-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-54h4l\" (UID: \"a3c3e208-9936-4b7d-b7f4-73683f20fc47\") " pod="openstack/ssh-known-hosts-edpm-deployment-54h4l" Dec 06 06:12:06 crc kubenswrapper[4733]: 
I1206 06:12:06.079156 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/a3c3e208-9936-4b7d-b7f4-73683f20fc47-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-54h4l\" (UID: \"a3c3e208-9936-4b7d-b7f4-73683f20fc47\") " pod="openstack/ssh-known-hosts-edpm-deployment-54h4l" Dec 06 06:12:06 crc kubenswrapper[4733]: I1206 06:12:06.092549 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nn2q\" (UniqueName: \"kubernetes.io/projected/a3c3e208-9936-4b7d-b7f4-73683f20fc47-kube-api-access-6nn2q\") pod \"ssh-known-hosts-edpm-deployment-54h4l\" (UID: \"a3c3e208-9936-4b7d-b7f4-73683f20fc47\") " pod="openstack/ssh-known-hosts-edpm-deployment-54h4l" Dec 06 06:12:06 crc kubenswrapper[4733]: I1206 06:12:06.198484 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-54h4l" Dec 06 06:12:06 crc kubenswrapper[4733]: I1206 06:12:06.690413 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-54h4l"] Dec 06 06:12:06 crc kubenswrapper[4733]: I1206 06:12:06.817363 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-54h4l" event={"ID":"a3c3e208-9936-4b7d-b7f4-73683f20fc47","Type":"ContainerStarted","Data":"6d80f5e058dbec11d327e8e71f6cc487c0c7fecaa181f52a739077d8a01e3de4"} Dec 06 06:12:07 crc kubenswrapper[4733]: I1206 06:12:07.827204 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-54h4l" event={"ID":"a3c3e208-9936-4b7d-b7f4-73683f20fc47","Type":"ContainerStarted","Data":"da90d349cdd16aad2ee69635965e41f100296b77445111815e2164031e367161"} Dec 06 06:12:07 crc kubenswrapper[4733]: I1206 06:12:07.852435 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-54h4l" podStartSLOduration=2.324875999 
podStartE2EDuration="2.852407817s" podCreationTimestamp="2025-12-06 06:12:05 +0000 UTC" firstStartedPulling="2025-12-06 06:12:06.695871435 +0000 UTC m=+1710.561082547" lastFinishedPulling="2025-12-06 06:12:07.223403254 +0000 UTC m=+1711.088614365" observedRunningTime="2025-12-06 06:12:07.850523725 +0000 UTC m=+1711.715734835" watchObservedRunningTime="2025-12-06 06:12:07.852407817 +0000 UTC m=+1711.717618928" Dec 06 06:12:08 crc kubenswrapper[4733]: I1206 06:12:08.485040 4733 scope.go:117] "RemoveContainer" containerID="95a5206d8977bdf896771b7495437ec92f1082c61e752b99e7ba75dda3bd2a35" Dec 06 06:12:08 crc kubenswrapper[4733]: E1206 06:12:08.485407 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:12:12 crc kubenswrapper[4733]: I1206 06:12:12.875606 4733 generic.go:334] "Generic (PLEG): container finished" podID="a3c3e208-9936-4b7d-b7f4-73683f20fc47" containerID="da90d349cdd16aad2ee69635965e41f100296b77445111815e2164031e367161" exitCode=0 Dec 06 06:12:12 crc kubenswrapper[4733]: I1206 06:12:12.875674 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-54h4l" event={"ID":"a3c3e208-9936-4b7d-b7f4-73683f20fc47","Type":"ContainerDied","Data":"da90d349cdd16aad2ee69635965e41f100296b77445111815e2164031e367161"} Dec 06 06:12:14 crc kubenswrapper[4733]: I1206 06:12:14.227791 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-54h4l" Dec 06 06:12:14 crc kubenswrapper[4733]: I1206 06:12:14.240884 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/a3c3e208-9936-4b7d-b7f4-73683f20fc47-inventory-0\") pod \"a3c3e208-9936-4b7d-b7f4-73683f20fc47\" (UID: \"a3c3e208-9936-4b7d-b7f4-73683f20fc47\") " Dec 06 06:12:14 crc kubenswrapper[4733]: I1206 06:12:14.240982 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nn2q\" (UniqueName: \"kubernetes.io/projected/a3c3e208-9936-4b7d-b7f4-73683f20fc47-kube-api-access-6nn2q\") pod \"a3c3e208-9936-4b7d-b7f4-73683f20fc47\" (UID: \"a3c3e208-9936-4b7d-b7f4-73683f20fc47\") " Dec 06 06:12:14 crc kubenswrapper[4733]: I1206 06:12:14.241214 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a3c3e208-9936-4b7d-b7f4-73683f20fc47-ssh-key-openstack-edpm-ipam\") pod \"a3c3e208-9936-4b7d-b7f4-73683f20fc47\" (UID: \"a3c3e208-9936-4b7d-b7f4-73683f20fc47\") " Dec 06 06:12:14 crc kubenswrapper[4733]: I1206 06:12:14.248703 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3c3e208-9936-4b7d-b7f4-73683f20fc47-kube-api-access-6nn2q" (OuterVolumeSpecName: "kube-api-access-6nn2q") pod "a3c3e208-9936-4b7d-b7f4-73683f20fc47" (UID: "a3c3e208-9936-4b7d-b7f4-73683f20fc47"). InnerVolumeSpecName "kube-api-access-6nn2q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:12:14 crc kubenswrapper[4733]: I1206 06:12:14.266610 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3c3e208-9936-4b7d-b7f4-73683f20fc47-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a3c3e208-9936-4b7d-b7f4-73683f20fc47" (UID: "a3c3e208-9936-4b7d-b7f4-73683f20fc47"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:12:14 crc kubenswrapper[4733]: I1206 06:12:14.268469 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3c3e208-9936-4b7d-b7f4-73683f20fc47-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "a3c3e208-9936-4b7d-b7f4-73683f20fc47" (UID: "a3c3e208-9936-4b7d-b7f4-73683f20fc47"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:12:14 crc kubenswrapper[4733]: I1206 06:12:14.345031 4733 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/a3c3e208-9936-4b7d-b7f4-73683f20fc47-inventory-0\") on node \"crc\" DevicePath \"\"" Dec 06 06:12:14 crc kubenswrapper[4733]: I1206 06:12:14.345066 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nn2q\" (UniqueName: \"kubernetes.io/projected/a3c3e208-9936-4b7d-b7f4-73683f20fc47-kube-api-access-6nn2q\") on node \"crc\" DevicePath \"\"" Dec 06 06:12:14 crc kubenswrapper[4733]: I1206 06:12:14.345079 4733 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a3c3e208-9936-4b7d-b7f4-73683f20fc47-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 06 06:12:14 crc kubenswrapper[4733]: I1206 06:12:14.893652 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-54h4l" 
event={"ID":"a3c3e208-9936-4b7d-b7f4-73683f20fc47","Type":"ContainerDied","Data":"6d80f5e058dbec11d327e8e71f6cc487c0c7fecaa181f52a739077d8a01e3de4"} Dec 06 06:12:14 crc kubenswrapper[4733]: I1206 06:12:14.893981 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d80f5e058dbec11d327e8e71f6cc487c0c7fecaa181f52a739077d8a01e3de4" Dec 06 06:12:14 crc kubenswrapper[4733]: I1206 06:12:14.893704 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-54h4l" Dec 06 06:12:14 crc kubenswrapper[4733]: I1206 06:12:14.960477 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-bfgxd"] Dec 06 06:12:14 crc kubenswrapper[4733]: E1206 06:12:14.960949 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3c3e208-9936-4b7d-b7f4-73683f20fc47" containerName="ssh-known-hosts-edpm-deployment" Dec 06 06:12:14 crc kubenswrapper[4733]: I1206 06:12:14.960968 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3c3e208-9936-4b7d-b7f4-73683f20fc47" containerName="ssh-known-hosts-edpm-deployment" Dec 06 06:12:14 crc kubenswrapper[4733]: I1206 06:12:14.961173 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3c3e208-9936-4b7d-b7f4-73683f20fc47" containerName="ssh-known-hosts-edpm-deployment" Dec 06 06:12:14 crc kubenswrapper[4733]: I1206 06:12:14.961811 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bfgxd" Dec 06 06:12:14 crc kubenswrapper[4733]: I1206 06:12:14.966011 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 06 06:12:14 crc kubenswrapper[4733]: I1206 06:12:14.966011 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 06 06:12:14 crc kubenswrapper[4733]: I1206 06:12:14.966717 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 06 06:12:14 crc kubenswrapper[4733]: I1206 06:12:14.971636 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-bfgxd"] Dec 06 06:12:14 crc kubenswrapper[4733]: I1206 06:12:14.972622 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7jxr9" Dec 06 06:12:15 crc kubenswrapper[4733]: I1206 06:12:15.058552 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a5943cb5-9495-43c5-8171-5c6a2df81c31-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bfgxd\" (UID: \"a5943cb5-9495-43c5-8171-5c6a2df81c31\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bfgxd" Dec 06 06:12:15 crc kubenswrapper[4733]: I1206 06:12:15.058622 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a5943cb5-9495-43c5-8171-5c6a2df81c31-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bfgxd\" (UID: \"a5943cb5-9495-43c5-8171-5c6a2df81c31\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bfgxd" Dec 06 06:12:15 crc kubenswrapper[4733]: I1206 06:12:15.058801 4733 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jv6lk\" (UniqueName: \"kubernetes.io/projected/a5943cb5-9495-43c5-8171-5c6a2df81c31-kube-api-access-jv6lk\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bfgxd\" (UID: \"a5943cb5-9495-43c5-8171-5c6a2df81c31\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bfgxd" Dec 06 06:12:15 crc kubenswrapper[4733]: I1206 06:12:15.161321 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a5943cb5-9495-43c5-8171-5c6a2df81c31-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bfgxd\" (UID: \"a5943cb5-9495-43c5-8171-5c6a2df81c31\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bfgxd" Dec 06 06:12:15 crc kubenswrapper[4733]: I1206 06:12:15.161378 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a5943cb5-9495-43c5-8171-5c6a2df81c31-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bfgxd\" (UID: \"a5943cb5-9495-43c5-8171-5c6a2df81c31\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bfgxd" Dec 06 06:12:15 crc kubenswrapper[4733]: I1206 06:12:15.161471 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jv6lk\" (UniqueName: \"kubernetes.io/projected/a5943cb5-9495-43c5-8171-5c6a2df81c31-kube-api-access-jv6lk\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bfgxd\" (UID: \"a5943cb5-9495-43c5-8171-5c6a2df81c31\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bfgxd" Dec 06 06:12:15 crc kubenswrapper[4733]: I1206 06:12:15.166957 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a5943cb5-9495-43c5-8171-5c6a2df81c31-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bfgxd\" (UID: \"a5943cb5-9495-43c5-8171-5c6a2df81c31\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bfgxd" Dec 06 06:12:15 crc kubenswrapper[4733]: I1206 06:12:15.167271 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a5943cb5-9495-43c5-8171-5c6a2df81c31-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bfgxd\" (UID: \"a5943cb5-9495-43c5-8171-5c6a2df81c31\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bfgxd" Dec 06 06:12:15 crc kubenswrapper[4733]: I1206 06:12:15.176982 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jv6lk\" (UniqueName: \"kubernetes.io/projected/a5943cb5-9495-43c5-8171-5c6a2df81c31-kube-api-access-jv6lk\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bfgxd\" (UID: \"a5943cb5-9495-43c5-8171-5c6a2df81c31\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bfgxd" Dec 06 06:12:15 crc kubenswrapper[4733]: I1206 06:12:15.283734 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bfgxd" Dec 06 06:12:15 crc kubenswrapper[4733]: I1206 06:12:15.757428 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-bfgxd"] Dec 06 06:12:15 crc kubenswrapper[4733]: I1206 06:12:15.905244 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bfgxd" event={"ID":"a5943cb5-9495-43c5-8171-5c6a2df81c31","Type":"ContainerStarted","Data":"290de5bca7bd11863f23dff7fd7bc4448bac3a7ddc5d8da824399d536638596b"} Dec 06 06:12:16 crc kubenswrapper[4733]: I1206 06:12:16.917071 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bfgxd" event={"ID":"a5943cb5-9495-43c5-8171-5c6a2df81c31","Type":"ContainerStarted","Data":"d8670c14956f922f9f63b3ed4d2d74a1ba33976ac62eee46136047dc47f6acf2"} Dec 06 06:12:16 crc kubenswrapper[4733]: I1206 06:12:16.939102 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bfgxd" podStartSLOduration=2.3701415040000002 podStartE2EDuration="2.93908244s" podCreationTimestamp="2025-12-06 06:12:14 +0000 UTC" firstStartedPulling="2025-12-06 06:12:15.761161951 +0000 UTC m=+1719.626373063" lastFinishedPulling="2025-12-06 06:12:16.330102889 +0000 UTC m=+1720.195313999" observedRunningTime="2025-12-06 06:12:16.930415464 +0000 UTC m=+1720.795626574" watchObservedRunningTime="2025-12-06 06:12:16.93908244 +0000 UTC m=+1720.804293561" Dec 06 06:12:22 crc kubenswrapper[4733]: I1206 06:12:22.976381 4733 generic.go:334] "Generic (PLEG): container finished" podID="a5943cb5-9495-43c5-8171-5c6a2df81c31" containerID="d8670c14956f922f9f63b3ed4d2d74a1ba33976ac62eee46136047dc47f6acf2" exitCode=0 Dec 06 06:12:22 crc kubenswrapper[4733]: I1206 06:12:22.976481 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bfgxd" event={"ID":"a5943cb5-9495-43c5-8171-5c6a2df81c31","Type":"ContainerDied","Data":"d8670c14956f922f9f63b3ed4d2d74a1ba33976ac62eee46136047dc47f6acf2"} Dec 06 06:12:23 crc kubenswrapper[4733]: I1206 06:12:23.485770 4733 scope.go:117] "RemoveContainer" containerID="95a5206d8977bdf896771b7495437ec92f1082c61e752b99e7ba75dda3bd2a35" Dec 06 06:12:23 crc kubenswrapper[4733]: E1206 06:12:23.486450 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:12:24 crc kubenswrapper[4733]: I1206 06:12:24.323564 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bfgxd" Dec 06 06:12:24 crc kubenswrapper[4733]: I1206 06:12:24.356948 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jv6lk\" (UniqueName: \"kubernetes.io/projected/a5943cb5-9495-43c5-8171-5c6a2df81c31-kube-api-access-jv6lk\") pod \"a5943cb5-9495-43c5-8171-5c6a2df81c31\" (UID: \"a5943cb5-9495-43c5-8171-5c6a2df81c31\") " Dec 06 06:12:24 crc kubenswrapper[4733]: I1206 06:12:24.357209 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a5943cb5-9495-43c5-8171-5c6a2df81c31-inventory\") pod \"a5943cb5-9495-43c5-8171-5c6a2df81c31\" (UID: \"a5943cb5-9495-43c5-8171-5c6a2df81c31\") " Dec 06 06:12:24 crc kubenswrapper[4733]: I1206 06:12:24.357248 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a5943cb5-9495-43c5-8171-5c6a2df81c31-ssh-key\") pod \"a5943cb5-9495-43c5-8171-5c6a2df81c31\" (UID: \"a5943cb5-9495-43c5-8171-5c6a2df81c31\") " Dec 06 06:12:24 crc kubenswrapper[4733]: I1206 06:12:24.362489 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5943cb5-9495-43c5-8171-5c6a2df81c31-kube-api-access-jv6lk" (OuterVolumeSpecName: "kube-api-access-jv6lk") pod "a5943cb5-9495-43c5-8171-5c6a2df81c31" (UID: "a5943cb5-9495-43c5-8171-5c6a2df81c31"). InnerVolumeSpecName "kube-api-access-jv6lk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:12:24 crc kubenswrapper[4733]: I1206 06:12:24.379773 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5943cb5-9495-43c5-8171-5c6a2df81c31-inventory" (OuterVolumeSpecName: "inventory") pod "a5943cb5-9495-43c5-8171-5c6a2df81c31" (UID: "a5943cb5-9495-43c5-8171-5c6a2df81c31"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:12:24 crc kubenswrapper[4733]: I1206 06:12:24.380174 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5943cb5-9495-43c5-8171-5c6a2df81c31-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a5943cb5-9495-43c5-8171-5c6a2df81c31" (UID: "a5943cb5-9495-43c5-8171-5c6a2df81c31"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:12:24 crc kubenswrapper[4733]: I1206 06:12:24.460450 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jv6lk\" (UniqueName: \"kubernetes.io/projected/a5943cb5-9495-43c5-8171-5c6a2df81c31-kube-api-access-jv6lk\") on node \"crc\" DevicePath \"\"" Dec 06 06:12:24 crc kubenswrapper[4733]: I1206 06:12:24.460485 4733 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a5943cb5-9495-43c5-8171-5c6a2df81c31-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 06:12:24 crc kubenswrapper[4733]: I1206 06:12:24.460498 4733 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a5943cb5-9495-43c5-8171-5c6a2df81c31-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 06:12:24 crc kubenswrapper[4733]: I1206 06:12:24.994907 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bfgxd" event={"ID":"a5943cb5-9495-43c5-8171-5c6a2df81c31","Type":"ContainerDied","Data":"290de5bca7bd11863f23dff7fd7bc4448bac3a7ddc5d8da824399d536638596b"} Dec 06 06:12:24 crc kubenswrapper[4733]: I1206 06:12:24.994969 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="290de5bca7bd11863f23dff7fd7bc4448bac3a7ddc5d8da824399d536638596b" Dec 06 06:12:24 crc kubenswrapper[4733]: I1206 06:12:24.994977 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bfgxd" Dec 06 06:12:25 crc kubenswrapper[4733]: I1206 06:12:25.057702 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7qspq"] Dec 06 06:12:25 crc kubenswrapper[4733]: E1206 06:12:25.058208 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5943cb5-9495-43c5-8171-5c6a2df81c31" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 06 06:12:25 crc kubenswrapper[4733]: I1206 06:12:25.058228 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5943cb5-9495-43c5-8171-5c6a2df81c31" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 06 06:12:25 crc kubenswrapper[4733]: I1206 06:12:25.058476 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5943cb5-9495-43c5-8171-5c6a2df81c31" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 06 06:12:25 crc kubenswrapper[4733]: I1206 06:12:25.059553 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7qspq" Dec 06 06:12:25 crc kubenswrapper[4733]: I1206 06:12:25.062898 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7jxr9" Dec 06 06:12:25 crc kubenswrapper[4733]: I1206 06:12:25.063078 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 06 06:12:25 crc kubenswrapper[4733]: I1206 06:12:25.064405 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 06 06:12:25 crc kubenswrapper[4733]: I1206 06:12:25.067193 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 06 06:12:25 crc kubenswrapper[4733]: I1206 06:12:25.072215 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7qspq"] Dec 06 06:12:25 crc kubenswrapper[4733]: I1206 06:12:25.173701 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e2a8649e-0504-47bc-8cea-c95c34f5e416-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-7qspq\" (UID: \"e2a8649e-0504-47bc-8cea-c95c34f5e416\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7qspq" Dec 06 06:12:25 crc kubenswrapper[4733]: I1206 06:12:25.173800 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzfhj\" (UniqueName: \"kubernetes.io/projected/e2a8649e-0504-47bc-8cea-c95c34f5e416-kube-api-access-qzfhj\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-7qspq\" (UID: \"e2a8649e-0504-47bc-8cea-c95c34f5e416\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7qspq" Dec 06 06:12:25 crc kubenswrapper[4733]: I1206 06:12:25.174328 4733 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2a8649e-0504-47bc-8cea-c95c34f5e416-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-7qspq\" (UID: \"e2a8649e-0504-47bc-8cea-c95c34f5e416\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7qspq" Dec 06 06:12:25 crc kubenswrapper[4733]: I1206 06:12:25.276809 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzfhj\" (UniqueName: \"kubernetes.io/projected/e2a8649e-0504-47bc-8cea-c95c34f5e416-kube-api-access-qzfhj\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-7qspq\" (UID: \"e2a8649e-0504-47bc-8cea-c95c34f5e416\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7qspq" Dec 06 06:12:25 crc kubenswrapper[4733]: I1206 06:12:25.277228 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2a8649e-0504-47bc-8cea-c95c34f5e416-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-7qspq\" (UID: \"e2a8649e-0504-47bc-8cea-c95c34f5e416\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7qspq" Dec 06 06:12:25 crc kubenswrapper[4733]: I1206 06:12:25.278547 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e2a8649e-0504-47bc-8cea-c95c34f5e416-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-7qspq\" (UID: \"e2a8649e-0504-47bc-8cea-c95c34f5e416\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7qspq" Dec 06 06:12:25 crc kubenswrapper[4733]: I1206 06:12:25.281889 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2a8649e-0504-47bc-8cea-c95c34f5e416-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-7qspq\" (UID: \"e2a8649e-0504-47bc-8cea-c95c34f5e416\") " 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7qspq" Dec 06 06:12:25 crc kubenswrapper[4733]: I1206 06:12:25.284881 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e2a8649e-0504-47bc-8cea-c95c34f5e416-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-7qspq\" (UID: \"e2a8649e-0504-47bc-8cea-c95c34f5e416\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7qspq" Dec 06 06:12:25 crc kubenswrapper[4733]: I1206 06:12:25.291754 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzfhj\" (UniqueName: \"kubernetes.io/projected/e2a8649e-0504-47bc-8cea-c95c34f5e416-kube-api-access-qzfhj\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-7qspq\" (UID: \"e2a8649e-0504-47bc-8cea-c95c34f5e416\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7qspq" Dec 06 06:12:25 crc kubenswrapper[4733]: I1206 06:12:25.374225 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7qspq" Dec 06 06:12:25 crc kubenswrapper[4733]: I1206 06:12:25.844472 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7qspq"] Dec 06 06:12:26 crc kubenswrapper[4733]: I1206 06:12:26.006512 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7qspq" event={"ID":"e2a8649e-0504-47bc-8cea-c95c34f5e416","Type":"ContainerStarted","Data":"97dee28e979a8ee9c26e9bc26a653a312dc8a85326f3c06141c94c44daf12050"} Dec 06 06:12:27 crc kubenswrapper[4733]: I1206 06:12:27.019882 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7qspq" event={"ID":"e2a8649e-0504-47bc-8cea-c95c34f5e416","Type":"ContainerStarted","Data":"16db7efa79481491734140dc7991407b933c09624cbe152f4dc3e9e717d505c6"} Dec 06 06:12:27 crc kubenswrapper[4733]: I1206 06:12:27.036208 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7qspq" podStartSLOduration=1.548430847 podStartE2EDuration="2.036189022s" podCreationTimestamp="2025-12-06 06:12:25 +0000 UTC" firstStartedPulling="2025-12-06 06:12:25.846090997 +0000 UTC m=+1729.711302107" lastFinishedPulling="2025-12-06 06:12:26.333849171 +0000 UTC m=+1730.199060282" observedRunningTime="2025-12-06 06:12:27.031799599 +0000 UTC m=+1730.897010710" watchObservedRunningTime="2025-12-06 06:12:27.036189022 +0000 UTC m=+1730.901400134" Dec 06 06:12:28 crc kubenswrapper[4733]: I1206 06:12:28.044384 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-q8qbf"] Dec 06 06:12:28 crc kubenswrapper[4733]: I1206 06:12:28.054942 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-q8qbf"] Dec 06 06:12:28 crc kubenswrapper[4733]: I1206 06:12:28.495220 
4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42bf4ec1-7429-4efc-b83b-9d8b09ef3fc0" path="/var/lib/kubelet/pods/42bf4ec1-7429-4efc-b83b-9d8b09ef3fc0/volumes" Dec 06 06:12:34 crc kubenswrapper[4733]: I1206 06:12:34.097345 4733 generic.go:334] "Generic (PLEG): container finished" podID="e2a8649e-0504-47bc-8cea-c95c34f5e416" containerID="16db7efa79481491734140dc7991407b933c09624cbe152f4dc3e9e717d505c6" exitCode=0 Dec 06 06:12:34 crc kubenswrapper[4733]: I1206 06:12:34.097430 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7qspq" event={"ID":"e2a8649e-0504-47bc-8cea-c95c34f5e416","Type":"ContainerDied","Data":"16db7efa79481491734140dc7991407b933c09624cbe152f4dc3e9e717d505c6"} Dec 06 06:12:35 crc kubenswrapper[4733]: I1206 06:12:35.487093 4733 scope.go:117] "RemoveContainer" containerID="95a5206d8977bdf896771b7495437ec92f1082c61e752b99e7ba75dda3bd2a35" Dec 06 06:12:35 crc kubenswrapper[4733]: E1206 06:12:35.487584 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:12:35 crc kubenswrapper[4733]: I1206 06:12:35.559405 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7qspq" Dec 06 06:12:35 crc kubenswrapper[4733]: I1206 06:12:35.710348 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e2a8649e-0504-47bc-8cea-c95c34f5e416-ssh-key\") pod \"e2a8649e-0504-47bc-8cea-c95c34f5e416\" (UID: \"e2a8649e-0504-47bc-8cea-c95c34f5e416\") " Dec 06 06:12:35 crc kubenswrapper[4733]: I1206 06:12:35.710455 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2a8649e-0504-47bc-8cea-c95c34f5e416-inventory\") pod \"e2a8649e-0504-47bc-8cea-c95c34f5e416\" (UID: \"e2a8649e-0504-47bc-8cea-c95c34f5e416\") " Dec 06 06:12:35 crc kubenswrapper[4733]: I1206 06:12:35.710497 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzfhj\" (UniqueName: \"kubernetes.io/projected/e2a8649e-0504-47bc-8cea-c95c34f5e416-kube-api-access-qzfhj\") pod \"e2a8649e-0504-47bc-8cea-c95c34f5e416\" (UID: \"e2a8649e-0504-47bc-8cea-c95c34f5e416\") " Dec 06 06:12:35 crc kubenswrapper[4733]: I1206 06:12:35.717763 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2a8649e-0504-47bc-8cea-c95c34f5e416-kube-api-access-qzfhj" (OuterVolumeSpecName: "kube-api-access-qzfhj") pod "e2a8649e-0504-47bc-8cea-c95c34f5e416" (UID: "e2a8649e-0504-47bc-8cea-c95c34f5e416"). InnerVolumeSpecName "kube-api-access-qzfhj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:12:35 crc kubenswrapper[4733]: I1206 06:12:35.740848 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2a8649e-0504-47bc-8cea-c95c34f5e416-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e2a8649e-0504-47bc-8cea-c95c34f5e416" (UID: "e2a8649e-0504-47bc-8cea-c95c34f5e416"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:12:35 crc kubenswrapper[4733]: I1206 06:12:35.741319 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2a8649e-0504-47bc-8cea-c95c34f5e416-inventory" (OuterVolumeSpecName: "inventory") pod "e2a8649e-0504-47bc-8cea-c95c34f5e416" (UID: "e2a8649e-0504-47bc-8cea-c95c34f5e416"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:12:35 crc kubenswrapper[4733]: I1206 06:12:35.813455 4733 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e2a8649e-0504-47bc-8cea-c95c34f5e416-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 06:12:35 crc kubenswrapper[4733]: I1206 06:12:35.813493 4733 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2a8649e-0504-47bc-8cea-c95c34f5e416-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 06:12:35 crc kubenswrapper[4733]: I1206 06:12:35.813507 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzfhj\" (UniqueName: \"kubernetes.io/projected/e2a8649e-0504-47bc-8cea-c95c34f5e416-kube-api-access-qzfhj\") on node \"crc\" DevicePath \"\"" Dec 06 06:12:36 crc kubenswrapper[4733]: I1206 06:12:36.123668 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7qspq" event={"ID":"e2a8649e-0504-47bc-8cea-c95c34f5e416","Type":"ContainerDied","Data":"97dee28e979a8ee9c26e9bc26a653a312dc8a85326f3c06141c94c44daf12050"} Dec 06 06:12:36 crc kubenswrapper[4733]: I1206 06:12:36.123733 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97dee28e979a8ee9c26e9bc26a653a312dc8a85326f3c06141c94c44daf12050" Dec 06 06:12:36 crc kubenswrapper[4733]: I1206 06:12:36.123785 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7qspq" Dec 06 06:12:36 crc kubenswrapper[4733]: I1206 06:12:36.184151 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd"] Dec 06 06:12:36 crc kubenswrapper[4733]: E1206 06:12:36.184597 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2a8649e-0504-47bc-8cea-c95c34f5e416" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 06 06:12:36 crc kubenswrapper[4733]: I1206 06:12:36.184616 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2a8649e-0504-47bc-8cea-c95c34f5e416" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 06 06:12:36 crc kubenswrapper[4733]: I1206 06:12:36.184805 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2a8649e-0504-47bc-8cea-c95c34f5e416" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 06 06:12:36 crc kubenswrapper[4733]: I1206 06:12:36.185446 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd" Dec 06 06:12:36 crc kubenswrapper[4733]: I1206 06:12:36.187842 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 06 06:12:36 crc kubenswrapper[4733]: I1206 06:12:36.187976 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Dec 06 06:12:36 crc kubenswrapper[4733]: I1206 06:12:36.188179 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7jxr9" Dec 06 06:12:36 crc kubenswrapper[4733]: I1206 06:12:36.188445 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Dec 06 06:12:36 crc kubenswrapper[4733]: I1206 06:12:36.188535 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Dec 06 06:12:36 crc kubenswrapper[4733]: I1206 06:12:36.188620 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Dec 06 06:12:36 crc kubenswrapper[4733]: I1206 06:12:36.190792 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 06 06:12:36 crc kubenswrapper[4733]: I1206 06:12:36.191113 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 06 06:12:36 crc kubenswrapper[4733]: I1206 06:12:36.195359 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd"] Dec 06 06:12:36 crc kubenswrapper[4733]: I1206 06:12:36.325390 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/10ed3cae-fa08-4e62-af5f-e45711123cb3-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd\" (UID: \"10ed3cae-fa08-4e62-af5f-e45711123cb3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd" Dec 06 06:12:36 crc kubenswrapper[4733]: I1206 06:12:36.325522 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10ed3cae-fa08-4e62-af5f-e45711123cb3-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd\" (UID: \"10ed3cae-fa08-4e62-af5f-e45711123cb3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd" Dec 06 06:12:36 crc kubenswrapper[4733]: I1206 06:12:36.325724 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10ed3cae-fa08-4e62-af5f-e45711123cb3-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd\" (UID: \"10ed3cae-fa08-4e62-af5f-e45711123cb3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd" Dec 06 06:12:36 crc kubenswrapper[4733]: I1206 06:12:36.325833 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/10ed3cae-fa08-4e62-af5f-e45711123cb3-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd\" (UID: \"10ed3cae-fa08-4e62-af5f-e45711123cb3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd" Dec 06 06:12:36 crc kubenswrapper[4733]: I1206 06:12:36.326157 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/10ed3cae-fa08-4e62-af5f-e45711123cb3-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd\" (UID: \"10ed3cae-fa08-4e62-af5f-e45711123cb3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd" Dec 06 06:12:36 crc kubenswrapper[4733]: I1206 06:12:36.326213 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10ed3cae-fa08-4e62-af5f-e45711123cb3-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd\" (UID: \"10ed3cae-fa08-4e62-af5f-e45711123cb3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd" Dec 06 06:12:36 crc kubenswrapper[4733]: I1206 06:12:36.326499 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/10ed3cae-fa08-4e62-af5f-e45711123cb3-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd\" (UID: \"10ed3cae-fa08-4e62-af5f-e45711123cb3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd" Dec 06 06:12:36 crc kubenswrapper[4733]: I1206 06:12:36.326757 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10ed3cae-fa08-4e62-af5f-e45711123cb3-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd\" (UID: \"10ed3cae-fa08-4e62-af5f-e45711123cb3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd" Dec 06 06:12:36 crc kubenswrapper[4733]: I1206 06:12:36.326849 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/10ed3cae-fa08-4e62-af5f-e45711123cb3-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd\" (UID: \"10ed3cae-fa08-4e62-af5f-e45711123cb3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd" Dec 06 06:12:36 crc kubenswrapper[4733]: I1206 06:12:36.326894 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcjk9\" (UniqueName: \"kubernetes.io/projected/10ed3cae-fa08-4e62-af5f-e45711123cb3-kube-api-access-rcjk9\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd\" (UID: \"10ed3cae-fa08-4e62-af5f-e45711123cb3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd" Dec 06 06:12:36 crc kubenswrapper[4733]: I1206 06:12:36.326954 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/10ed3cae-fa08-4e62-af5f-e45711123cb3-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd\" (UID: \"10ed3cae-fa08-4e62-af5f-e45711123cb3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd" Dec 06 06:12:36 crc kubenswrapper[4733]: I1206 06:12:36.327191 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10ed3cae-fa08-4e62-af5f-e45711123cb3-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd\" (UID: \"10ed3cae-fa08-4e62-af5f-e45711123cb3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd" Dec 06 06:12:36 crc kubenswrapper[4733]: I1206 06:12:36.327426 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/10ed3cae-fa08-4e62-af5f-e45711123cb3-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd\" (UID: \"10ed3cae-fa08-4e62-af5f-e45711123cb3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd" Dec 06 06:12:36 crc kubenswrapper[4733]: I1206 06:12:36.327610 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10ed3cae-fa08-4e62-af5f-e45711123cb3-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd\" (UID: \"10ed3cae-fa08-4e62-af5f-e45711123cb3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd" Dec 06 06:12:36 crc kubenswrapper[4733]: I1206 06:12:36.429595 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10ed3cae-fa08-4e62-af5f-e45711123cb3-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd\" (UID: \"10ed3cae-fa08-4e62-af5f-e45711123cb3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd" Dec 06 06:12:36 crc kubenswrapper[4733]: I1206 06:12:36.429667 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/10ed3cae-fa08-4e62-af5f-e45711123cb3-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd\" (UID: \"10ed3cae-fa08-4e62-af5f-e45711123cb3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd" Dec 06 06:12:36 crc kubenswrapper[4733]: I1206 06:12:36.429695 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcjk9\" (UniqueName: \"kubernetes.io/projected/10ed3cae-fa08-4e62-af5f-e45711123cb3-kube-api-access-rcjk9\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd\" (UID: \"10ed3cae-fa08-4e62-af5f-e45711123cb3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd" Dec 06 06:12:36 crc kubenswrapper[4733]: I1206 06:12:36.429730 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/10ed3cae-fa08-4e62-af5f-e45711123cb3-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd\" (UID: \"10ed3cae-fa08-4e62-af5f-e45711123cb3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd" Dec 06 06:12:36 crc kubenswrapper[4733]: I1206 06:12:36.429767 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10ed3cae-fa08-4e62-af5f-e45711123cb3-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd\" (UID: \"10ed3cae-fa08-4e62-af5f-e45711123cb3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd" Dec 06 06:12:36 crc kubenswrapper[4733]: I1206 06:12:36.429805 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/10ed3cae-fa08-4e62-af5f-e45711123cb3-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd\" (UID: \"10ed3cae-fa08-4e62-af5f-e45711123cb3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd" Dec 06 06:12:36 crc kubenswrapper[4733]: I1206 06:12:36.429843 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10ed3cae-fa08-4e62-af5f-e45711123cb3-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd\" (UID: \"10ed3cae-fa08-4e62-af5f-e45711123cb3\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd" Dec 06 06:12:36 crc kubenswrapper[4733]: I1206 06:12:36.429881 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10ed3cae-fa08-4e62-af5f-e45711123cb3-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd\" (UID: \"10ed3cae-fa08-4e62-af5f-e45711123cb3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd" Dec 06 06:12:36 crc kubenswrapper[4733]: I1206 06:12:36.429916 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10ed3cae-fa08-4e62-af5f-e45711123cb3-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd\" (UID: \"10ed3cae-fa08-4e62-af5f-e45711123cb3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd" Dec 06 06:12:36 crc kubenswrapper[4733]: I1206 06:12:36.429955 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10ed3cae-fa08-4e62-af5f-e45711123cb3-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd\" (UID: \"10ed3cae-fa08-4e62-af5f-e45711123cb3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd" Dec 06 06:12:36 crc kubenswrapper[4733]: I1206 06:12:36.429981 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/10ed3cae-fa08-4e62-af5f-e45711123cb3-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd\" (UID: \"10ed3cae-fa08-4e62-af5f-e45711123cb3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd" Dec 06 06:12:36 crc kubenswrapper[4733]: I1206 
06:12:36.430889 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10ed3cae-fa08-4e62-af5f-e45711123cb3-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd\" (UID: \"10ed3cae-fa08-4e62-af5f-e45711123cb3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd" Dec 06 06:12:36 crc kubenswrapper[4733]: I1206 06:12:36.430924 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10ed3cae-fa08-4e62-af5f-e45711123cb3-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd\" (UID: \"10ed3cae-fa08-4e62-af5f-e45711123cb3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd" Dec 06 06:12:36 crc kubenswrapper[4733]: I1206 06:12:36.430970 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/10ed3cae-fa08-4e62-af5f-e45711123cb3-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd\" (UID: \"10ed3cae-fa08-4e62-af5f-e45711123cb3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd" Dec 06 06:12:36 crc kubenswrapper[4733]: I1206 06:12:36.437078 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/10ed3cae-fa08-4e62-af5f-e45711123cb3-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd\" (UID: \"10ed3cae-fa08-4e62-af5f-e45711123cb3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd" Dec 06 06:12:36 crc kubenswrapper[4733]: I1206 06:12:36.437202 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/10ed3cae-fa08-4e62-af5f-e45711123cb3-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd\" (UID: \"10ed3cae-fa08-4e62-af5f-e45711123cb3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd" Dec 06 06:12:36 crc kubenswrapper[4733]: I1206 06:12:36.437403 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10ed3cae-fa08-4e62-af5f-e45711123cb3-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd\" (UID: \"10ed3cae-fa08-4e62-af5f-e45711123cb3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd" Dec 06 06:12:36 crc kubenswrapper[4733]: I1206 06:12:36.438205 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10ed3cae-fa08-4e62-af5f-e45711123cb3-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd\" (UID: \"10ed3cae-fa08-4e62-af5f-e45711123cb3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd" Dec 06 06:12:36 crc kubenswrapper[4733]: I1206 06:12:36.438807 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/10ed3cae-fa08-4e62-af5f-e45711123cb3-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd\" (UID: \"10ed3cae-fa08-4e62-af5f-e45711123cb3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd" Dec 06 06:12:36 crc kubenswrapper[4733]: I1206 06:12:36.439264 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/10ed3cae-fa08-4e62-af5f-e45711123cb3-openstack-edpm-ipam-libvirt-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd\" (UID: \"10ed3cae-fa08-4e62-af5f-e45711123cb3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd" Dec 06 06:12:36 crc kubenswrapper[4733]: I1206 06:12:36.439957 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10ed3cae-fa08-4e62-af5f-e45711123cb3-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd\" (UID: \"10ed3cae-fa08-4e62-af5f-e45711123cb3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd" Dec 06 06:12:36 crc kubenswrapper[4733]: I1206 06:12:36.440450 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10ed3cae-fa08-4e62-af5f-e45711123cb3-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd\" (UID: \"10ed3cae-fa08-4e62-af5f-e45711123cb3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd" Dec 06 06:12:36 crc kubenswrapper[4733]: I1206 06:12:36.440680 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10ed3cae-fa08-4e62-af5f-e45711123cb3-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd\" (UID: \"10ed3cae-fa08-4e62-af5f-e45711123cb3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd" Dec 06 06:12:36 crc kubenswrapper[4733]: I1206 06:12:36.441037 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10ed3cae-fa08-4e62-af5f-e45711123cb3-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd\" (UID: \"10ed3cae-fa08-4e62-af5f-e45711123cb3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd" Dec 
06 06:12:36 crc kubenswrapper[4733]: I1206 06:12:36.441119 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10ed3cae-fa08-4e62-af5f-e45711123cb3-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd\" (UID: \"10ed3cae-fa08-4e62-af5f-e45711123cb3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd" Dec 06 06:12:36 crc kubenswrapper[4733]: I1206 06:12:36.441600 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/10ed3cae-fa08-4e62-af5f-e45711123cb3-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd\" (UID: \"10ed3cae-fa08-4e62-af5f-e45711123cb3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd" Dec 06 06:12:36 crc kubenswrapper[4733]: I1206 06:12:36.441744 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/10ed3cae-fa08-4e62-af5f-e45711123cb3-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd\" (UID: \"10ed3cae-fa08-4e62-af5f-e45711123cb3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd" Dec 06 06:12:36 crc kubenswrapper[4733]: I1206 06:12:36.446247 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcjk9\" (UniqueName: \"kubernetes.io/projected/10ed3cae-fa08-4e62-af5f-e45711123cb3-kube-api-access-rcjk9\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd\" (UID: \"10ed3cae-fa08-4e62-af5f-e45711123cb3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd" Dec 06 06:12:36 crc kubenswrapper[4733]: I1206 06:12:36.511865 4733 reflector.go:368] Caches 
populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7jxr9" Dec 06 06:12:36 crc kubenswrapper[4733]: I1206 06:12:36.518642 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd" Dec 06 06:12:36 crc kubenswrapper[4733]: I1206 06:12:36.986502 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd"] Dec 06 06:12:37 crc kubenswrapper[4733]: I1206 06:12:37.134766 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd" event={"ID":"10ed3cae-fa08-4e62-af5f-e45711123cb3","Type":"ContainerStarted","Data":"13c2002b41c91d46414a0f3daae45ac26123b76e3c12b47f7b945fb263f4b070"} Dec 06 06:12:37 crc kubenswrapper[4733]: I1206 06:12:37.476821 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 06 06:12:38 crc kubenswrapper[4733]: I1206 06:12:38.146743 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd" event={"ID":"10ed3cae-fa08-4e62-af5f-e45711123cb3","Type":"ContainerStarted","Data":"84bf1c8e005215af43c0da033cc2d984d046b68f06a5bed16ee08d9889d75c3d"} Dec 06 06:12:49 crc kubenswrapper[4733]: I1206 06:12:49.485989 4733 scope.go:117] "RemoveContainer" containerID="95a5206d8977bdf896771b7495437ec92f1082c61e752b99e7ba75dda3bd2a35" Dec 06 06:12:49 crc kubenswrapper[4733]: E1206 06:12:49.487278 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" 
podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:12:50 crc kubenswrapper[4733]: I1206 06:12:50.143532 4733 scope.go:117] "RemoveContainer" containerID="5754252d43db7758a87de8fa86da9214349415b1585fde5e89a95f9e0be4d7a3" Dec 06 06:13:00 crc kubenswrapper[4733]: I1206 06:13:00.485933 4733 scope.go:117] "RemoveContainer" containerID="95a5206d8977bdf896771b7495437ec92f1082c61e752b99e7ba75dda3bd2a35" Dec 06 06:13:00 crc kubenswrapper[4733]: E1206 06:13:00.486871 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:13:04 crc kubenswrapper[4733]: I1206 06:13:04.388957 4733 generic.go:334] "Generic (PLEG): container finished" podID="10ed3cae-fa08-4e62-af5f-e45711123cb3" containerID="84bf1c8e005215af43c0da033cc2d984d046b68f06a5bed16ee08d9889d75c3d" exitCode=0 Dec 06 06:13:04 crc kubenswrapper[4733]: I1206 06:13:04.389075 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd" event={"ID":"10ed3cae-fa08-4e62-af5f-e45711123cb3","Type":"ContainerDied","Data":"84bf1c8e005215af43c0da033cc2d984d046b68f06a5bed16ee08d9889d75c3d"} Dec 06 06:13:05 crc kubenswrapper[4733]: I1206 06:13:05.763644 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd" Dec 06 06:13:05 crc kubenswrapper[4733]: I1206 06:13:05.852811 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/10ed3cae-fa08-4e62-af5f-e45711123cb3-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"10ed3cae-fa08-4e62-af5f-e45711123cb3\" (UID: \"10ed3cae-fa08-4e62-af5f-e45711123cb3\") " Dec 06 06:13:05 crc kubenswrapper[4733]: I1206 06:13:05.852871 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10ed3cae-fa08-4e62-af5f-e45711123cb3-telemetry-combined-ca-bundle\") pod \"10ed3cae-fa08-4e62-af5f-e45711123cb3\" (UID: \"10ed3cae-fa08-4e62-af5f-e45711123cb3\") " Dec 06 06:13:05 crc kubenswrapper[4733]: I1206 06:13:05.852897 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10ed3cae-fa08-4e62-af5f-e45711123cb3-inventory\") pod \"10ed3cae-fa08-4e62-af5f-e45711123cb3\" (UID: \"10ed3cae-fa08-4e62-af5f-e45711123cb3\") " Dec 06 06:13:05 crc kubenswrapper[4733]: I1206 06:13:05.852914 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10ed3cae-fa08-4e62-af5f-e45711123cb3-ovn-combined-ca-bundle\") pod \"10ed3cae-fa08-4e62-af5f-e45711123cb3\" (UID: \"10ed3cae-fa08-4e62-af5f-e45711123cb3\") " Dec 06 06:13:05 crc kubenswrapper[4733]: I1206 06:13:05.852933 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/10ed3cae-fa08-4e62-af5f-e45711123cb3-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"10ed3cae-fa08-4e62-af5f-e45711123cb3\" (UID: 
\"10ed3cae-fa08-4e62-af5f-e45711123cb3\") " Dec 06 06:13:05 crc kubenswrapper[4733]: I1206 06:13:05.852975 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10ed3cae-fa08-4e62-af5f-e45711123cb3-neutron-metadata-combined-ca-bundle\") pod \"10ed3cae-fa08-4e62-af5f-e45711123cb3\" (UID: \"10ed3cae-fa08-4e62-af5f-e45711123cb3\") " Dec 06 06:13:05 crc kubenswrapper[4733]: I1206 06:13:05.852997 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10ed3cae-fa08-4e62-af5f-e45711123cb3-nova-combined-ca-bundle\") pod \"10ed3cae-fa08-4e62-af5f-e45711123cb3\" (UID: \"10ed3cae-fa08-4e62-af5f-e45711123cb3\") " Dec 06 06:13:05 crc kubenswrapper[4733]: I1206 06:13:05.853016 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/10ed3cae-fa08-4e62-af5f-e45711123cb3-openstack-edpm-ipam-ovn-default-certs-0\") pod \"10ed3cae-fa08-4e62-af5f-e45711123cb3\" (UID: \"10ed3cae-fa08-4e62-af5f-e45711123cb3\") " Dec 06 06:13:05 crc kubenswrapper[4733]: I1206 06:13:05.853036 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10ed3cae-fa08-4e62-af5f-e45711123cb3-repo-setup-combined-ca-bundle\") pod \"10ed3cae-fa08-4e62-af5f-e45711123cb3\" (UID: \"10ed3cae-fa08-4e62-af5f-e45711123cb3\") " Dec 06 06:13:05 crc kubenswrapper[4733]: I1206 06:13:05.853056 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10ed3cae-fa08-4e62-af5f-e45711123cb3-bootstrap-combined-ca-bundle\") pod \"10ed3cae-fa08-4e62-af5f-e45711123cb3\" (UID: \"10ed3cae-fa08-4e62-af5f-e45711123cb3\") " Dec 06 06:13:05 crc 
kubenswrapper[4733]: I1206 06:13:05.853077 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/10ed3cae-fa08-4e62-af5f-e45711123cb3-ssh-key\") pod \"10ed3cae-fa08-4e62-af5f-e45711123cb3\" (UID: \"10ed3cae-fa08-4e62-af5f-e45711123cb3\") " Dec 06 06:13:05 crc kubenswrapper[4733]: I1206 06:13:05.853114 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/10ed3cae-fa08-4e62-af5f-e45711123cb3-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"10ed3cae-fa08-4e62-af5f-e45711123cb3\" (UID: \"10ed3cae-fa08-4e62-af5f-e45711123cb3\") " Dec 06 06:13:05 crc kubenswrapper[4733]: I1206 06:13:05.853136 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcjk9\" (UniqueName: \"kubernetes.io/projected/10ed3cae-fa08-4e62-af5f-e45711123cb3-kube-api-access-rcjk9\") pod \"10ed3cae-fa08-4e62-af5f-e45711123cb3\" (UID: \"10ed3cae-fa08-4e62-af5f-e45711123cb3\") " Dec 06 06:13:05 crc kubenswrapper[4733]: I1206 06:13:05.853150 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10ed3cae-fa08-4e62-af5f-e45711123cb3-libvirt-combined-ca-bundle\") pod \"10ed3cae-fa08-4e62-af5f-e45711123cb3\" (UID: \"10ed3cae-fa08-4e62-af5f-e45711123cb3\") " Dec 06 06:13:05 crc kubenswrapper[4733]: I1206 06:13:05.858876 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10ed3cae-fa08-4e62-af5f-e45711123cb3-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "10ed3cae-fa08-4e62-af5f-e45711123cb3" (UID: "10ed3cae-fa08-4e62-af5f-e45711123cb3"). InnerVolumeSpecName "libvirt-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:13:05 crc kubenswrapper[4733]: I1206 06:13:05.861436 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10ed3cae-fa08-4e62-af5f-e45711123cb3-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "10ed3cae-fa08-4e62-af5f-e45711123cb3" (UID: "10ed3cae-fa08-4e62-af5f-e45711123cb3"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:13:05 crc kubenswrapper[4733]: I1206 06:13:05.861490 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10ed3cae-fa08-4e62-af5f-e45711123cb3-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "10ed3cae-fa08-4e62-af5f-e45711123cb3" (UID: "10ed3cae-fa08-4e62-af5f-e45711123cb3"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:13:05 crc kubenswrapper[4733]: I1206 06:13:05.861641 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10ed3cae-fa08-4e62-af5f-e45711123cb3-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "10ed3cae-fa08-4e62-af5f-e45711123cb3" (UID: "10ed3cae-fa08-4e62-af5f-e45711123cb3"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:13:05 crc kubenswrapper[4733]: I1206 06:13:05.861756 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10ed3cae-fa08-4e62-af5f-e45711123cb3-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "10ed3cae-fa08-4e62-af5f-e45711123cb3" (UID: "10ed3cae-fa08-4e62-af5f-e45711123cb3"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:13:05 crc kubenswrapper[4733]: I1206 06:13:05.861819 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10ed3cae-fa08-4e62-af5f-e45711123cb3-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "10ed3cae-fa08-4e62-af5f-e45711123cb3" (UID: "10ed3cae-fa08-4e62-af5f-e45711123cb3"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:13:05 crc kubenswrapper[4733]: I1206 06:13:05.861863 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10ed3cae-fa08-4e62-af5f-e45711123cb3-kube-api-access-rcjk9" (OuterVolumeSpecName: "kube-api-access-rcjk9") pod "10ed3cae-fa08-4e62-af5f-e45711123cb3" (UID: "10ed3cae-fa08-4e62-af5f-e45711123cb3"). InnerVolumeSpecName "kube-api-access-rcjk9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:13:05 crc kubenswrapper[4733]: I1206 06:13:05.861924 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10ed3cae-fa08-4e62-af5f-e45711123cb3-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "10ed3cae-fa08-4e62-af5f-e45711123cb3" (UID: "10ed3cae-fa08-4e62-af5f-e45711123cb3"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:13:05 crc kubenswrapper[4733]: I1206 06:13:05.862075 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10ed3cae-fa08-4e62-af5f-e45711123cb3-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "10ed3cae-fa08-4e62-af5f-e45711123cb3" (UID: "10ed3cae-fa08-4e62-af5f-e45711123cb3"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:13:05 crc kubenswrapper[4733]: I1206 06:13:05.861800 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10ed3cae-fa08-4e62-af5f-e45711123cb3-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "10ed3cae-fa08-4e62-af5f-e45711123cb3" (UID: "10ed3cae-fa08-4e62-af5f-e45711123cb3"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:13:05 crc kubenswrapper[4733]: I1206 06:13:05.862510 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10ed3cae-fa08-4e62-af5f-e45711123cb3-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "10ed3cae-fa08-4e62-af5f-e45711123cb3" (UID: "10ed3cae-fa08-4e62-af5f-e45711123cb3"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:13:05 crc kubenswrapper[4733]: I1206 06:13:05.863569 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10ed3cae-fa08-4e62-af5f-e45711123cb3-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "10ed3cae-fa08-4e62-af5f-e45711123cb3" (UID: "10ed3cae-fa08-4e62-af5f-e45711123cb3"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:13:05 crc kubenswrapper[4733]: I1206 06:13:05.878563 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10ed3cae-fa08-4e62-af5f-e45711123cb3-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "10ed3cae-fa08-4e62-af5f-e45711123cb3" (UID: "10ed3cae-fa08-4e62-af5f-e45711123cb3"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:13:05 crc kubenswrapper[4733]: I1206 06:13:05.884455 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10ed3cae-fa08-4e62-af5f-e45711123cb3-inventory" (OuterVolumeSpecName: "inventory") pod "10ed3cae-fa08-4e62-af5f-e45711123cb3" (UID: "10ed3cae-fa08-4e62-af5f-e45711123cb3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:13:05 crc kubenswrapper[4733]: I1206 06:13:05.956002 4733 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10ed3cae-fa08-4e62-af5f-e45711123cb3-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 06:13:05 crc kubenswrapper[4733]: I1206 06:13:05.956292 4733 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10ed3cae-fa08-4e62-af5f-e45711123cb3-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 06:13:05 crc kubenswrapper[4733]: I1206 06:13:05.956339 4733 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/10ed3cae-fa08-4e62-af5f-e45711123cb3-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 06 06:13:05 crc kubenswrapper[4733]: I1206 06:13:05.956351 4733 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10ed3cae-fa08-4e62-af5f-e45711123cb3-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 06:13:05 crc kubenswrapper[4733]: I1206 06:13:05.956362 4733 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10ed3cae-fa08-4e62-af5f-e45711123cb3-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 06:13:05 crc 
kubenswrapper[4733]: I1206 06:13:05.956374 4733 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/10ed3cae-fa08-4e62-af5f-e45711123cb3-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 06:13:05 crc kubenswrapper[4733]: I1206 06:13:05.956384 4733 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/10ed3cae-fa08-4e62-af5f-e45711123cb3-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 06 06:13:05 crc kubenswrapper[4733]: I1206 06:13:05.956396 4733 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10ed3cae-fa08-4e62-af5f-e45711123cb3-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 06:13:05 crc kubenswrapper[4733]: I1206 06:13:05.956407 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rcjk9\" (UniqueName: \"kubernetes.io/projected/10ed3cae-fa08-4e62-af5f-e45711123cb3-kube-api-access-rcjk9\") on node \"crc\" DevicePath \"\"" Dec 06 06:13:05 crc kubenswrapper[4733]: I1206 06:13:05.956418 4733 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/10ed3cae-fa08-4e62-af5f-e45711123cb3-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 06 06:13:05 crc kubenswrapper[4733]: I1206 06:13:05.956429 4733 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10ed3cae-fa08-4e62-af5f-e45711123cb3-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 06:13:05 crc kubenswrapper[4733]: I1206 06:13:05.956440 4733 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10ed3cae-fa08-4e62-af5f-e45711123cb3-inventory\") on node \"crc\" DevicePath 
\"\"" Dec 06 06:13:05 crc kubenswrapper[4733]: I1206 06:13:05.956450 4733 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10ed3cae-fa08-4e62-af5f-e45711123cb3-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 06:13:05 crc kubenswrapper[4733]: I1206 06:13:05.956460 4733 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/10ed3cae-fa08-4e62-af5f-e45711123cb3-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 06 06:13:06 crc kubenswrapper[4733]: I1206 06:13:06.411687 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd" event={"ID":"10ed3cae-fa08-4e62-af5f-e45711123cb3","Type":"ContainerDied","Data":"13c2002b41c91d46414a0f3daae45ac26123b76e3c12b47f7b945fb263f4b070"} Dec 06 06:13:06 crc kubenswrapper[4733]: I1206 06:13:06.411758 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="13c2002b41c91d46414a0f3daae45ac26123b76e3c12b47f7b945fb263f4b070" Dec 06 06:13:06 crc kubenswrapper[4733]: I1206 06:13:06.411799 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd" Dec 06 06:13:06 crc kubenswrapper[4733]: I1206 06:13:06.494950 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-2z74q"] Dec 06 06:13:06 crc kubenswrapper[4733]: E1206 06:13:06.495260 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10ed3cae-fa08-4e62-af5f-e45711123cb3" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 06 06:13:06 crc kubenswrapper[4733]: I1206 06:13:06.495281 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="10ed3cae-fa08-4e62-af5f-e45711123cb3" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 06 06:13:06 crc kubenswrapper[4733]: I1206 06:13:06.495488 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="10ed3cae-fa08-4e62-af5f-e45711123cb3" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 06 06:13:06 crc kubenswrapper[4733]: I1206 06:13:06.496154 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2z74q" Dec 06 06:13:06 crc kubenswrapper[4733]: I1206 06:13:06.498361 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Dec 06 06:13:06 crc kubenswrapper[4733]: I1206 06:13:06.498671 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 06 06:13:06 crc kubenswrapper[4733]: I1206 06:13:06.498815 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 06 06:13:06 crc kubenswrapper[4733]: I1206 06:13:06.498945 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7jxr9" Dec 06 06:13:06 crc kubenswrapper[4733]: I1206 06:13:06.499062 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 06 06:13:06 crc kubenswrapper[4733]: I1206 06:13:06.503199 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-2z74q"] Dec 06 06:13:06 crc kubenswrapper[4733]: I1206 06:13:06.568745 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a8617856-6710-492d-9bfd-8acd53e89b30-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2z74q\" (UID: \"a8617856-6710-492d-9bfd-8acd53e89b30\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2z74q" Dec 06 06:13:06 crc kubenswrapper[4733]: I1206 06:13:06.568893 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a8617856-6710-492d-9bfd-8acd53e89b30-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2z74q\" (UID: \"a8617856-6710-492d-9bfd-8acd53e89b30\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2z74q" Dec 06 06:13:06 crc kubenswrapper[4733]: I1206 06:13:06.569512 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8617856-6710-492d-9bfd-8acd53e89b30-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2z74q\" (UID: \"a8617856-6710-492d-9bfd-8acd53e89b30\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2z74q" Dec 06 06:13:06 crc kubenswrapper[4733]: I1206 06:13:06.569673 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sv7gk\" (UniqueName: \"kubernetes.io/projected/a8617856-6710-492d-9bfd-8acd53e89b30-kube-api-access-sv7gk\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2z74q\" (UID: \"a8617856-6710-492d-9bfd-8acd53e89b30\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2z74q" Dec 06 06:13:06 crc kubenswrapper[4733]: I1206 06:13:06.569809 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a8617856-6710-492d-9bfd-8acd53e89b30-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2z74q\" (UID: \"a8617856-6710-492d-9bfd-8acd53e89b30\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2z74q" Dec 06 06:13:06 crc kubenswrapper[4733]: I1206 06:13:06.672018 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8617856-6710-492d-9bfd-8acd53e89b30-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2z74q\" (UID: \"a8617856-6710-492d-9bfd-8acd53e89b30\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2z74q" Dec 06 06:13:06 crc kubenswrapper[4733]: I1206 06:13:06.672110 4733 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-sv7gk\" (UniqueName: \"kubernetes.io/projected/a8617856-6710-492d-9bfd-8acd53e89b30-kube-api-access-sv7gk\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2z74q\" (UID: \"a8617856-6710-492d-9bfd-8acd53e89b30\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2z74q" Dec 06 06:13:06 crc kubenswrapper[4733]: I1206 06:13:06.672162 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a8617856-6710-492d-9bfd-8acd53e89b30-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2z74q\" (UID: \"a8617856-6710-492d-9bfd-8acd53e89b30\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2z74q" Dec 06 06:13:06 crc kubenswrapper[4733]: I1206 06:13:06.672251 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a8617856-6710-492d-9bfd-8acd53e89b30-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2z74q\" (UID: \"a8617856-6710-492d-9bfd-8acd53e89b30\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2z74q" Dec 06 06:13:06 crc kubenswrapper[4733]: I1206 06:13:06.672559 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a8617856-6710-492d-9bfd-8acd53e89b30-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2z74q\" (UID: \"a8617856-6710-492d-9bfd-8acd53e89b30\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2z74q" Dec 06 06:13:06 crc kubenswrapper[4733]: I1206 06:13:06.673647 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a8617856-6710-492d-9bfd-8acd53e89b30-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2z74q\" (UID: \"a8617856-6710-492d-9bfd-8acd53e89b30\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2z74q" Dec 06 
06:13:06 crc kubenswrapper[4733]: I1206 06:13:06.677825 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a8617856-6710-492d-9bfd-8acd53e89b30-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2z74q\" (UID: \"a8617856-6710-492d-9bfd-8acd53e89b30\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2z74q" Dec 06 06:13:06 crc kubenswrapper[4733]: I1206 06:13:06.678086 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a8617856-6710-492d-9bfd-8acd53e89b30-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2z74q\" (UID: \"a8617856-6710-492d-9bfd-8acd53e89b30\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2z74q" Dec 06 06:13:06 crc kubenswrapper[4733]: I1206 06:13:06.678787 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8617856-6710-492d-9bfd-8acd53e89b30-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2z74q\" (UID: \"a8617856-6710-492d-9bfd-8acd53e89b30\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2z74q" Dec 06 06:13:06 crc kubenswrapper[4733]: I1206 06:13:06.688091 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sv7gk\" (UniqueName: \"kubernetes.io/projected/a8617856-6710-492d-9bfd-8acd53e89b30-kube-api-access-sv7gk\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2z74q\" (UID: \"a8617856-6710-492d-9bfd-8acd53e89b30\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2z74q" Dec 06 06:13:06 crc kubenswrapper[4733]: I1206 06:13:06.818612 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2z74q" Dec 06 06:13:07 crc kubenswrapper[4733]: I1206 06:13:07.311774 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-2z74q"] Dec 06 06:13:07 crc kubenswrapper[4733]: I1206 06:13:07.421036 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2z74q" event={"ID":"a8617856-6710-492d-9bfd-8acd53e89b30","Type":"ContainerStarted","Data":"bf8a6e03aca3ec9a9755b6dfa57398fde0998e93d526acca8016ff8c92d65f28"} Dec 06 06:13:08 crc kubenswrapper[4733]: I1206 06:13:08.433033 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2z74q" event={"ID":"a8617856-6710-492d-9bfd-8acd53e89b30","Type":"ContainerStarted","Data":"126aa41be443770af40ff9c53986d8c8bbf8ace94fe3eff350afc5f27dcbcbf8"} Dec 06 06:13:08 crc kubenswrapper[4733]: I1206 06:13:08.458027 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2z74q" podStartSLOduration=1.956237621 podStartE2EDuration="2.458000479s" podCreationTimestamp="2025-12-06 06:13:06 +0000 UTC" firstStartedPulling="2025-12-06 06:13:07.316795774 +0000 UTC m=+1771.182006885" lastFinishedPulling="2025-12-06 06:13:07.818558632 +0000 UTC m=+1771.683769743" observedRunningTime="2025-12-06 06:13:08.455156963 +0000 UTC m=+1772.320368074" watchObservedRunningTime="2025-12-06 06:13:08.458000479 +0000 UTC m=+1772.323211590" Dec 06 06:13:14 crc kubenswrapper[4733]: I1206 06:13:14.484889 4733 scope.go:117] "RemoveContainer" containerID="95a5206d8977bdf896771b7495437ec92f1082c61e752b99e7ba75dda3bd2a35" Dec 06 06:13:14 crc kubenswrapper[4733]: E1206 06:13:14.485904 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:13:25 crc kubenswrapper[4733]: I1206 06:13:25.485119 4733 scope.go:117] "RemoveContainer" containerID="95a5206d8977bdf896771b7495437ec92f1082c61e752b99e7ba75dda3bd2a35" Dec 06 06:13:25 crc kubenswrapper[4733]: E1206 06:13:25.486243 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:13:37 crc kubenswrapper[4733]: I1206 06:13:37.485780 4733 scope.go:117] "RemoveContainer" containerID="95a5206d8977bdf896771b7495437ec92f1082c61e752b99e7ba75dda3bd2a35" Dec 06 06:13:37 crc kubenswrapper[4733]: E1206 06:13:37.486755 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:13:48 crc kubenswrapper[4733]: I1206 06:13:48.485203 4733 scope.go:117] "RemoveContainer" containerID="95a5206d8977bdf896771b7495437ec92f1082c61e752b99e7ba75dda3bd2a35" Dec 06 06:13:48 crc kubenswrapper[4733]: E1206 06:13:48.486588 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:13:53 crc kubenswrapper[4733]: I1206 06:13:53.876152 4733 generic.go:334] "Generic (PLEG): container finished" podID="a8617856-6710-492d-9bfd-8acd53e89b30" containerID="126aa41be443770af40ff9c53986d8c8bbf8ace94fe3eff350afc5f27dcbcbf8" exitCode=0 Dec 06 06:13:53 crc kubenswrapper[4733]: I1206 06:13:53.876250 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2z74q" event={"ID":"a8617856-6710-492d-9bfd-8acd53e89b30","Type":"ContainerDied","Data":"126aa41be443770af40ff9c53986d8c8bbf8ace94fe3eff350afc5f27dcbcbf8"} Dec 06 06:13:55 crc kubenswrapper[4733]: I1206 06:13:55.230328 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2z74q" Dec 06 06:13:55 crc kubenswrapper[4733]: I1206 06:13:55.408622 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a8617856-6710-492d-9bfd-8acd53e89b30-inventory\") pod \"a8617856-6710-492d-9bfd-8acd53e89b30\" (UID: \"a8617856-6710-492d-9bfd-8acd53e89b30\") " Dec 06 06:13:55 crc kubenswrapper[4733]: I1206 06:13:55.408791 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sv7gk\" (UniqueName: \"kubernetes.io/projected/a8617856-6710-492d-9bfd-8acd53e89b30-kube-api-access-sv7gk\") pod \"a8617856-6710-492d-9bfd-8acd53e89b30\" (UID: \"a8617856-6710-492d-9bfd-8acd53e89b30\") " Dec 06 06:13:55 crc kubenswrapper[4733]: I1206 06:13:55.408851 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/a8617856-6710-492d-9bfd-8acd53e89b30-ssh-key\") pod \"a8617856-6710-492d-9bfd-8acd53e89b30\" (UID: \"a8617856-6710-492d-9bfd-8acd53e89b30\") " Dec 06 06:13:55 crc kubenswrapper[4733]: I1206 06:13:55.408956 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a8617856-6710-492d-9bfd-8acd53e89b30-ovncontroller-config-0\") pod \"a8617856-6710-492d-9bfd-8acd53e89b30\" (UID: \"a8617856-6710-492d-9bfd-8acd53e89b30\") " Dec 06 06:13:55 crc kubenswrapper[4733]: I1206 06:13:55.409066 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8617856-6710-492d-9bfd-8acd53e89b30-ovn-combined-ca-bundle\") pod \"a8617856-6710-492d-9bfd-8acd53e89b30\" (UID: \"a8617856-6710-492d-9bfd-8acd53e89b30\") " Dec 06 06:13:55 crc kubenswrapper[4733]: I1206 06:13:55.419888 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8617856-6710-492d-9bfd-8acd53e89b30-kube-api-access-sv7gk" (OuterVolumeSpecName: "kube-api-access-sv7gk") pod "a8617856-6710-492d-9bfd-8acd53e89b30" (UID: "a8617856-6710-492d-9bfd-8acd53e89b30"). InnerVolumeSpecName "kube-api-access-sv7gk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:13:55 crc kubenswrapper[4733]: I1206 06:13:55.422061 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8617856-6710-492d-9bfd-8acd53e89b30-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "a8617856-6710-492d-9bfd-8acd53e89b30" (UID: "a8617856-6710-492d-9bfd-8acd53e89b30"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:13:55 crc kubenswrapper[4733]: I1206 06:13:55.435293 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8617856-6710-492d-9bfd-8acd53e89b30-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "a8617856-6710-492d-9bfd-8acd53e89b30" (UID: "a8617856-6710-492d-9bfd-8acd53e89b30"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:13:55 crc kubenswrapper[4733]: I1206 06:13:55.436949 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8617856-6710-492d-9bfd-8acd53e89b30-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a8617856-6710-492d-9bfd-8acd53e89b30" (UID: "a8617856-6710-492d-9bfd-8acd53e89b30"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:13:55 crc kubenswrapper[4733]: I1206 06:13:55.439635 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8617856-6710-492d-9bfd-8acd53e89b30-inventory" (OuterVolumeSpecName: "inventory") pod "a8617856-6710-492d-9bfd-8acd53e89b30" (UID: "a8617856-6710-492d-9bfd-8acd53e89b30"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:13:55 crc kubenswrapper[4733]: I1206 06:13:55.510856 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sv7gk\" (UniqueName: \"kubernetes.io/projected/a8617856-6710-492d-9bfd-8acd53e89b30-kube-api-access-sv7gk\") on node \"crc\" DevicePath \"\"" Dec 06 06:13:55 crc kubenswrapper[4733]: I1206 06:13:55.510891 4733 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a8617856-6710-492d-9bfd-8acd53e89b30-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 06:13:55 crc kubenswrapper[4733]: I1206 06:13:55.510902 4733 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a8617856-6710-492d-9bfd-8acd53e89b30-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Dec 06 06:13:55 crc kubenswrapper[4733]: I1206 06:13:55.510914 4733 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8617856-6710-492d-9bfd-8acd53e89b30-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 06:13:55 crc kubenswrapper[4733]: I1206 06:13:55.510923 4733 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a8617856-6710-492d-9bfd-8acd53e89b30-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 06:13:55 crc kubenswrapper[4733]: I1206 06:13:55.897709 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2z74q" event={"ID":"a8617856-6710-492d-9bfd-8acd53e89b30","Type":"ContainerDied","Data":"bf8a6e03aca3ec9a9755b6dfa57398fde0998e93d526acca8016ff8c92d65f28"} Dec 06 06:13:55 crc kubenswrapper[4733]: I1206 06:13:55.897778 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2z74q" Dec 06 06:13:55 crc kubenswrapper[4733]: I1206 06:13:55.897785 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf8a6e03aca3ec9a9755b6dfa57398fde0998e93d526acca8016ff8c92d65f28" Dec 06 06:13:55 crc kubenswrapper[4733]: I1206 06:13:55.966135 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mk7gc"] Dec 06 06:13:55 crc kubenswrapper[4733]: E1206 06:13:55.966577 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8617856-6710-492d-9bfd-8acd53e89b30" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 06 06:13:55 crc kubenswrapper[4733]: I1206 06:13:55.966601 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8617856-6710-492d-9bfd-8acd53e89b30" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 06 06:13:55 crc kubenswrapper[4733]: I1206 06:13:55.966743 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8617856-6710-492d-9bfd-8acd53e89b30" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 06 06:13:55 crc kubenswrapper[4733]: I1206 06:13:55.967366 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mk7gc" Dec 06 06:13:55 crc kubenswrapper[4733]: I1206 06:13:55.970222 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 06 06:13:55 crc kubenswrapper[4733]: I1206 06:13:55.970408 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 06 06:13:55 crc kubenswrapper[4733]: I1206 06:13:55.970638 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 06 06:13:55 crc kubenswrapper[4733]: I1206 06:13:55.971089 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7jxr9" Dec 06 06:13:55 crc kubenswrapper[4733]: I1206 06:13:55.971210 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Dec 06 06:13:55 crc kubenswrapper[4733]: I1206 06:13:55.971348 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Dec 06 06:13:55 crc kubenswrapper[4733]: I1206 06:13:55.973761 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mk7gc"] Dec 06 06:13:56 crc kubenswrapper[4733]: I1206 06:13:56.020606 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/868cd7d4-8d73-4344-a16d-c4975b6d9249-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mk7gc\" (UID: \"868cd7d4-8d73-4344-a16d-c4975b6d9249\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mk7gc" Dec 06 06:13:56 crc kubenswrapper[4733]: I1206 06:13:56.020802 4733 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/868cd7d4-8d73-4344-a16d-c4975b6d9249-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mk7gc\" (UID: \"868cd7d4-8d73-4344-a16d-c4975b6d9249\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mk7gc" Dec 06 06:13:56 crc kubenswrapper[4733]: I1206 06:13:56.021014 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/868cd7d4-8d73-4344-a16d-c4975b6d9249-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mk7gc\" (UID: \"868cd7d4-8d73-4344-a16d-c4975b6d9249\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mk7gc" Dec 06 06:13:56 crc kubenswrapper[4733]: I1206 06:13:56.021271 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njxns\" (UniqueName: \"kubernetes.io/projected/868cd7d4-8d73-4344-a16d-c4975b6d9249-kube-api-access-njxns\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mk7gc\" (UID: \"868cd7d4-8d73-4344-a16d-c4975b6d9249\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mk7gc" Dec 06 06:13:56 crc kubenswrapper[4733]: I1206 06:13:56.021351 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/868cd7d4-8d73-4344-a16d-c4975b6d9249-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mk7gc\" (UID: \"868cd7d4-8d73-4344-a16d-c4975b6d9249\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mk7gc" Dec 06 06:13:56 crc kubenswrapper[4733]: I1206 06:13:56.021373 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"inventory\" (UniqueName: \"kubernetes.io/secret/868cd7d4-8d73-4344-a16d-c4975b6d9249-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mk7gc\" (UID: \"868cd7d4-8d73-4344-a16d-c4975b6d9249\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mk7gc" Dec 06 06:13:56 crc kubenswrapper[4733]: I1206 06:13:56.123231 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/868cd7d4-8d73-4344-a16d-c4975b6d9249-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mk7gc\" (UID: \"868cd7d4-8d73-4344-a16d-c4975b6d9249\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mk7gc" Dec 06 06:13:56 crc kubenswrapper[4733]: I1206 06:13:56.123341 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njxns\" (UniqueName: \"kubernetes.io/projected/868cd7d4-8d73-4344-a16d-c4975b6d9249-kube-api-access-njxns\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mk7gc\" (UID: \"868cd7d4-8d73-4344-a16d-c4975b6d9249\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mk7gc" Dec 06 06:13:56 crc kubenswrapper[4733]: I1206 06:13:56.123380 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/868cd7d4-8d73-4344-a16d-c4975b6d9249-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mk7gc\" (UID: \"868cd7d4-8d73-4344-a16d-c4975b6d9249\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mk7gc" Dec 06 06:13:56 crc kubenswrapper[4733]: I1206 06:13:56.123402 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/868cd7d4-8d73-4344-a16d-c4975b6d9249-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mk7gc\" (UID: 
\"868cd7d4-8d73-4344-a16d-c4975b6d9249\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mk7gc" Dec 06 06:13:56 crc kubenswrapper[4733]: I1206 06:13:56.124159 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/868cd7d4-8d73-4344-a16d-c4975b6d9249-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mk7gc\" (UID: \"868cd7d4-8d73-4344-a16d-c4975b6d9249\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mk7gc" Dec 06 06:13:56 crc kubenswrapper[4733]: I1206 06:13:56.124214 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/868cd7d4-8d73-4344-a16d-c4975b6d9249-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mk7gc\" (UID: \"868cd7d4-8d73-4344-a16d-c4975b6d9249\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mk7gc" Dec 06 06:13:56 crc kubenswrapper[4733]: I1206 06:13:56.128732 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/868cd7d4-8d73-4344-a16d-c4975b6d9249-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mk7gc\" (UID: \"868cd7d4-8d73-4344-a16d-c4975b6d9249\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mk7gc" Dec 06 06:13:56 crc kubenswrapper[4733]: I1206 06:13:56.128850 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/868cd7d4-8d73-4344-a16d-c4975b6d9249-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mk7gc\" (UID: \"868cd7d4-8d73-4344-a16d-c4975b6d9249\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mk7gc" Dec 06 06:13:56 crc kubenswrapper[4733]: I1206 06:13:56.130100 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/868cd7d4-8d73-4344-a16d-c4975b6d9249-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mk7gc\" (UID: \"868cd7d4-8d73-4344-a16d-c4975b6d9249\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mk7gc" Dec 06 06:13:56 crc kubenswrapper[4733]: I1206 06:13:56.130484 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/868cd7d4-8d73-4344-a16d-c4975b6d9249-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mk7gc\" (UID: \"868cd7d4-8d73-4344-a16d-c4975b6d9249\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mk7gc" Dec 06 06:13:56 crc kubenswrapper[4733]: I1206 06:13:56.131052 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/868cd7d4-8d73-4344-a16d-c4975b6d9249-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mk7gc\" (UID: \"868cd7d4-8d73-4344-a16d-c4975b6d9249\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mk7gc" Dec 06 06:13:56 crc kubenswrapper[4733]: I1206 06:13:56.138626 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njxns\" (UniqueName: \"kubernetes.io/projected/868cd7d4-8d73-4344-a16d-c4975b6d9249-kube-api-access-njxns\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mk7gc\" (UID: \"868cd7d4-8d73-4344-a16d-c4975b6d9249\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mk7gc" Dec 06 06:13:56 crc kubenswrapper[4733]: I1206 06:13:56.281773 4733 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mk7gc" Dec 06 06:13:56 crc kubenswrapper[4733]: I1206 06:13:56.754777 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mk7gc"] Dec 06 06:13:56 crc kubenswrapper[4733]: I1206 06:13:56.907079 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mk7gc" event={"ID":"868cd7d4-8d73-4344-a16d-c4975b6d9249","Type":"ContainerStarted","Data":"541729dff8b619133367b3a486887a3332dfc627d578690c1b8f1a124f1a28b8"} Dec 06 06:13:57 crc kubenswrapper[4733]: I1206 06:13:57.918713 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mk7gc" event={"ID":"868cd7d4-8d73-4344-a16d-c4975b6d9249","Type":"ContainerStarted","Data":"41c8f4d5908ee552263c9bed67ef812c667c1179f3e857cb48b2ac3bd89b5cc1"} Dec 06 06:13:57 crc kubenswrapper[4733]: I1206 06:13:57.947216 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mk7gc" podStartSLOduration=2.477430977 podStartE2EDuration="2.947197483s" podCreationTimestamp="2025-12-06 06:13:55 +0000 UTC" firstStartedPulling="2025-12-06 06:13:56.757362171 +0000 UTC m=+1820.622573283" lastFinishedPulling="2025-12-06 06:13:57.227128678 +0000 UTC m=+1821.092339789" observedRunningTime="2025-12-06 06:13:57.937018124 +0000 UTC m=+1821.802229234" watchObservedRunningTime="2025-12-06 06:13:57.947197483 +0000 UTC m=+1821.812408594" Dec 06 06:13:59 crc kubenswrapper[4733]: I1206 06:13:59.485147 4733 scope.go:117] "RemoveContainer" containerID="95a5206d8977bdf896771b7495437ec92f1082c61e752b99e7ba75dda3bd2a35" Dec 06 06:13:59 crc kubenswrapper[4733]: E1206 06:13:59.485585 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:14:10 crc kubenswrapper[4733]: I1206 06:14:10.485523 4733 scope.go:117] "RemoveContainer" containerID="95a5206d8977bdf896771b7495437ec92f1082c61e752b99e7ba75dda3bd2a35" Dec 06 06:14:10 crc kubenswrapper[4733]: E1206 06:14:10.486603 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:14:21 crc kubenswrapper[4733]: I1206 06:14:21.485327 4733 scope.go:117] "RemoveContainer" containerID="95a5206d8977bdf896771b7495437ec92f1082c61e752b99e7ba75dda3bd2a35" Dec 06 06:14:21 crc kubenswrapper[4733]: E1206 06:14:21.486225 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:14:30 crc kubenswrapper[4733]: I1206 06:14:30.227665 4733 generic.go:334] "Generic (PLEG): container finished" podID="868cd7d4-8d73-4344-a16d-c4975b6d9249" containerID="41c8f4d5908ee552263c9bed67ef812c667c1179f3e857cb48b2ac3bd89b5cc1" exitCode=0 Dec 06 06:14:30 crc kubenswrapper[4733]: 
I1206 06:14:30.227761 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mk7gc" event={"ID":"868cd7d4-8d73-4344-a16d-c4975b6d9249","Type":"ContainerDied","Data":"41c8f4d5908ee552263c9bed67ef812c667c1179f3e857cb48b2ac3bd89b5cc1"} Dec 06 06:14:31 crc kubenswrapper[4733]: I1206 06:14:31.575100 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mk7gc" Dec 06 06:14:31 crc kubenswrapper[4733]: I1206 06:14:31.751083 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/868cd7d4-8d73-4344-a16d-c4975b6d9249-inventory\") pod \"868cd7d4-8d73-4344-a16d-c4975b6d9249\" (UID: \"868cd7d4-8d73-4344-a16d-c4975b6d9249\") " Dec 06 06:14:31 crc kubenswrapper[4733]: I1206 06:14:31.751996 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njxns\" (UniqueName: \"kubernetes.io/projected/868cd7d4-8d73-4344-a16d-c4975b6d9249-kube-api-access-njxns\") pod \"868cd7d4-8d73-4344-a16d-c4975b6d9249\" (UID: \"868cd7d4-8d73-4344-a16d-c4975b6d9249\") " Dec 06 06:14:31 crc kubenswrapper[4733]: I1206 06:14:31.752035 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/868cd7d4-8d73-4344-a16d-c4975b6d9249-neutron-ovn-metadata-agent-neutron-config-0\") pod \"868cd7d4-8d73-4344-a16d-c4975b6d9249\" (UID: \"868cd7d4-8d73-4344-a16d-c4975b6d9249\") " Dec 06 06:14:31 crc kubenswrapper[4733]: I1206 06:14:31.752166 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/868cd7d4-8d73-4344-a16d-c4975b6d9249-ssh-key\") pod \"868cd7d4-8d73-4344-a16d-c4975b6d9249\" (UID: \"868cd7d4-8d73-4344-a16d-c4975b6d9249\") " Dec 06 06:14:31 crc 
kubenswrapper[4733]: I1206 06:14:31.752220 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/868cd7d4-8d73-4344-a16d-c4975b6d9249-nova-metadata-neutron-config-0\") pod \"868cd7d4-8d73-4344-a16d-c4975b6d9249\" (UID: \"868cd7d4-8d73-4344-a16d-c4975b6d9249\") " Dec 06 06:14:31 crc kubenswrapper[4733]: I1206 06:14:31.752248 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/868cd7d4-8d73-4344-a16d-c4975b6d9249-neutron-metadata-combined-ca-bundle\") pod \"868cd7d4-8d73-4344-a16d-c4975b6d9249\" (UID: \"868cd7d4-8d73-4344-a16d-c4975b6d9249\") " Dec 06 06:14:31 crc kubenswrapper[4733]: I1206 06:14:31.759393 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/868cd7d4-8d73-4344-a16d-c4975b6d9249-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "868cd7d4-8d73-4344-a16d-c4975b6d9249" (UID: "868cd7d4-8d73-4344-a16d-c4975b6d9249"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:14:31 crc kubenswrapper[4733]: I1206 06:14:31.768480 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/868cd7d4-8d73-4344-a16d-c4975b6d9249-kube-api-access-njxns" (OuterVolumeSpecName: "kube-api-access-njxns") pod "868cd7d4-8d73-4344-a16d-c4975b6d9249" (UID: "868cd7d4-8d73-4344-a16d-c4975b6d9249"). InnerVolumeSpecName "kube-api-access-njxns". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:14:31 crc kubenswrapper[4733]: I1206 06:14:31.783732 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/868cd7d4-8d73-4344-a16d-c4975b6d9249-inventory" (OuterVolumeSpecName: "inventory") pod "868cd7d4-8d73-4344-a16d-c4975b6d9249" (UID: "868cd7d4-8d73-4344-a16d-c4975b6d9249"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:14:31 crc kubenswrapper[4733]: I1206 06:14:31.784087 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/868cd7d4-8d73-4344-a16d-c4975b6d9249-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "868cd7d4-8d73-4344-a16d-c4975b6d9249" (UID: "868cd7d4-8d73-4344-a16d-c4975b6d9249"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:14:31 crc kubenswrapper[4733]: I1206 06:14:31.784245 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/868cd7d4-8d73-4344-a16d-c4975b6d9249-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "868cd7d4-8d73-4344-a16d-c4975b6d9249" (UID: "868cd7d4-8d73-4344-a16d-c4975b6d9249"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:14:31 crc kubenswrapper[4733]: I1206 06:14:31.786820 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/868cd7d4-8d73-4344-a16d-c4975b6d9249-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "868cd7d4-8d73-4344-a16d-c4975b6d9249" (UID: "868cd7d4-8d73-4344-a16d-c4975b6d9249"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:14:31 crc kubenswrapper[4733]: I1206 06:14:31.854691 4733 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/868cd7d4-8d73-4344-a16d-c4975b6d9249-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 06:14:31 crc kubenswrapper[4733]: I1206 06:14:31.854721 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njxns\" (UniqueName: \"kubernetes.io/projected/868cd7d4-8d73-4344-a16d-c4975b6d9249-kube-api-access-njxns\") on node \"crc\" DevicePath \"\"" Dec 06 06:14:31 crc kubenswrapper[4733]: I1206 06:14:31.854733 4733 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/868cd7d4-8d73-4344-a16d-c4975b6d9249-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 06 06:14:31 crc kubenswrapper[4733]: I1206 06:14:31.854747 4733 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/868cd7d4-8d73-4344-a16d-c4975b6d9249-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 06:14:31 crc kubenswrapper[4733]: I1206 06:14:31.854757 4733 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/868cd7d4-8d73-4344-a16d-c4975b6d9249-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 06 06:14:31 crc kubenswrapper[4733]: I1206 06:14:31.854766 4733 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/868cd7d4-8d73-4344-a16d-c4975b6d9249-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 06:14:32 crc kubenswrapper[4733]: I1206 06:14:32.248108 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mk7gc" 
event={"ID":"868cd7d4-8d73-4344-a16d-c4975b6d9249","Type":"ContainerDied","Data":"541729dff8b619133367b3a486887a3332dfc627d578690c1b8f1a124f1a28b8"} Dec 06 06:14:32 crc kubenswrapper[4733]: I1206 06:14:32.248511 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="541729dff8b619133367b3a486887a3332dfc627d578690c1b8f1a124f1a28b8" Dec 06 06:14:32 crc kubenswrapper[4733]: I1206 06:14:32.248189 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mk7gc" Dec 06 06:14:32 crc kubenswrapper[4733]: I1206 06:14:32.337159 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ffs52"] Dec 06 06:14:32 crc kubenswrapper[4733]: E1206 06:14:32.337672 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="868cd7d4-8d73-4344-a16d-c4975b6d9249" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 06 06:14:32 crc kubenswrapper[4733]: I1206 06:14:32.337696 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="868cd7d4-8d73-4344-a16d-c4975b6d9249" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 06 06:14:32 crc kubenswrapper[4733]: I1206 06:14:32.337924 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="868cd7d4-8d73-4344-a16d-c4975b6d9249" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 06 06:14:32 crc kubenswrapper[4733]: I1206 06:14:32.339221 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ffs52" Dec 06 06:14:32 crc kubenswrapper[4733]: I1206 06:14:32.340957 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7jxr9" Dec 06 06:14:32 crc kubenswrapper[4733]: I1206 06:14:32.341441 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 06 06:14:32 crc kubenswrapper[4733]: I1206 06:14:32.341840 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 06 06:14:32 crc kubenswrapper[4733]: I1206 06:14:32.342963 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 06 06:14:32 crc kubenswrapper[4733]: I1206 06:14:32.342982 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Dec 06 06:14:32 crc kubenswrapper[4733]: I1206 06:14:32.349360 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ffs52"] Dec 06 06:14:32 crc kubenswrapper[4733]: I1206 06:14:32.465349 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b0eeb4fd-32c5-425a-b938-49572817e476-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ffs52\" (UID: \"b0eeb4fd-32c5-425a-b938-49572817e476\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ffs52" Dec 06 06:14:32 crc kubenswrapper[4733]: I1206 06:14:32.465442 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ptgb\" (UniqueName: \"kubernetes.io/projected/b0eeb4fd-32c5-425a-b938-49572817e476-kube-api-access-8ptgb\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ffs52\" (UID: 
\"b0eeb4fd-32c5-425a-b938-49572817e476\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ffs52" Dec 06 06:14:32 crc kubenswrapper[4733]: I1206 06:14:32.465709 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0eeb4fd-32c5-425a-b938-49572817e476-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ffs52\" (UID: \"b0eeb4fd-32c5-425a-b938-49572817e476\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ffs52" Dec 06 06:14:32 crc kubenswrapper[4733]: I1206 06:14:32.465902 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b0eeb4fd-32c5-425a-b938-49572817e476-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ffs52\" (UID: \"b0eeb4fd-32c5-425a-b938-49572817e476\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ffs52" Dec 06 06:14:32 crc kubenswrapper[4733]: I1206 06:14:32.465993 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0eeb4fd-32c5-425a-b938-49572817e476-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ffs52\" (UID: \"b0eeb4fd-32c5-425a-b938-49572817e476\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ffs52" Dec 06 06:14:32 crc kubenswrapper[4733]: I1206 06:14:32.568243 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0eeb4fd-32c5-425a-b938-49572817e476-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ffs52\" (UID: \"b0eeb4fd-32c5-425a-b938-49572817e476\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ffs52" Dec 06 06:14:32 crc kubenswrapper[4733]: I1206 06:14:32.568425 4733 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b0eeb4fd-32c5-425a-b938-49572817e476-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ffs52\" (UID: \"b0eeb4fd-32c5-425a-b938-49572817e476\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ffs52" Dec 06 06:14:32 crc kubenswrapper[4733]: I1206 06:14:32.568469 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0eeb4fd-32c5-425a-b938-49572817e476-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ffs52\" (UID: \"b0eeb4fd-32c5-425a-b938-49572817e476\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ffs52" Dec 06 06:14:32 crc kubenswrapper[4733]: I1206 06:14:32.568582 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b0eeb4fd-32c5-425a-b938-49572817e476-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ffs52\" (UID: \"b0eeb4fd-32c5-425a-b938-49572817e476\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ffs52" Dec 06 06:14:32 crc kubenswrapper[4733]: I1206 06:14:32.568658 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ptgb\" (UniqueName: \"kubernetes.io/projected/b0eeb4fd-32c5-425a-b938-49572817e476-kube-api-access-8ptgb\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ffs52\" (UID: \"b0eeb4fd-32c5-425a-b938-49572817e476\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ffs52" Dec 06 06:14:32 crc kubenswrapper[4733]: I1206 06:14:32.573240 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b0eeb4fd-32c5-425a-b938-49572817e476-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ffs52\" (UID: \"b0eeb4fd-32c5-425a-b938-49572817e476\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ffs52" Dec 06 06:14:32 crc kubenswrapper[4733]: I1206 06:14:32.573317 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b0eeb4fd-32c5-425a-b938-49572817e476-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ffs52\" (UID: \"b0eeb4fd-32c5-425a-b938-49572817e476\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ffs52" Dec 06 06:14:32 crc kubenswrapper[4733]: I1206 06:14:32.573828 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0eeb4fd-32c5-425a-b938-49572817e476-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ffs52\" (UID: \"b0eeb4fd-32c5-425a-b938-49572817e476\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ffs52" Dec 06 06:14:32 crc kubenswrapper[4733]: I1206 06:14:32.575181 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0eeb4fd-32c5-425a-b938-49572817e476-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ffs52\" (UID: \"b0eeb4fd-32c5-425a-b938-49572817e476\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ffs52" Dec 06 06:14:32 crc kubenswrapper[4733]: I1206 06:14:32.585405 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ptgb\" (UniqueName: \"kubernetes.io/projected/b0eeb4fd-32c5-425a-b938-49572817e476-kube-api-access-8ptgb\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ffs52\" (UID: \"b0eeb4fd-32c5-425a-b938-49572817e476\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ffs52" Dec 06 06:14:32 crc kubenswrapper[4733]: I1206 06:14:32.655858 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ffs52" Dec 06 06:14:33 crc kubenswrapper[4733]: I1206 06:14:33.145435 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ffs52"] Dec 06 06:14:33 crc kubenswrapper[4733]: I1206 06:14:33.151719 4733 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 06 06:14:33 crc kubenswrapper[4733]: I1206 06:14:33.261351 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ffs52" event={"ID":"b0eeb4fd-32c5-425a-b938-49572817e476","Type":"ContainerStarted","Data":"fc68c6d7c59ee8bf6d502d9078da26c86e58e9328953023c53c4e851e07b8e97"} Dec 06 06:14:34 crc kubenswrapper[4733]: I1206 06:14:34.270589 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ffs52" event={"ID":"b0eeb4fd-32c5-425a-b938-49572817e476","Type":"ContainerStarted","Data":"d726adad68e9443e3d7e0bfb9c924e86df6c7e1261424ffb95de157749f7732b"} Dec 06 06:14:34 crc kubenswrapper[4733]: I1206 06:14:34.288717 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ffs52" podStartSLOduration=1.809998561 podStartE2EDuration="2.288700438s" podCreationTimestamp="2025-12-06 06:14:32 +0000 UTC" firstStartedPulling="2025-12-06 06:14:33.151485726 +0000 UTC m=+1857.016696837" lastFinishedPulling="2025-12-06 06:14:33.630187603 +0000 UTC m=+1857.495398714" observedRunningTime="2025-12-06 06:14:34.28305781 +0000 UTC m=+1858.148268920" watchObservedRunningTime="2025-12-06 06:14:34.288700438 +0000 UTC m=+1858.153911549" Dec 06 06:14:34 crc kubenswrapper[4733]: I1206 06:14:34.485187 4733 scope.go:117] "RemoveContainer" containerID="95a5206d8977bdf896771b7495437ec92f1082c61e752b99e7ba75dda3bd2a35" Dec 06 06:14:34 crc kubenswrapper[4733]: E1206 06:14:34.485707 
4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:14:47 crc kubenswrapper[4733]: I1206 06:14:47.485075 4733 scope.go:117] "RemoveContainer" containerID="95a5206d8977bdf896771b7495437ec92f1082c61e752b99e7ba75dda3bd2a35" Dec 06 06:14:47 crc kubenswrapper[4733]: E1206 06:14:47.486003 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:15:00 crc kubenswrapper[4733]: I1206 06:15:00.148015 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416695-x9rjb"] Dec 06 06:15:00 crc kubenswrapper[4733]: I1206 06:15:00.149816 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416695-x9rjb" Dec 06 06:15:00 crc kubenswrapper[4733]: I1206 06:15:00.153408 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 06 06:15:00 crc kubenswrapper[4733]: I1206 06:15:00.153571 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 06 06:15:00 crc kubenswrapper[4733]: I1206 06:15:00.160489 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416695-x9rjb"] Dec 06 06:15:00 crc kubenswrapper[4733]: I1206 06:15:00.255586 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjqr5\" (UniqueName: \"kubernetes.io/projected/47aba12b-ad2c-4e7e-b386-bb25fec8495c-kube-api-access-tjqr5\") pod \"collect-profiles-29416695-x9rjb\" (UID: \"47aba12b-ad2c-4e7e-b386-bb25fec8495c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416695-x9rjb" Dec 06 06:15:00 crc kubenswrapper[4733]: I1206 06:15:00.255824 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/47aba12b-ad2c-4e7e-b386-bb25fec8495c-secret-volume\") pod \"collect-profiles-29416695-x9rjb\" (UID: \"47aba12b-ad2c-4e7e-b386-bb25fec8495c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416695-x9rjb" Dec 06 06:15:00 crc kubenswrapper[4733]: I1206 06:15:00.255899 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/47aba12b-ad2c-4e7e-b386-bb25fec8495c-config-volume\") pod \"collect-profiles-29416695-x9rjb\" (UID: \"47aba12b-ad2c-4e7e-b386-bb25fec8495c\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29416695-x9rjb" Dec 06 06:15:00 crc kubenswrapper[4733]: I1206 06:15:00.360452 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjqr5\" (UniqueName: \"kubernetes.io/projected/47aba12b-ad2c-4e7e-b386-bb25fec8495c-kube-api-access-tjqr5\") pod \"collect-profiles-29416695-x9rjb\" (UID: \"47aba12b-ad2c-4e7e-b386-bb25fec8495c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416695-x9rjb" Dec 06 06:15:00 crc kubenswrapper[4733]: I1206 06:15:00.360575 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/47aba12b-ad2c-4e7e-b386-bb25fec8495c-secret-volume\") pod \"collect-profiles-29416695-x9rjb\" (UID: \"47aba12b-ad2c-4e7e-b386-bb25fec8495c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416695-x9rjb" Dec 06 06:15:00 crc kubenswrapper[4733]: I1206 06:15:00.360623 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/47aba12b-ad2c-4e7e-b386-bb25fec8495c-config-volume\") pod \"collect-profiles-29416695-x9rjb\" (UID: \"47aba12b-ad2c-4e7e-b386-bb25fec8495c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416695-x9rjb" Dec 06 06:15:00 crc kubenswrapper[4733]: I1206 06:15:00.361841 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/47aba12b-ad2c-4e7e-b386-bb25fec8495c-config-volume\") pod \"collect-profiles-29416695-x9rjb\" (UID: \"47aba12b-ad2c-4e7e-b386-bb25fec8495c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416695-x9rjb" Dec 06 06:15:00 crc kubenswrapper[4733]: I1206 06:15:00.368709 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/47aba12b-ad2c-4e7e-b386-bb25fec8495c-secret-volume\") pod \"collect-profiles-29416695-x9rjb\" (UID: \"47aba12b-ad2c-4e7e-b386-bb25fec8495c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416695-x9rjb" Dec 06 06:15:00 crc kubenswrapper[4733]: I1206 06:15:00.375437 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjqr5\" (UniqueName: \"kubernetes.io/projected/47aba12b-ad2c-4e7e-b386-bb25fec8495c-kube-api-access-tjqr5\") pod \"collect-profiles-29416695-x9rjb\" (UID: \"47aba12b-ad2c-4e7e-b386-bb25fec8495c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416695-x9rjb" Dec 06 06:15:00 crc kubenswrapper[4733]: I1206 06:15:00.475750 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416695-x9rjb" Dec 06 06:15:00 crc kubenswrapper[4733]: I1206 06:15:00.911504 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416695-x9rjb"] Dec 06 06:15:01 crc kubenswrapper[4733]: I1206 06:15:01.484836 4733 scope.go:117] "RemoveContainer" containerID="95a5206d8977bdf896771b7495437ec92f1082c61e752b99e7ba75dda3bd2a35" Dec 06 06:15:01 crc kubenswrapper[4733]: E1206 06:15:01.485275 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:15:01 crc kubenswrapper[4733]: I1206 06:15:01.542344 4733 generic.go:334] "Generic (PLEG): container finished" podID="47aba12b-ad2c-4e7e-b386-bb25fec8495c" containerID="531c652887b9f74ce0ee900037645976f8b97ef59ef52c40bb36176d96147826" 
exitCode=0 Dec 06 06:15:01 crc kubenswrapper[4733]: I1206 06:15:01.542660 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416695-x9rjb" event={"ID":"47aba12b-ad2c-4e7e-b386-bb25fec8495c","Type":"ContainerDied","Data":"531c652887b9f74ce0ee900037645976f8b97ef59ef52c40bb36176d96147826"} Dec 06 06:15:01 crc kubenswrapper[4733]: I1206 06:15:01.542696 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416695-x9rjb" event={"ID":"47aba12b-ad2c-4e7e-b386-bb25fec8495c","Type":"ContainerStarted","Data":"2fab3b5d6187f7f5e46fe824f37f67631f66dc9b37347ad952def3d26a7e7acb"} Dec 06 06:15:02 crc kubenswrapper[4733]: I1206 06:15:02.831692 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416695-x9rjb" Dec 06 06:15:03 crc kubenswrapper[4733]: I1206 06:15:03.033915 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/47aba12b-ad2c-4e7e-b386-bb25fec8495c-config-volume\") pod \"47aba12b-ad2c-4e7e-b386-bb25fec8495c\" (UID: \"47aba12b-ad2c-4e7e-b386-bb25fec8495c\") " Dec 06 06:15:03 crc kubenswrapper[4733]: I1206 06:15:03.034293 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjqr5\" (UniqueName: \"kubernetes.io/projected/47aba12b-ad2c-4e7e-b386-bb25fec8495c-kube-api-access-tjqr5\") pod \"47aba12b-ad2c-4e7e-b386-bb25fec8495c\" (UID: \"47aba12b-ad2c-4e7e-b386-bb25fec8495c\") " Dec 06 06:15:03 crc kubenswrapper[4733]: I1206 06:15:03.034474 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/47aba12b-ad2c-4e7e-b386-bb25fec8495c-secret-volume\") pod \"47aba12b-ad2c-4e7e-b386-bb25fec8495c\" (UID: \"47aba12b-ad2c-4e7e-b386-bb25fec8495c\") " Dec 06 
06:15:03 crc kubenswrapper[4733]: I1206 06:15:03.035153 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47aba12b-ad2c-4e7e-b386-bb25fec8495c-config-volume" (OuterVolumeSpecName: "config-volume") pod "47aba12b-ad2c-4e7e-b386-bb25fec8495c" (UID: "47aba12b-ad2c-4e7e-b386-bb25fec8495c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:15:03 crc kubenswrapper[4733]: I1206 06:15:03.041356 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47aba12b-ad2c-4e7e-b386-bb25fec8495c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "47aba12b-ad2c-4e7e-b386-bb25fec8495c" (UID: "47aba12b-ad2c-4e7e-b386-bb25fec8495c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:15:03 crc kubenswrapper[4733]: I1206 06:15:03.041543 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47aba12b-ad2c-4e7e-b386-bb25fec8495c-kube-api-access-tjqr5" (OuterVolumeSpecName: "kube-api-access-tjqr5") pod "47aba12b-ad2c-4e7e-b386-bb25fec8495c" (UID: "47aba12b-ad2c-4e7e-b386-bb25fec8495c"). InnerVolumeSpecName "kube-api-access-tjqr5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:15:03 crc kubenswrapper[4733]: I1206 06:15:03.137922 4733 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/47aba12b-ad2c-4e7e-b386-bb25fec8495c-config-volume\") on node \"crc\" DevicePath \"\"" Dec 06 06:15:03 crc kubenswrapper[4733]: I1206 06:15:03.137965 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjqr5\" (UniqueName: \"kubernetes.io/projected/47aba12b-ad2c-4e7e-b386-bb25fec8495c-kube-api-access-tjqr5\") on node \"crc\" DevicePath \"\"" Dec 06 06:15:03 crc kubenswrapper[4733]: I1206 06:15:03.137977 4733 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/47aba12b-ad2c-4e7e-b386-bb25fec8495c-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 06 06:15:03 crc kubenswrapper[4733]: I1206 06:15:03.563568 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416695-x9rjb" event={"ID":"47aba12b-ad2c-4e7e-b386-bb25fec8495c","Type":"ContainerDied","Data":"2fab3b5d6187f7f5e46fe824f37f67631f66dc9b37347ad952def3d26a7e7acb"} Dec 06 06:15:03 crc kubenswrapper[4733]: I1206 06:15:03.563632 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2fab3b5d6187f7f5e46fe824f37f67631f66dc9b37347ad952def3d26a7e7acb" Dec 06 06:15:03 crc kubenswrapper[4733]: I1206 06:15:03.563658 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416695-x9rjb" Dec 06 06:15:16 crc kubenswrapper[4733]: I1206 06:15:16.490024 4733 scope.go:117] "RemoveContainer" containerID="95a5206d8977bdf896771b7495437ec92f1082c61e752b99e7ba75dda3bd2a35" Dec 06 06:15:16 crc kubenswrapper[4733]: E1206 06:15:16.491271 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:15:28 crc kubenswrapper[4733]: I1206 06:15:28.485575 4733 scope.go:117] "RemoveContainer" containerID="95a5206d8977bdf896771b7495437ec92f1082c61e752b99e7ba75dda3bd2a35" Dec 06 06:15:28 crc kubenswrapper[4733]: E1206 06:15:28.486551 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:15:42 crc kubenswrapper[4733]: I1206 06:15:42.485480 4733 scope.go:117] "RemoveContainer" containerID="95a5206d8977bdf896771b7495437ec92f1082c61e752b99e7ba75dda3bd2a35" Dec 06 06:15:42 crc kubenswrapper[4733]: E1206 06:15:42.487671 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:15:54 crc kubenswrapper[4733]: I1206 06:15:54.484483 4733 scope.go:117] "RemoveContainer" containerID="95a5206d8977bdf896771b7495437ec92f1082c61e752b99e7ba75dda3bd2a35" Dec 06 06:15:54 crc kubenswrapper[4733]: E1206 06:15:54.485411 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:16:05 crc kubenswrapper[4733]: I1206 06:16:05.484727 4733 scope.go:117] "RemoveContainer" containerID="95a5206d8977bdf896771b7495437ec92f1082c61e752b99e7ba75dda3bd2a35" Dec 06 06:16:05 crc kubenswrapper[4733]: E1206 06:16:05.485530 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:16:19 crc kubenswrapper[4733]: I1206 06:16:19.485477 4733 scope.go:117] "RemoveContainer" containerID="95a5206d8977bdf896771b7495437ec92f1082c61e752b99e7ba75dda3bd2a35" Dec 06 06:16:19 crc kubenswrapper[4733]: E1206 06:16:19.487177 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:16:34 crc kubenswrapper[4733]: I1206 06:16:34.486025 4733 scope.go:117] "RemoveContainer" containerID="95a5206d8977bdf896771b7495437ec92f1082c61e752b99e7ba75dda3bd2a35" Dec 06 06:16:34 crc kubenswrapper[4733]: E1206 06:16:34.487702 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:16:47 crc kubenswrapper[4733]: I1206 06:16:47.484441 4733 scope.go:117] "RemoveContainer" containerID="95a5206d8977bdf896771b7495437ec92f1082c61e752b99e7ba75dda3bd2a35" Dec 06 06:16:48 crc kubenswrapper[4733]: I1206 06:16:48.462663 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" event={"ID":"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448","Type":"ContainerStarted","Data":"e8732ef17439b6776bf46c0bcb254c37c40bb5859ffecc34e09d39650c0b7d3e"} Dec 06 06:17:23 crc kubenswrapper[4733]: I1206 06:17:23.767453 4733 generic.go:334] "Generic (PLEG): container finished" podID="b0eeb4fd-32c5-425a-b938-49572817e476" containerID="d726adad68e9443e3d7e0bfb9c924e86df6c7e1261424ffb95de157749f7732b" exitCode=0 Dec 06 06:17:23 crc kubenswrapper[4733]: I1206 06:17:23.767558 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ffs52" 
event={"ID":"b0eeb4fd-32c5-425a-b938-49572817e476","Type":"ContainerDied","Data":"d726adad68e9443e3d7e0bfb9c924e86df6c7e1261424ffb95de157749f7732b"} Dec 06 06:17:25 crc kubenswrapper[4733]: I1206 06:17:25.115357 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ffs52" Dec 06 06:17:25 crc kubenswrapper[4733]: I1206 06:17:25.256814 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0eeb4fd-32c5-425a-b938-49572817e476-inventory\") pod \"b0eeb4fd-32c5-425a-b938-49572817e476\" (UID: \"b0eeb4fd-32c5-425a-b938-49572817e476\") " Dec 06 06:17:25 crc kubenswrapper[4733]: I1206 06:17:25.256912 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b0eeb4fd-32c5-425a-b938-49572817e476-libvirt-secret-0\") pod \"b0eeb4fd-32c5-425a-b938-49572817e476\" (UID: \"b0eeb4fd-32c5-425a-b938-49572817e476\") " Dec 06 06:17:25 crc kubenswrapper[4733]: I1206 06:17:25.256969 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b0eeb4fd-32c5-425a-b938-49572817e476-ssh-key\") pod \"b0eeb4fd-32c5-425a-b938-49572817e476\" (UID: \"b0eeb4fd-32c5-425a-b938-49572817e476\") " Dec 06 06:17:25 crc kubenswrapper[4733]: I1206 06:17:25.256995 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0eeb4fd-32c5-425a-b938-49572817e476-libvirt-combined-ca-bundle\") pod \"b0eeb4fd-32c5-425a-b938-49572817e476\" (UID: \"b0eeb4fd-32c5-425a-b938-49572817e476\") " Dec 06 06:17:25 crc kubenswrapper[4733]: I1206 06:17:25.257029 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ptgb\" (UniqueName: 
\"kubernetes.io/projected/b0eeb4fd-32c5-425a-b938-49572817e476-kube-api-access-8ptgb\") pod \"b0eeb4fd-32c5-425a-b938-49572817e476\" (UID: \"b0eeb4fd-32c5-425a-b938-49572817e476\") " Dec 06 06:17:25 crc kubenswrapper[4733]: I1206 06:17:25.263909 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0eeb4fd-32c5-425a-b938-49572817e476-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "b0eeb4fd-32c5-425a-b938-49572817e476" (UID: "b0eeb4fd-32c5-425a-b938-49572817e476"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:17:25 crc kubenswrapper[4733]: I1206 06:17:25.263987 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0eeb4fd-32c5-425a-b938-49572817e476-kube-api-access-8ptgb" (OuterVolumeSpecName: "kube-api-access-8ptgb") pod "b0eeb4fd-32c5-425a-b938-49572817e476" (UID: "b0eeb4fd-32c5-425a-b938-49572817e476"). InnerVolumeSpecName "kube-api-access-8ptgb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:17:25 crc kubenswrapper[4733]: E1206 06:17:25.280921 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0eeb4fd-32c5-425a-b938-49572817e476-ssh-key podName:b0eeb4fd-32c5-425a-b938-49572817e476 nodeName:}" failed. No retries permitted until 2025-12-06 06:17:25.78089474 +0000 UTC m=+2029.646105851 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "ssh-key" (UniqueName: "kubernetes.io/secret/b0eeb4fd-32c5-425a-b938-49572817e476-ssh-key") pod "b0eeb4fd-32c5-425a-b938-49572817e476" (UID: "b0eeb4fd-32c5-425a-b938-49572817e476") : error deleting /var/lib/kubelet/pods/b0eeb4fd-32c5-425a-b938-49572817e476/volume-subpaths: remove /var/lib/kubelet/pods/b0eeb4fd-32c5-425a-b938-49572817e476/volume-subpaths: no such file or directory Dec 06 06:17:25 crc kubenswrapper[4733]: I1206 06:17:25.283165 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0eeb4fd-32c5-425a-b938-49572817e476-inventory" (OuterVolumeSpecName: "inventory") pod "b0eeb4fd-32c5-425a-b938-49572817e476" (UID: "b0eeb4fd-32c5-425a-b938-49572817e476"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:17:25 crc kubenswrapper[4733]: I1206 06:17:25.284151 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0eeb4fd-32c5-425a-b938-49572817e476-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "b0eeb4fd-32c5-425a-b938-49572817e476" (UID: "b0eeb4fd-32c5-425a-b938-49572817e476"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:17:25 crc kubenswrapper[4733]: I1206 06:17:25.360158 4733 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0eeb4fd-32c5-425a-b938-49572817e476-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 06:17:25 crc kubenswrapper[4733]: I1206 06:17:25.361008 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ptgb\" (UniqueName: \"kubernetes.io/projected/b0eeb4fd-32c5-425a-b938-49572817e476-kube-api-access-8ptgb\") on node \"crc\" DevicePath \"\"" Dec 06 06:17:25 crc kubenswrapper[4733]: I1206 06:17:25.361026 4733 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0eeb4fd-32c5-425a-b938-49572817e476-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 06:17:25 crc kubenswrapper[4733]: I1206 06:17:25.361038 4733 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b0eeb4fd-32c5-425a-b938-49572817e476-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Dec 06 06:17:25 crc kubenswrapper[4733]: I1206 06:17:25.786048 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ffs52" event={"ID":"b0eeb4fd-32c5-425a-b938-49572817e476","Type":"ContainerDied","Data":"fc68c6d7c59ee8bf6d502d9078da26c86e58e9328953023c53c4e851e07b8e97"} Dec 06 06:17:25 crc kubenswrapper[4733]: I1206 06:17:25.786102 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc68c6d7c59ee8bf6d502d9078da26c86e58e9328953023c53c4e851e07b8e97" Dec 06 06:17:25 crc kubenswrapper[4733]: I1206 06:17:25.786098 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ffs52" Dec 06 06:17:25 crc kubenswrapper[4733]: I1206 06:17:25.860029 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-q5gb8"] Dec 06 06:17:25 crc kubenswrapper[4733]: E1206 06:17:25.860410 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47aba12b-ad2c-4e7e-b386-bb25fec8495c" containerName="collect-profiles" Dec 06 06:17:25 crc kubenswrapper[4733]: I1206 06:17:25.860427 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="47aba12b-ad2c-4e7e-b386-bb25fec8495c" containerName="collect-profiles" Dec 06 06:17:25 crc kubenswrapper[4733]: E1206 06:17:25.860451 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0eeb4fd-32c5-425a-b938-49572817e476" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 06 06:17:25 crc kubenswrapper[4733]: I1206 06:17:25.860459 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0eeb4fd-32c5-425a-b938-49572817e476" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 06 06:17:25 crc kubenswrapper[4733]: I1206 06:17:25.860628 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0eeb4fd-32c5-425a-b938-49572817e476" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 06 06:17:25 crc kubenswrapper[4733]: I1206 06:17:25.860648 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="47aba12b-ad2c-4e7e-b386-bb25fec8495c" containerName="collect-profiles" Dec 06 06:17:25 crc kubenswrapper[4733]: I1206 06:17:25.861260 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q5gb8" Dec 06 06:17:25 crc kubenswrapper[4733]: I1206 06:17:25.864400 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Dec 06 06:17:25 crc kubenswrapper[4733]: I1206 06:17:25.864400 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Dec 06 06:17:25 crc kubenswrapper[4733]: I1206 06:17:25.864781 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Dec 06 06:17:25 crc kubenswrapper[4733]: I1206 06:17:25.866840 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-q5gb8"] Dec 06 06:17:25 crc kubenswrapper[4733]: I1206 06:17:25.870085 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b0eeb4fd-32c5-425a-b938-49572817e476-ssh-key\") pod \"b0eeb4fd-32c5-425a-b938-49572817e476\" (UID: \"b0eeb4fd-32c5-425a-b938-49572817e476\") " Dec 06 06:17:25 crc kubenswrapper[4733]: I1206 06:17:25.870340 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/ee5d47d4-6f8e-45b9-ac60-208196cbb5d7-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q5gb8\" (UID: \"ee5d47d4-6f8e-45b9-ac60-208196cbb5d7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q5gb8" Dec 06 06:17:25 crc kubenswrapper[4733]: I1206 06:17:25.870452 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/ee5d47d4-6f8e-45b9-ac60-208196cbb5d7-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q5gb8\" (UID: \"ee5d47d4-6f8e-45b9-ac60-208196cbb5d7\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q5gb8" Dec 06 06:17:25 crc kubenswrapper[4733]: I1206 06:17:25.870521 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/ee5d47d4-6f8e-45b9-ac60-208196cbb5d7-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q5gb8\" (UID: \"ee5d47d4-6f8e-45b9-ac60-208196cbb5d7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q5gb8" Dec 06 06:17:25 crc kubenswrapper[4733]: I1206 06:17:25.870571 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/ee5d47d4-6f8e-45b9-ac60-208196cbb5d7-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q5gb8\" (UID: \"ee5d47d4-6f8e-45b9-ac60-208196cbb5d7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q5gb8" Dec 06 06:17:25 crc kubenswrapper[4733]: I1206 06:17:25.870632 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/ee5d47d4-6f8e-45b9-ac60-208196cbb5d7-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q5gb8\" (UID: \"ee5d47d4-6f8e-45b9-ac60-208196cbb5d7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q5gb8" Dec 06 06:17:25 crc kubenswrapper[4733]: I1206 06:17:25.870687 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ee5d47d4-6f8e-45b9-ac60-208196cbb5d7-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q5gb8\" (UID: \"ee5d47d4-6f8e-45b9-ac60-208196cbb5d7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q5gb8" Dec 06 06:17:25 crc kubenswrapper[4733]: I1206 06:17:25.870754 4733 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee5d47d4-6f8e-45b9-ac60-208196cbb5d7-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q5gb8\" (UID: \"ee5d47d4-6f8e-45b9-ac60-208196cbb5d7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q5gb8" Dec 06 06:17:25 crc kubenswrapper[4733]: I1206 06:17:25.870793 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ee5d47d4-6f8e-45b9-ac60-208196cbb5d7-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q5gb8\" (UID: \"ee5d47d4-6f8e-45b9-ac60-208196cbb5d7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q5gb8" Dec 06 06:17:25 crc kubenswrapper[4733]: I1206 06:17:25.870836 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cjxn\" (UniqueName: \"kubernetes.io/projected/ee5d47d4-6f8e-45b9-ac60-208196cbb5d7-kube-api-access-7cjxn\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q5gb8\" (UID: \"ee5d47d4-6f8e-45b9-ac60-208196cbb5d7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q5gb8" Dec 06 06:17:25 crc kubenswrapper[4733]: I1206 06:17:25.874753 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0eeb4fd-32c5-425a-b938-49572817e476-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b0eeb4fd-32c5-425a-b938-49572817e476" (UID: "b0eeb4fd-32c5-425a-b938-49572817e476"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:17:25 crc kubenswrapper[4733]: I1206 06:17:25.972110 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/ee5d47d4-6f8e-45b9-ac60-208196cbb5d7-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q5gb8\" (UID: \"ee5d47d4-6f8e-45b9-ac60-208196cbb5d7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q5gb8" Dec 06 06:17:25 crc kubenswrapper[4733]: I1206 06:17:25.972203 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/ee5d47d4-6f8e-45b9-ac60-208196cbb5d7-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q5gb8\" (UID: \"ee5d47d4-6f8e-45b9-ac60-208196cbb5d7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q5gb8" Dec 06 06:17:25 crc kubenswrapper[4733]: I1206 06:17:25.972446 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/ee5d47d4-6f8e-45b9-ac60-208196cbb5d7-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q5gb8\" (UID: \"ee5d47d4-6f8e-45b9-ac60-208196cbb5d7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q5gb8" Dec 06 06:17:25 crc kubenswrapper[4733]: I1206 06:17:25.972537 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/ee5d47d4-6f8e-45b9-ac60-208196cbb5d7-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q5gb8\" (UID: \"ee5d47d4-6f8e-45b9-ac60-208196cbb5d7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q5gb8" Dec 06 06:17:25 crc kubenswrapper[4733]: I1206 06:17:25.972640 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/ee5d47d4-6f8e-45b9-ac60-208196cbb5d7-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q5gb8\" (UID: \"ee5d47d4-6f8e-45b9-ac60-208196cbb5d7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q5gb8" Dec 06 06:17:25 crc kubenswrapper[4733]: I1206 06:17:25.972749 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee5d47d4-6f8e-45b9-ac60-208196cbb5d7-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q5gb8\" (UID: \"ee5d47d4-6f8e-45b9-ac60-208196cbb5d7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q5gb8" Dec 06 06:17:25 crc kubenswrapper[4733]: I1206 06:17:25.972868 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ee5d47d4-6f8e-45b9-ac60-208196cbb5d7-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q5gb8\" (UID: \"ee5d47d4-6f8e-45b9-ac60-208196cbb5d7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q5gb8" Dec 06 06:17:25 crc kubenswrapper[4733]: I1206 06:17:25.973054 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cjxn\" (UniqueName: \"kubernetes.io/projected/ee5d47d4-6f8e-45b9-ac60-208196cbb5d7-kube-api-access-7cjxn\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q5gb8\" (UID: \"ee5d47d4-6f8e-45b9-ac60-208196cbb5d7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q5gb8" Dec 06 06:17:25 crc kubenswrapper[4733]: I1206 06:17:25.973155 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/ee5d47d4-6f8e-45b9-ac60-208196cbb5d7-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q5gb8\" (UID: \"ee5d47d4-6f8e-45b9-ac60-208196cbb5d7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q5gb8" Dec 06 06:17:25 crc kubenswrapper[4733]: 
I1206 06:17:25.973404 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/ee5d47d4-6f8e-45b9-ac60-208196cbb5d7-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q5gb8\" (UID: \"ee5d47d4-6f8e-45b9-ac60-208196cbb5d7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q5gb8" Dec 06 06:17:25 crc kubenswrapper[4733]: I1206 06:17:25.973610 4733 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b0eeb4fd-32c5-425a-b938-49572817e476-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 06:17:25 crc kubenswrapper[4733]: I1206 06:17:25.976930 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/ee5d47d4-6f8e-45b9-ac60-208196cbb5d7-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q5gb8\" (UID: \"ee5d47d4-6f8e-45b9-ac60-208196cbb5d7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q5gb8" Dec 06 06:17:25 crc kubenswrapper[4733]: I1206 06:17:25.978067 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/ee5d47d4-6f8e-45b9-ac60-208196cbb5d7-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q5gb8\" (UID: \"ee5d47d4-6f8e-45b9-ac60-208196cbb5d7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q5gb8" Dec 06 06:17:25 crc kubenswrapper[4733]: I1206 06:17:25.978077 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/ee5d47d4-6f8e-45b9-ac60-208196cbb5d7-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q5gb8\" (UID: \"ee5d47d4-6f8e-45b9-ac60-208196cbb5d7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q5gb8" Dec 06 06:17:25 crc 
kubenswrapper[4733]: I1206 06:17:25.978787 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/ee5d47d4-6f8e-45b9-ac60-208196cbb5d7-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q5gb8\" (UID: \"ee5d47d4-6f8e-45b9-ac60-208196cbb5d7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q5gb8" Dec 06 06:17:25 crc kubenswrapper[4733]: I1206 06:17:25.978966 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ee5d47d4-6f8e-45b9-ac60-208196cbb5d7-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q5gb8\" (UID: \"ee5d47d4-6f8e-45b9-ac60-208196cbb5d7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q5gb8" Dec 06 06:17:25 crc kubenswrapper[4733]: I1206 06:17:25.979450 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ee5d47d4-6f8e-45b9-ac60-208196cbb5d7-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q5gb8\" (UID: \"ee5d47d4-6f8e-45b9-ac60-208196cbb5d7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q5gb8" Dec 06 06:17:25 crc kubenswrapper[4733]: I1206 06:17:25.979730 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee5d47d4-6f8e-45b9-ac60-208196cbb5d7-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q5gb8\" (UID: \"ee5d47d4-6f8e-45b9-ac60-208196cbb5d7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q5gb8" Dec 06 06:17:25 crc kubenswrapper[4733]: I1206 06:17:25.988100 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cjxn\" (UniqueName: \"kubernetes.io/projected/ee5d47d4-6f8e-45b9-ac60-208196cbb5d7-kube-api-access-7cjxn\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q5gb8\" (UID: 
\"ee5d47d4-6f8e-45b9-ac60-208196cbb5d7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q5gb8" Dec 06 06:17:26 crc kubenswrapper[4733]: I1206 06:17:26.208371 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q5gb8" Dec 06 06:17:26 crc kubenswrapper[4733]: I1206 06:17:26.680846 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-q5gb8"] Dec 06 06:17:26 crc kubenswrapper[4733]: I1206 06:17:26.797532 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q5gb8" event={"ID":"ee5d47d4-6f8e-45b9-ac60-208196cbb5d7","Type":"ContainerStarted","Data":"1139ad8afff3eb495d06e22987d33f7aa5aec638fa0478af05b3141956095b0c"} Dec 06 06:17:27 crc kubenswrapper[4733]: I1206 06:17:27.805973 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q5gb8" event={"ID":"ee5d47d4-6f8e-45b9-ac60-208196cbb5d7","Type":"ContainerStarted","Data":"4e8d0a753f8a1926b88398ae80c4db70ad29764fb639b9baeb14eff618b4ef85"} Dec 06 06:17:27 crc kubenswrapper[4733]: I1206 06:17:27.827223 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q5gb8" podStartSLOduration=2.3513837 podStartE2EDuration="2.827206791s" podCreationTimestamp="2025-12-06 06:17:25 +0000 UTC" firstStartedPulling="2025-12-06 06:17:26.685746732 +0000 UTC m=+2030.550957843" lastFinishedPulling="2025-12-06 06:17:27.161569823 +0000 UTC m=+2031.026780934" observedRunningTime="2025-12-06 06:17:27.82487705 +0000 UTC m=+2031.690088161" watchObservedRunningTime="2025-12-06 06:17:27.827206791 +0000 UTC m=+2031.692417902" Dec 06 06:18:46 crc kubenswrapper[4733]: I1206 06:18:46.851387 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vmphp"] Dec 06 06:18:46 crc 
kubenswrapper[4733]: I1206 06:18:46.853862 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vmphp" Dec 06 06:18:46 crc kubenswrapper[4733]: I1206 06:18:46.862540 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vmphp"] Dec 06 06:18:46 crc kubenswrapper[4733]: I1206 06:18:46.913679 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac291d28-5eec-4e33-9fb2-c246e41ffc12-catalog-content\") pod \"community-operators-vmphp\" (UID: \"ac291d28-5eec-4e33-9fb2-c246e41ffc12\") " pod="openshift-marketplace/community-operators-vmphp" Dec 06 06:18:46 crc kubenswrapper[4733]: I1206 06:18:46.913784 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zs4w\" (UniqueName: \"kubernetes.io/projected/ac291d28-5eec-4e33-9fb2-c246e41ffc12-kube-api-access-7zs4w\") pod \"community-operators-vmphp\" (UID: \"ac291d28-5eec-4e33-9fb2-c246e41ffc12\") " pod="openshift-marketplace/community-operators-vmphp" Dec 06 06:18:46 crc kubenswrapper[4733]: I1206 06:18:46.913872 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac291d28-5eec-4e33-9fb2-c246e41ffc12-utilities\") pod \"community-operators-vmphp\" (UID: \"ac291d28-5eec-4e33-9fb2-c246e41ffc12\") " pod="openshift-marketplace/community-operators-vmphp" Dec 06 06:18:47 crc kubenswrapper[4733]: I1206 06:18:47.015710 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac291d28-5eec-4e33-9fb2-c246e41ffc12-utilities\") pod \"community-operators-vmphp\" (UID: \"ac291d28-5eec-4e33-9fb2-c246e41ffc12\") " pod="openshift-marketplace/community-operators-vmphp" Dec 06 06:18:47 crc 
kubenswrapper[4733]: I1206 06:18:47.015794 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac291d28-5eec-4e33-9fb2-c246e41ffc12-catalog-content\") pod \"community-operators-vmphp\" (UID: \"ac291d28-5eec-4e33-9fb2-c246e41ffc12\") " pod="openshift-marketplace/community-operators-vmphp" Dec 06 06:18:47 crc kubenswrapper[4733]: I1206 06:18:47.015867 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zs4w\" (UniqueName: \"kubernetes.io/projected/ac291d28-5eec-4e33-9fb2-c246e41ffc12-kube-api-access-7zs4w\") pod \"community-operators-vmphp\" (UID: \"ac291d28-5eec-4e33-9fb2-c246e41ffc12\") " pod="openshift-marketplace/community-operators-vmphp" Dec 06 06:18:47 crc kubenswrapper[4733]: I1206 06:18:47.016183 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac291d28-5eec-4e33-9fb2-c246e41ffc12-utilities\") pod \"community-operators-vmphp\" (UID: \"ac291d28-5eec-4e33-9fb2-c246e41ffc12\") " pod="openshift-marketplace/community-operators-vmphp" Dec 06 06:18:47 crc kubenswrapper[4733]: I1206 06:18:47.016246 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac291d28-5eec-4e33-9fb2-c246e41ffc12-catalog-content\") pod \"community-operators-vmphp\" (UID: \"ac291d28-5eec-4e33-9fb2-c246e41ffc12\") " pod="openshift-marketplace/community-operators-vmphp" Dec 06 06:18:47 crc kubenswrapper[4733]: I1206 06:18:47.032980 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zs4w\" (UniqueName: \"kubernetes.io/projected/ac291d28-5eec-4e33-9fb2-c246e41ffc12-kube-api-access-7zs4w\") pod \"community-operators-vmphp\" (UID: \"ac291d28-5eec-4e33-9fb2-c246e41ffc12\") " pod="openshift-marketplace/community-operators-vmphp" Dec 06 06:18:47 crc kubenswrapper[4733]: I1206 
06:18:47.174716 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vmphp" Dec 06 06:18:47 crc kubenswrapper[4733]: I1206 06:18:47.662354 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vmphp"] Dec 06 06:18:48 crc kubenswrapper[4733]: I1206 06:18:48.536884 4733 generic.go:334] "Generic (PLEG): container finished" podID="ac291d28-5eec-4e33-9fb2-c246e41ffc12" containerID="9e7885006fd70b25c2c7ad178e13c3f5dba6b9c5563aa99d7b4d14fdd2c8d578" exitCode=0 Dec 06 06:18:48 crc kubenswrapper[4733]: I1206 06:18:48.537054 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vmphp" event={"ID":"ac291d28-5eec-4e33-9fb2-c246e41ffc12","Type":"ContainerDied","Data":"9e7885006fd70b25c2c7ad178e13c3f5dba6b9c5563aa99d7b4d14fdd2c8d578"} Dec 06 06:18:48 crc kubenswrapper[4733]: I1206 06:18:48.537324 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vmphp" event={"ID":"ac291d28-5eec-4e33-9fb2-c246e41ffc12","Type":"ContainerStarted","Data":"30007484329b0fe8b78d8a1610c9e9c5edc7ebc0f271caa598e03a361959a825"} Dec 06 06:18:49 crc kubenswrapper[4733]: I1206 06:18:49.548797 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vmphp" event={"ID":"ac291d28-5eec-4e33-9fb2-c246e41ffc12","Type":"ContainerStarted","Data":"854c154fb9bc36685ede686163cdc16ed03a6a5f6167838f418d98c226c053ad"} Dec 06 06:18:50 crc kubenswrapper[4733]: I1206 06:18:50.559069 4733 generic.go:334] "Generic (PLEG): container finished" podID="ac291d28-5eec-4e33-9fb2-c246e41ffc12" containerID="854c154fb9bc36685ede686163cdc16ed03a6a5f6167838f418d98c226c053ad" exitCode=0 Dec 06 06:18:50 crc kubenswrapper[4733]: I1206 06:18:50.559148 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vmphp" 
event={"ID":"ac291d28-5eec-4e33-9fb2-c246e41ffc12","Type":"ContainerDied","Data":"854c154fb9bc36685ede686163cdc16ed03a6a5f6167838f418d98c226c053ad"} Dec 06 06:18:51 crc kubenswrapper[4733]: I1206 06:18:51.572263 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vmphp" event={"ID":"ac291d28-5eec-4e33-9fb2-c246e41ffc12","Type":"ContainerStarted","Data":"6f3fa3ae12dce546f8171221d5701a762084664f3ddccc09db13f40b4dc7597d"} Dec 06 06:18:51 crc kubenswrapper[4733]: I1206 06:18:51.593163 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vmphp" podStartSLOduration=3.09417765 podStartE2EDuration="5.593143928s" podCreationTimestamp="2025-12-06 06:18:46 +0000 UTC" firstStartedPulling="2025-12-06 06:18:48.538447899 +0000 UTC m=+2112.403659010" lastFinishedPulling="2025-12-06 06:18:51.037414177 +0000 UTC m=+2114.902625288" observedRunningTime="2025-12-06 06:18:51.586619191 +0000 UTC m=+2115.451830302" watchObservedRunningTime="2025-12-06 06:18:51.593143928 +0000 UTC m=+2115.458355040" Dec 06 06:18:57 crc kubenswrapper[4733]: I1206 06:18:57.175178 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vmphp" Dec 06 06:18:57 crc kubenswrapper[4733]: I1206 06:18:57.175835 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vmphp" Dec 06 06:18:57 crc kubenswrapper[4733]: I1206 06:18:57.214989 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vmphp" Dec 06 06:18:57 crc kubenswrapper[4733]: I1206 06:18:57.659725 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vmphp" Dec 06 06:18:57 crc kubenswrapper[4733]: I1206 06:18:57.713558 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-vmphp"] Dec 06 06:18:59 crc kubenswrapper[4733]: I1206 06:18:59.634011 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vmphp" podUID="ac291d28-5eec-4e33-9fb2-c246e41ffc12" containerName="registry-server" containerID="cri-o://6f3fa3ae12dce546f8171221d5701a762084664f3ddccc09db13f40b4dc7597d" gracePeriod=2 Dec 06 06:19:00 crc kubenswrapper[4733]: I1206 06:19:00.034543 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vmphp" Dec 06 06:19:00 crc kubenswrapper[4733]: I1206 06:19:00.075319 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zs4w\" (UniqueName: \"kubernetes.io/projected/ac291d28-5eec-4e33-9fb2-c246e41ffc12-kube-api-access-7zs4w\") pod \"ac291d28-5eec-4e33-9fb2-c246e41ffc12\" (UID: \"ac291d28-5eec-4e33-9fb2-c246e41ffc12\") " Dec 06 06:19:00 crc kubenswrapper[4733]: I1206 06:19:00.075362 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac291d28-5eec-4e33-9fb2-c246e41ffc12-utilities\") pod \"ac291d28-5eec-4e33-9fb2-c246e41ffc12\" (UID: \"ac291d28-5eec-4e33-9fb2-c246e41ffc12\") " Dec 06 06:19:00 crc kubenswrapper[4733]: I1206 06:19:00.075442 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac291d28-5eec-4e33-9fb2-c246e41ffc12-catalog-content\") pod \"ac291d28-5eec-4e33-9fb2-c246e41ffc12\" (UID: \"ac291d28-5eec-4e33-9fb2-c246e41ffc12\") " Dec 06 06:19:00 crc kubenswrapper[4733]: I1206 06:19:00.076109 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac291d28-5eec-4e33-9fb2-c246e41ffc12-utilities" (OuterVolumeSpecName: "utilities") pod "ac291d28-5eec-4e33-9fb2-c246e41ffc12" (UID: 
"ac291d28-5eec-4e33-9fb2-c246e41ffc12"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:19:00 crc kubenswrapper[4733]: I1206 06:19:00.082096 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac291d28-5eec-4e33-9fb2-c246e41ffc12-kube-api-access-7zs4w" (OuterVolumeSpecName: "kube-api-access-7zs4w") pod "ac291d28-5eec-4e33-9fb2-c246e41ffc12" (UID: "ac291d28-5eec-4e33-9fb2-c246e41ffc12"). InnerVolumeSpecName "kube-api-access-7zs4w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:19:00 crc kubenswrapper[4733]: I1206 06:19:00.118226 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac291d28-5eec-4e33-9fb2-c246e41ffc12-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ac291d28-5eec-4e33-9fb2-c246e41ffc12" (UID: "ac291d28-5eec-4e33-9fb2-c246e41ffc12"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:19:00 crc kubenswrapper[4733]: I1206 06:19:00.179090 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7zs4w\" (UniqueName: \"kubernetes.io/projected/ac291d28-5eec-4e33-9fb2-c246e41ffc12-kube-api-access-7zs4w\") on node \"crc\" DevicePath \"\"" Dec 06 06:19:00 crc kubenswrapper[4733]: I1206 06:19:00.179136 4733 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac291d28-5eec-4e33-9fb2-c246e41ffc12-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 06:19:00 crc kubenswrapper[4733]: I1206 06:19:00.179153 4733 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac291d28-5eec-4e33-9fb2-c246e41ffc12-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 06:19:00 crc kubenswrapper[4733]: I1206 06:19:00.646833 4733 generic.go:334] "Generic (PLEG): container finished" 
podID="ac291d28-5eec-4e33-9fb2-c246e41ffc12" containerID="6f3fa3ae12dce546f8171221d5701a762084664f3ddccc09db13f40b4dc7597d" exitCode=0 Dec 06 06:19:00 crc kubenswrapper[4733]: I1206 06:19:00.646898 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vmphp" event={"ID":"ac291d28-5eec-4e33-9fb2-c246e41ffc12","Type":"ContainerDied","Data":"6f3fa3ae12dce546f8171221d5701a762084664f3ddccc09db13f40b4dc7597d"} Dec 06 06:19:00 crc kubenswrapper[4733]: I1206 06:19:00.646928 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vmphp" Dec 06 06:19:00 crc kubenswrapper[4733]: I1206 06:19:00.646953 4733 scope.go:117] "RemoveContainer" containerID="6f3fa3ae12dce546f8171221d5701a762084664f3ddccc09db13f40b4dc7597d" Dec 06 06:19:00 crc kubenswrapper[4733]: I1206 06:19:00.646940 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vmphp" event={"ID":"ac291d28-5eec-4e33-9fb2-c246e41ffc12","Type":"ContainerDied","Data":"30007484329b0fe8b78d8a1610c9e9c5edc7ebc0f271caa598e03a361959a825"} Dec 06 06:19:00 crc kubenswrapper[4733]: I1206 06:19:00.668735 4733 scope.go:117] "RemoveContainer" containerID="854c154fb9bc36685ede686163cdc16ed03a6a5f6167838f418d98c226c053ad" Dec 06 06:19:00 crc kubenswrapper[4733]: I1206 06:19:00.673224 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vmphp"] Dec 06 06:19:00 crc kubenswrapper[4733]: I1206 06:19:00.680259 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vmphp"] Dec 06 06:19:00 crc kubenswrapper[4733]: I1206 06:19:00.688818 4733 scope.go:117] "RemoveContainer" containerID="9e7885006fd70b25c2c7ad178e13c3f5dba6b9c5563aa99d7b4d14fdd2c8d578" Dec 06 06:19:00 crc kubenswrapper[4733]: I1206 06:19:00.723366 4733 scope.go:117] "RemoveContainer" 
containerID="6f3fa3ae12dce546f8171221d5701a762084664f3ddccc09db13f40b4dc7597d" Dec 06 06:19:00 crc kubenswrapper[4733]: E1206 06:19:00.723754 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f3fa3ae12dce546f8171221d5701a762084664f3ddccc09db13f40b4dc7597d\": container with ID starting with 6f3fa3ae12dce546f8171221d5701a762084664f3ddccc09db13f40b4dc7597d not found: ID does not exist" containerID="6f3fa3ae12dce546f8171221d5701a762084664f3ddccc09db13f40b4dc7597d" Dec 06 06:19:00 crc kubenswrapper[4733]: I1206 06:19:00.723785 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f3fa3ae12dce546f8171221d5701a762084664f3ddccc09db13f40b4dc7597d"} err="failed to get container status \"6f3fa3ae12dce546f8171221d5701a762084664f3ddccc09db13f40b4dc7597d\": rpc error: code = NotFound desc = could not find container \"6f3fa3ae12dce546f8171221d5701a762084664f3ddccc09db13f40b4dc7597d\": container with ID starting with 6f3fa3ae12dce546f8171221d5701a762084664f3ddccc09db13f40b4dc7597d not found: ID does not exist" Dec 06 06:19:00 crc kubenswrapper[4733]: I1206 06:19:00.723806 4733 scope.go:117] "RemoveContainer" containerID="854c154fb9bc36685ede686163cdc16ed03a6a5f6167838f418d98c226c053ad" Dec 06 06:19:00 crc kubenswrapper[4733]: E1206 06:19:00.724055 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"854c154fb9bc36685ede686163cdc16ed03a6a5f6167838f418d98c226c053ad\": container with ID starting with 854c154fb9bc36685ede686163cdc16ed03a6a5f6167838f418d98c226c053ad not found: ID does not exist" containerID="854c154fb9bc36685ede686163cdc16ed03a6a5f6167838f418d98c226c053ad" Dec 06 06:19:00 crc kubenswrapper[4733]: I1206 06:19:00.724087 4733 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"854c154fb9bc36685ede686163cdc16ed03a6a5f6167838f418d98c226c053ad"} err="failed to get container status \"854c154fb9bc36685ede686163cdc16ed03a6a5f6167838f418d98c226c053ad\": rpc error: code = NotFound desc = could not find container \"854c154fb9bc36685ede686163cdc16ed03a6a5f6167838f418d98c226c053ad\": container with ID starting with 854c154fb9bc36685ede686163cdc16ed03a6a5f6167838f418d98c226c053ad not found: ID does not exist" Dec 06 06:19:00 crc kubenswrapper[4733]: I1206 06:19:00.724110 4733 scope.go:117] "RemoveContainer" containerID="9e7885006fd70b25c2c7ad178e13c3f5dba6b9c5563aa99d7b4d14fdd2c8d578" Dec 06 06:19:00 crc kubenswrapper[4733]: E1206 06:19:00.724361 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e7885006fd70b25c2c7ad178e13c3f5dba6b9c5563aa99d7b4d14fdd2c8d578\": container with ID starting with 9e7885006fd70b25c2c7ad178e13c3f5dba6b9c5563aa99d7b4d14fdd2c8d578 not found: ID does not exist" containerID="9e7885006fd70b25c2c7ad178e13c3f5dba6b9c5563aa99d7b4d14fdd2c8d578" Dec 06 06:19:00 crc kubenswrapper[4733]: I1206 06:19:00.724387 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e7885006fd70b25c2c7ad178e13c3f5dba6b9c5563aa99d7b4d14fdd2c8d578"} err="failed to get container status \"9e7885006fd70b25c2c7ad178e13c3f5dba6b9c5563aa99d7b4d14fdd2c8d578\": rpc error: code = NotFound desc = could not find container \"9e7885006fd70b25c2c7ad178e13c3f5dba6b9c5563aa99d7b4d14fdd2c8d578\": container with ID starting with 9e7885006fd70b25c2c7ad178e13c3f5dba6b9c5563aa99d7b4d14fdd2c8d578 not found: ID does not exist" Dec 06 06:19:02 crc kubenswrapper[4733]: I1206 06:19:02.494018 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac291d28-5eec-4e33-9fb2-c246e41ffc12" path="/var/lib/kubelet/pods/ac291d28-5eec-4e33-9fb2-c246e41ffc12/volumes" Dec 06 06:19:12 crc kubenswrapper[4733]: I1206 
06:19:12.988997 4733 patch_prober.go:28] interesting pod/machine-config-daemon-g7qjx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 06:19:12 crc kubenswrapper[4733]: I1206 06:19:12.990392 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 06:19:21 crc kubenswrapper[4733]: I1206 06:19:21.834273 4733 generic.go:334] "Generic (PLEG): container finished" podID="ee5d47d4-6f8e-45b9-ac60-208196cbb5d7" containerID="4e8d0a753f8a1926b88398ae80c4db70ad29764fb639b9baeb14eff618b4ef85" exitCode=0 Dec 06 06:19:21 crc kubenswrapper[4733]: I1206 06:19:21.834880 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q5gb8" event={"ID":"ee5d47d4-6f8e-45b9-ac60-208196cbb5d7","Type":"ContainerDied","Data":"4e8d0a753f8a1926b88398ae80c4db70ad29764fb639b9baeb14eff618b4ef85"} Dec 06 06:19:23 crc kubenswrapper[4733]: I1206 06:19:23.209782 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q5gb8" Dec 06 06:19:23 crc kubenswrapper[4733]: I1206 06:19:23.348778 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/ee5d47d4-6f8e-45b9-ac60-208196cbb5d7-nova-migration-ssh-key-1\") pod \"ee5d47d4-6f8e-45b9-ac60-208196cbb5d7\" (UID: \"ee5d47d4-6f8e-45b9-ac60-208196cbb5d7\") " Dec 06 06:19:23 crc kubenswrapper[4733]: I1206 06:19:23.348866 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/ee5d47d4-6f8e-45b9-ac60-208196cbb5d7-nova-cell1-compute-config-0\") pod \"ee5d47d4-6f8e-45b9-ac60-208196cbb5d7\" (UID: \"ee5d47d4-6f8e-45b9-ac60-208196cbb5d7\") " Dec 06 06:19:23 crc kubenswrapper[4733]: I1206 06:19:23.348983 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ee5d47d4-6f8e-45b9-ac60-208196cbb5d7-ssh-key\") pod \"ee5d47d4-6f8e-45b9-ac60-208196cbb5d7\" (UID: \"ee5d47d4-6f8e-45b9-ac60-208196cbb5d7\") " Dec 06 06:19:23 crc kubenswrapper[4733]: I1206 06:19:23.349093 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee5d47d4-6f8e-45b9-ac60-208196cbb5d7-nova-combined-ca-bundle\") pod \"ee5d47d4-6f8e-45b9-ac60-208196cbb5d7\" (UID: \"ee5d47d4-6f8e-45b9-ac60-208196cbb5d7\") " Dec 06 06:19:23 crc kubenswrapper[4733]: I1206 06:19:23.349112 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/ee5d47d4-6f8e-45b9-ac60-208196cbb5d7-nova-migration-ssh-key-0\") pod \"ee5d47d4-6f8e-45b9-ac60-208196cbb5d7\" (UID: \"ee5d47d4-6f8e-45b9-ac60-208196cbb5d7\") " Dec 06 06:19:23 crc kubenswrapper[4733]: I1206 06:19:23.349185 4733 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ee5d47d4-6f8e-45b9-ac60-208196cbb5d7-inventory\") pod \"ee5d47d4-6f8e-45b9-ac60-208196cbb5d7\" (UID: \"ee5d47d4-6f8e-45b9-ac60-208196cbb5d7\") " Dec 06 06:19:23 crc kubenswrapper[4733]: I1206 06:19:23.349239 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7cjxn\" (UniqueName: \"kubernetes.io/projected/ee5d47d4-6f8e-45b9-ac60-208196cbb5d7-kube-api-access-7cjxn\") pod \"ee5d47d4-6f8e-45b9-ac60-208196cbb5d7\" (UID: \"ee5d47d4-6f8e-45b9-ac60-208196cbb5d7\") " Dec 06 06:19:23 crc kubenswrapper[4733]: I1206 06:19:23.349318 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/ee5d47d4-6f8e-45b9-ac60-208196cbb5d7-nova-cell1-compute-config-1\") pod \"ee5d47d4-6f8e-45b9-ac60-208196cbb5d7\" (UID: \"ee5d47d4-6f8e-45b9-ac60-208196cbb5d7\") " Dec 06 06:19:23 crc kubenswrapper[4733]: I1206 06:19:23.349407 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/ee5d47d4-6f8e-45b9-ac60-208196cbb5d7-nova-extra-config-0\") pod \"ee5d47d4-6f8e-45b9-ac60-208196cbb5d7\" (UID: \"ee5d47d4-6f8e-45b9-ac60-208196cbb5d7\") " Dec 06 06:19:23 crc kubenswrapper[4733]: I1206 06:19:23.357535 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee5d47d4-6f8e-45b9-ac60-208196cbb5d7-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "ee5d47d4-6f8e-45b9-ac60-208196cbb5d7" (UID: "ee5d47d4-6f8e-45b9-ac60-208196cbb5d7"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:19:23 crc kubenswrapper[4733]: I1206 06:19:23.358120 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee5d47d4-6f8e-45b9-ac60-208196cbb5d7-kube-api-access-7cjxn" (OuterVolumeSpecName: "kube-api-access-7cjxn") pod "ee5d47d4-6f8e-45b9-ac60-208196cbb5d7" (UID: "ee5d47d4-6f8e-45b9-ac60-208196cbb5d7"). InnerVolumeSpecName "kube-api-access-7cjxn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:19:23 crc kubenswrapper[4733]: I1206 06:19:23.378265 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee5d47d4-6f8e-45b9-ac60-208196cbb5d7-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ee5d47d4-6f8e-45b9-ac60-208196cbb5d7" (UID: "ee5d47d4-6f8e-45b9-ac60-208196cbb5d7"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:19:23 crc kubenswrapper[4733]: I1206 06:19:23.380412 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee5d47d4-6f8e-45b9-ac60-208196cbb5d7-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "ee5d47d4-6f8e-45b9-ac60-208196cbb5d7" (UID: "ee5d47d4-6f8e-45b9-ac60-208196cbb5d7"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:19:23 crc kubenswrapper[4733]: I1206 06:19:23.380536 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee5d47d4-6f8e-45b9-ac60-208196cbb5d7-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "ee5d47d4-6f8e-45b9-ac60-208196cbb5d7" (UID: "ee5d47d4-6f8e-45b9-ac60-208196cbb5d7"). InnerVolumeSpecName "nova-extra-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:19:23 crc kubenswrapper[4733]: I1206 06:19:23.381009 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee5d47d4-6f8e-45b9-ac60-208196cbb5d7-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "ee5d47d4-6f8e-45b9-ac60-208196cbb5d7" (UID: "ee5d47d4-6f8e-45b9-ac60-208196cbb5d7"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:19:23 crc kubenswrapper[4733]: I1206 06:19:23.381578 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee5d47d4-6f8e-45b9-ac60-208196cbb5d7-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "ee5d47d4-6f8e-45b9-ac60-208196cbb5d7" (UID: "ee5d47d4-6f8e-45b9-ac60-208196cbb5d7"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:19:23 crc kubenswrapper[4733]: I1206 06:19:23.382429 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee5d47d4-6f8e-45b9-ac60-208196cbb5d7-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "ee5d47d4-6f8e-45b9-ac60-208196cbb5d7" (UID: "ee5d47d4-6f8e-45b9-ac60-208196cbb5d7"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:19:23 crc kubenswrapper[4733]: I1206 06:19:23.388461 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee5d47d4-6f8e-45b9-ac60-208196cbb5d7-inventory" (OuterVolumeSpecName: "inventory") pod "ee5d47d4-6f8e-45b9-ac60-208196cbb5d7" (UID: "ee5d47d4-6f8e-45b9-ac60-208196cbb5d7"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:19:23 crc kubenswrapper[4733]: I1206 06:19:23.452785 4733 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee5d47d4-6f8e-45b9-ac60-208196cbb5d7-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 06:19:23 crc kubenswrapper[4733]: I1206 06:19:23.452816 4733 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/ee5d47d4-6f8e-45b9-ac60-208196cbb5d7-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Dec 06 06:19:23 crc kubenswrapper[4733]: I1206 06:19:23.452827 4733 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ee5d47d4-6f8e-45b9-ac60-208196cbb5d7-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 06:19:23 crc kubenswrapper[4733]: I1206 06:19:23.452836 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7cjxn\" (UniqueName: \"kubernetes.io/projected/ee5d47d4-6f8e-45b9-ac60-208196cbb5d7-kube-api-access-7cjxn\") on node \"crc\" DevicePath \"\"" Dec 06 06:19:23 crc kubenswrapper[4733]: I1206 06:19:23.452846 4733 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/ee5d47d4-6f8e-45b9-ac60-208196cbb5d7-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Dec 06 06:19:23 crc kubenswrapper[4733]: I1206 06:19:23.452855 4733 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/ee5d47d4-6f8e-45b9-ac60-208196cbb5d7-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Dec 06 06:19:23 crc kubenswrapper[4733]: I1206 06:19:23.452863 4733 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/ee5d47d4-6f8e-45b9-ac60-208196cbb5d7-nova-migration-ssh-key-1\") on node \"crc\" 
DevicePath \"\"" Dec 06 06:19:23 crc kubenswrapper[4733]: I1206 06:19:23.452871 4733 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/ee5d47d4-6f8e-45b9-ac60-208196cbb5d7-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Dec 06 06:19:23 crc kubenswrapper[4733]: I1206 06:19:23.452880 4733 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ee5d47d4-6f8e-45b9-ac60-208196cbb5d7-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 06:19:23 crc kubenswrapper[4733]: I1206 06:19:23.853151 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q5gb8" event={"ID":"ee5d47d4-6f8e-45b9-ac60-208196cbb5d7","Type":"ContainerDied","Data":"1139ad8afff3eb495d06e22987d33f7aa5aec638fa0478af05b3141956095b0c"} Dec 06 06:19:23 crc kubenswrapper[4733]: I1206 06:19:23.853520 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1139ad8afff3eb495d06e22987d33f7aa5aec638fa0478af05b3141956095b0c" Dec 06 06:19:23 crc kubenswrapper[4733]: I1206 06:19:23.853215 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q5gb8" Dec 06 06:19:23 crc kubenswrapper[4733]: I1206 06:19:23.926487 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7zdf6"] Dec 06 06:19:23 crc kubenswrapper[4733]: E1206 06:19:23.926866 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac291d28-5eec-4e33-9fb2-c246e41ffc12" containerName="registry-server" Dec 06 06:19:23 crc kubenswrapper[4733]: I1206 06:19:23.926885 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac291d28-5eec-4e33-9fb2-c246e41ffc12" containerName="registry-server" Dec 06 06:19:23 crc kubenswrapper[4733]: E1206 06:19:23.926900 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac291d28-5eec-4e33-9fb2-c246e41ffc12" containerName="extract-utilities" Dec 06 06:19:23 crc kubenswrapper[4733]: I1206 06:19:23.926907 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac291d28-5eec-4e33-9fb2-c246e41ffc12" containerName="extract-utilities" Dec 06 06:19:23 crc kubenswrapper[4733]: E1206 06:19:23.926919 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee5d47d4-6f8e-45b9-ac60-208196cbb5d7" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 06 06:19:23 crc kubenswrapper[4733]: I1206 06:19:23.932712 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee5d47d4-6f8e-45b9-ac60-208196cbb5d7" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 06 06:19:23 crc kubenswrapper[4733]: E1206 06:19:23.932746 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac291d28-5eec-4e33-9fb2-c246e41ffc12" containerName="extract-content" Dec 06 06:19:23 crc kubenswrapper[4733]: I1206 06:19:23.932752 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac291d28-5eec-4e33-9fb2-c246e41ffc12" containerName="extract-content" Dec 06 06:19:23 crc kubenswrapper[4733]: I1206 06:19:23.932925 4733 
memory_manager.go:354] "RemoveStaleState removing state" podUID="ee5d47d4-6f8e-45b9-ac60-208196cbb5d7" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 06 06:19:23 crc kubenswrapper[4733]: I1206 06:19:23.932942 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac291d28-5eec-4e33-9fb2-c246e41ffc12" containerName="registry-server" Dec 06 06:19:23 crc kubenswrapper[4733]: I1206 06:19:23.933470 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7zdf6"] Dec 06 06:19:23 crc kubenswrapper[4733]: I1206 06:19:23.933563 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7zdf6" Dec 06 06:19:23 crc kubenswrapper[4733]: I1206 06:19:23.940554 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 06 06:19:23 crc kubenswrapper[4733]: I1206 06:19:23.940871 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Dec 06 06:19:23 crc kubenswrapper[4733]: I1206 06:19:23.941107 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 06 06:19:23 crc kubenswrapper[4733]: I1206 06:19:23.941380 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7jxr9" Dec 06 06:19:23 crc kubenswrapper[4733]: I1206 06:19:23.942053 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 06 06:19:23 crc kubenswrapper[4733]: I1206 06:19:23.964012 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0e43d41-4e58-4467-99d0-b782a2f2d65a-telemetry-combined-ca-bundle\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-7zdf6\" (UID: \"a0e43d41-4e58-4467-99d0-b782a2f2d65a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7zdf6" Dec 06 06:19:23 crc kubenswrapper[4733]: I1206 06:19:23.964058 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0e43d41-4e58-4467-99d0-b782a2f2d65a-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7zdf6\" (UID: \"a0e43d41-4e58-4467-99d0-b782a2f2d65a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7zdf6" Dec 06 06:19:23 crc kubenswrapper[4733]: I1206 06:19:23.964105 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a0e43d41-4e58-4467-99d0-b782a2f2d65a-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7zdf6\" (UID: \"a0e43d41-4e58-4467-99d0-b782a2f2d65a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7zdf6" Dec 06 06:19:23 crc kubenswrapper[4733]: I1206 06:19:23.964174 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a0e43d41-4e58-4467-99d0-b782a2f2d65a-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7zdf6\" (UID: \"a0e43d41-4e58-4467-99d0-b782a2f2d65a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7zdf6" Dec 06 06:19:23 crc kubenswrapper[4733]: I1206 06:19:23.964213 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a0e43d41-4e58-4467-99d0-b782a2f2d65a-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7zdf6\" (UID: \"a0e43d41-4e58-4467-99d0-b782a2f2d65a\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7zdf6" Dec 06 06:19:23 crc kubenswrapper[4733]: I1206 06:19:23.964274 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6phh\" (UniqueName: \"kubernetes.io/projected/a0e43d41-4e58-4467-99d0-b782a2f2d65a-kube-api-access-l6phh\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7zdf6\" (UID: \"a0e43d41-4e58-4467-99d0-b782a2f2d65a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7zdf6" Dec 06 06:19:23 crc kubenswrapper[4733]: I1206 06:19:23.964605 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a0e43d41-4e58-4467-99d0-b782a2f2d65a-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7zdf6\" (UID: \"a0e43d41-4e58-4467-99d0-b782a2f2d65a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7zdf6" Dec 06 06:19:24 crc kubenswrapper[4733]: I1206 06:19:24.065865 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a0e43d41-4e58-4467-99d0-b782a2f2d65a-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7zdf6\" (UID: \"a0e43d41-4e58-4467-99d0-b782a2f2d65a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7zdf6" Dec 06 06:19:24 crc kubenswrapper[4733]: I1206 06:19:24.065957 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0e43d41-4e58-4467-99d0-b782a2f2d65a-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7zdf6\" (UID: \"a0e43d41-4e58-4467-99d0-b782a2f2d65a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7zdf6" Dec 06 06:19:24 crc kubenswrapper[4733]: I1206 
06:19:24.065985 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0e43d41-4e58-4467-99d0-b782a2f2d65a-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7zdf6\" (UID: \"a0e43d41-4e58-4467-99d0-b782a2f2d65a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7zdf6" Dec 06 06:19:24 crc kubenswrapper[4733]: I1206 06:19:24.066023 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a0e43d41-4e58-4467-99d0-b782a2f2d65a-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7zdf6\" (UID: \"a0e43d41-4e58-4467-99d0-b782a2f2d65a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7zdf6" Dec 06 06:19:24 crc kubenswrapper[4733]: I1206 06:19:24.066052 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a0e43d41-4e58-4467-99d0-b782a2f2d65a-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7zdf6\" (UID: \"a0e43d41-4e58-4467-99d0-b782a2f2d65a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7zdf6" Dec 06 06:19:24 crc kubenswrapper[4733]: I1206 06:19:24.066071 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a0e43d41-4e58-4467-99d0-b782a2f2d65a-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7zdf6\" (UID: \"a0e43d41-4e58-4467-99d0-b782a2f2d65a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7zdf6" Dec 06 06:19:24 crc kubenswrapper[4733]: I1206 06:19:24.066119 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6phh\" (UniqueName: \"kubernetes.io/projected/a0e43d41-4e58-4467-99d0-b782a2f2d65a-kube-api-access-l6phh\") 
pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7zdf6\" (UID: \"a0e43d41-4e58-4467-99d0-b782a2f2d65a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7zdf6" Dec 06 06:19:24 crc kubenswrapper[4733]: I1206 06:19:24.070544 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a0e43d41-4e58-4467-99d0-b782a2f2d65a-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7zdf6\" (UID: \"a0e43d41-4e58-4467-99d0-b782a2f2d65a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7zdf6" Dec 06 06:19:24 crc kubenswrapper[4733]: I1206 06:19:24.070621 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a0e43d41-4e58-4467-99d0-b782a2f2d65a-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7zdf6\" (UID: \"a0e43d41-4e58-4467-99d0-b782a2f2d65a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7zdf6" Dec 06 06:19:24 crc kubenswrapper[4733]: I1206 06:19:24.071343 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0e43d41-4e58-4467-99d0-b782a2f2d65a-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7zdf6\" (UID: \"a0e43d41-4e58-4467-99d0-b782a2f2d65a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7zdf6" Dec 06 06:19:24 crc kubenswrapper[4733]: I1206 06:19:24.071643 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a0e43d41-4e58-4467-99d0-b782a2f2d65a-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7zdf6\" (UID: \"a0e43d41-4e58-4467-99d0-b782a2f2d65a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7zdf6" Dec 06 06:19:24 crc kubenswrapper[4733]: I1206 06:19:24.072133 4733 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a0e43d41-4e58-4467-99d0-b782a2f2d65a-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7zdf6\" (UID: \"a0e43d41-4e58-4467-99d0-b782a2f2d65a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7zdf6" Dec 06 06:19:24 crc kubenswrapper[4733]: I1206 06:19:24.072234 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0e43d41-4e58-4467-99d0-b782a2f2d65a-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7zdf6\" (UID: \"a0e43d41-4e58-4467-99d0-b782a2f2d65a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7zdf6" Dec 06 06:19:24 crc kubenswrapper[4733]: I1206 06:19:24.083482 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6phh\" (UniqueName: \"kubernetes.io/projected/a0e43d41-4e58-4467-99d0-b782a2f2d65a-kube-api-access-l6phh\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7zdf6\" (UID: \"a0e43d41-4e58-4467-99d0-b782a2f2d65a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7zdf6" Dec 06 06:19:24 crc kubenswrapper[4733]: I1206 06:19:24.258423 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7zdf6" Dec 06 06:19:24 crc kubenswrapper[4733]: I1206 06:19:24.708326 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7zdf6"] Dec 06 06:19:24 crc kubenswrapper[4733]: I1206 06:19:24.864866 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7zdf6" event={"ID":"a0e43d41-4e58-4467-99d0-b782a2f2d65a","Type":"ContainerStarted","Data":"16963935e39607dfb8f9a817c1507d9d19280a3e2fb362ac76c2582720c0bdfb"} Dec 06 06:19:25 crc kubenswrapper[4733]: I1206 06:19:25.874028 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7zdf6" event={"ID":"a0e43d41-4e58-4467-99d0-b782a2f2d65a","Type":"ContainerStarted","Data":"603291c8206cf6bcc8b3028269c9f18015c5416352b291e57edc65520e5ab60e"} Dec 06 06:19:25 crc kubenswrapper[4733]: I1206 06:19:25.900579 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7zdf6" podStartSLOduration=2.449188605 podStartE2EDuration="2.900559336s" podCreationTimestamp="2025-12-06 06:19:23 +0000 UTC" firstStartedPulling="2025-12-06 06:19:24.712181683 +0000 UTC m=+2148.577392794" lastFinishedPulling="2025-12-06 06:19:25.163552414 +0000 UTC m=+2149.028763525" observedRunningTime="2025-12-06 06:19:25.891990585 +0000 UTC m=+2149.757201697" watchObservedRunningTime="2025-12-06 06:19:25.900559336 +0000 UTC m=+2149.765770447" Dec 06 06:19:32 crc kubenswrapper[4733]: I1206 06:19:32.656500 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tnhcx"] Dec 06 06:19:32 crc kubenswrapper[4733]: I1206 06:19:32.659371 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tnhcx" Dec 06 06:19:32 crc kubenswrapper[4733]: I1206 06:19:32.668645 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tnhcx"] Dec 06 06:19:32 crc kubenswrapper[4733]: I1206 06:19:32.733339 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d91f4edc-9389-432b-981b-a73c0f375d45-catalog-content\") pod \"redhat-marketplace-tnhcx\" (UID: \"d91f4edc-9389-432b-981b-a73c0f375d45\") " pod="openshift-marketplace/redhat-marketplace-tnhcx" Dec 06 06:19:32 crc kubenswrapper[4733]: I1206 06:19:32.733411 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d91f4edc-9389-432b-981b-a73c0f375d45-utilities\") pod \"redhat-marketplace-tnhcx\" (UID: \"d91f4edc-9389-432b-981b-a73c0f375d45\") " pod="openshift-marketplace/redhat-marketplace-tnhcx" Dec 06 06:19:32 crc kubenswrapper[4733]: I1206 06:19:32.733490 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lx4r8\" (UniqueName: \"kubernetes.io/projected/d91f4edc-9389-432b-981b-a73c0f375d45-kube-api-access-lx4r8\") pod \"redhat-marketplace-tnhcx\" (UID: \"d91f4edc-9389-432b-981b-a73c0f375d45\") " pod="openshift-marketplace/redhat-marketplace-tnhcx" Dec 06 06:19:32 crc kubenswrapper[4733]: I1206 06:19:32.835299 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d91f4edc-9389-432b-981b-a73c0f375d45-catalog-content\") pod \"redhat-marketplace-tnhcx\" (UID: \"d91f4edc-9389-432b-981b-a73c0f375d45\") " pod="openshift-marketplace/redhat-marketplace-tnhcx" Dec 06 06:19:32 crc kubenswrapper[4733]: I1206 06:19:32.835374 4733 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d91f4edc-9389-432b-981b-a73c0f375d45-utilities\") pod \"redhat-marketplace-tnhcx\" (UID: \"d91f4edc-9389-432b-981b-a73c0f375d45\") " pod="openshift-marketplace/redhat-marketplace-tnhcx" Dec 06 06:19:32 crc kubenswrapper[4733]: I1206 06:19:32.835406 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lx4r8\" (UniqueName: \"kubernetes.io/projected/d91f4edc-9389-432b-981b-a73c0f375d45-kube-api-access-lx4r8\") pod \"redhat-marketplace-tnhcx\" (UID: \"d91f4edc-9389-432b-981b-a73c0f375d45\") " pod="openshift-marketplace/redhat-marketplace-tnhcx" Dec 06 06:19:32 crc kubenswrapper[4733]: I1206 06:19:32.835820 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d91f4edc-9389-432b-981b-a73c0f375d45-catalog-content\") pod \"redhat-marketplace-tnhcx\" (UID: \"d91f4edc-9389-432b-981b-a73c0f375d45\") " pod="openshift-marketplace/redhat-marketplace-tnhcx" Dec 06 06:19:32 crc kubenswrapper[4733]: I1206 06:19:32.835867 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d91f4edc-9389-432b-981b-a73c0f375d45-utilities\") pod \"redhat-marketplace-tnhcx\" (UID: \"d91f4edc-9389-432b-981b-a73c0f375d45\") " pod="openshift-marketplace/redhat-marketplace-tnhcx" Dec 06 06:19:32 crc kubenswrapper[4733]: I1206 06:19:32.855836 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lx4r8\" (UniqueName: \"kubernetes.io/projected/d91f4edc-9389-432b-981b-a73c0f375d45-kube-api-access-lx4r8\") pod \"redhat-marketplace-tnhcx\" (UID: \"d91f4edc-9389-432b-981b-a73c0f375d45\") " pod="openshift-marketplace/redhat-marketplace-tnhcx" Dec 06 06:19:32 crc kubenswrapper[4733]: I1206 06:19:32.978755 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tnhcx" Dec 06 06:19:33 crc kubenswrapper[4733]: I1206 06:19:33.422931 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tnhcx"] Dec 06 06:19:33 crc kubenswrapper[4733]: I1206 06:19:33.937985 4733 generic.go:334] "Generic (PLEG): container finished" podID="d91f4edc-9389-432b-981b-a73c0f375d45" containerID="1eafd9698be623af170e6d3886a1232287a71aa988a350e47b607ec0bf5dd6cf" exitCode=0 Dec 06 06:19:33 crc kubenswrapper[4733]: I1206 06:19:33.938102 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tnhcx" event={"ID":"d91f4edc-9389-432b-981b-a73c0f375d45","Type":"ContainerDied","Data":"1eafd9698be623af170e6d3886a1232287a71aa988a350e47b607ec0bf5dd6cf"} Dec 06 06:19:33 crc kubenswrapper[4733]: I1206 06:19:33.938396 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tnhcx" event={"ID":"d91f4edc-9389-432b-981b-a73c0f375d45","Type":"ContainerStarted","Data":"24ac35d19f7e5c27ea35c0d68abbf889556dbe7ed10acd3ce14debd9fba8ab95"} Dec 06 06:19:33 crc kubenswrapper[4733]: I1206 06:19:33.942199 4733 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 06 06:19:34 crc kubenswrapper[4733]: E1206 06:19:34.858659 4733 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd91f4edc_9389_432b_981b_a73c0f375d45.slice/crio-conmon-0c334c327b2dc6342b012edb4db27e829e2bbbef2ed4f6f97d63bafbeaa56255.scope\": RecentStats: unable to find data in memory cache]" Dec 06 06:19:34 crc kubenswrapper[4733]: I1206 06:19:34.949381 4733 generic.go:334] "Generic (PLEG): container finished" podID="d91f4edc-9389-432b-981b-a73c0f375d45" containerID="0c334c327b2dc6342b012edb4db27e829e2bbbef2ed4f6f97d63bafbeaa56255" exitCode=0 Dec 06 
06:19:34 crc kubenswrapper[4733]: I1206 06:19:34.949478 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tnhcx" event={"ID":"d91f4edc-9389-432b-981b-a73c0f375d45","Type":"ContainerDied","Data":"0c334c327b2dc6342b012edb4db27e829e2bbbef2ed4f6f97d63bafbeaa56255"} Dec 06 06:19:35 crc kubenswrapper[4733]: I1206 06:19:35.961589 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tnhcx" event={"ID":"d91f4edc-9389-432b-981b-a73c0f375d45","Type":"ContainerStarted","Data":"930902282f8aa4ac85aebc0367f0981c815180c07e64dcec459f90a592e0b3c2"} Dec 06 06:19:35 crc kubenswrapper[4733]: I1206 06:19:35.983812 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tnhcx" podStartSLOduration=2.510518801 podStartE2EDuration="3.983794557s" podCreationTimestamp="2025-12-06 06:19:32 +0000 UTC" firstStartedPulling="2025-12-06 06:19:33.941961396 +0000 UTC m=+2157.807172507" lastFinishedPulling="2025-12-06 06:19:35.415237151 +0000 UTC m=+2159.280448263" observedRunningTime="2025-12-06 06:19:35.979688618 +0000 UTC m=+2159.844899729" watchObservedRunningTime="2025-12-06 06:19:35.983794557 +0000 UTC m=+2159.849005669" Dec 06 06:19:42 crc kubenswrapper[4733]: I1206 06:19:42.979504 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tnhcx" Dec 06 06:19:42 crc kubenswrapper[4733]: I1206 06:19:42.980019 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tnhcx" Dec 06 06:19:42 crc kubenswrapper[4733]: I1206 06:19:42.989920 4733 patch_prober.go:28] interesting pod/machine-config-daemon-g7qjx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Dec 06 06:19:42 crc kubenswrapper[4733]: I1206 06:19:42.989978 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 06:19:43 crc kubenswrapper[4733]: I1206 06:19:43.013551 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tnhcx" Dec 06 06:19:43 crc kubenswrapper[4733]: I1206 06:19:43.049293 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tnhcx" Dec 06 06:19:43 crc kubenswrapper[4733]: I1206 06:19:43.244512 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tnhcx"] Dec 06 06:19:45 crc kubenswrapper[4733]: I1206 06:19:45.027722 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tnhcx" podUID="d91f4edc-9389-432b-981b-a73c0f375d45" containerName="registry-server" containerID="cri-o://930902282f8aa4ac85aebc0367f0981c815180c07e64dcec459f90a592e0b3c2" gracePeriod=2 Dec 06 06:19:45 crc kubenswrapper[4733]: I1206 06:19:45.407025 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tnhcx" Dec 06 06:19:45 crc kubenswrapper[4733]: I1206 06:19:45.457170 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lx4r8\" (UniqueName: \"kubernetes.io/projected/d91f4edc-9389-432b-981b-a73c0f375d45-kube-api-access-lx4r8\") pod \"d91f4edc-9389-432b-981b-a73c0f375d45\" (UID: \"d91f4edc-9389-432b-981b-a73c0f375d45\") " Dec 06 06:19:45 crc kubenswrapper[4733]: I1206 06:19:45.457216 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d91f4edc-9389-432b-981b-a73c0f375d45-utilities\") pod \"d91f4edc-9389-432b-981b-a73c0f375d45\" (UID: \"d91f4edc-9389-432b-981b-a73c0f375d45\") " Dec 06 06:19:45 crc kubenswrapper[4733]: I1206 06:19:45.457238 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d91f4edc-9389-432b-981b-a73c0f375d45-catalog-content\") pod \"d91f4edc-9389-432b-981b-a73c0f375d45\" (UID: \"d91f4edc-9389-432b-981b-a73c0f375d45\") " Dec 06 06:19:45 crc kubenswrapper[4733]: I1206 06:19:45.458028 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d91f4edc-9389-432b-981b-a73c0f375d45-utilities" (OuterVolumeSpecName: "utilities") pod "d91f4edc-9389-432b-981b-a73c0f375d45" (UID: "d91f4edc-9389-432b-981b-a73c0f375d45"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:19:45 crc kubenswrapper[4733]: I1206 06:19:45.462136 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d91f4edc-9389-432b-981b-a73c0f375d45-kube-api-access-lx4r8" (OuterVolumeSpecName: "kube-api-access-lx4r8") pod "d91f4edc-9389-432b-981b-a73c0f375d45" (UID: "d91f4edc-9389-432b-981b-a73c0f375d45"). InnerVolumeSpecName "kube-api-access-lx4r8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:19:45 crc kubenswrapper[4733]: I1206 06:19:45.472947 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d91f4edc-9389-432b-981b-a73c0f375d45-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d91f4edc-9389-432b-981b-a73c0f375d45" (UID: "d91f4edc-9389-432b-981b-a73c0f375d45"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:19:45 crc kubenswrapper[4733]: I1206 06:19:45.559753 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lx4r8\" (UniqueName: \"kubernetes.io/projected/d91f4edc-9389-432b-981b-a73c0f375d45-kube-api-access-lx4r8\") on node \"crc\" DevicePath \"\"" Dec 06 06:19:45 crc kubenswrapper[4733]: I1206 06:19:45.559943 4733 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d91f4edc-9389-432b-981b-a73c0f375d45-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 06:19:45 crc kubenswrapper[4733]: I1206 06:19:45.560030 4733 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d91f4edc-9389-432b-981b-a73c0f375d45-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 06:19:46 crc kubenswrapper[4733]: I1206 06:19:46.038484 4733 generic.go:334] "Generic (PLEG): container finished" podID="d91f4edc-9389-432b-981b-a73c0f375d45" containerID="930902282f8aa4ac85aebc0367f0981c815180c07e64dcec459f90a592e0b3c2" exitCode=0 Dec 06 06:19:46 crc kubenswrapper[4733]: I1206 06:19:46.038535 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tnhcx" event={"ID":"d91f4edc-9389-432b-981b-a73c0f375d45","Type":"ContainerDied","Data":"930902282f8aa4ac85aebc0367f0981c815180c07e64dcec459f90a592e0b3c2"} Dec 06 06:19:46 crc kubenswrapper[4733]: I1206 06:19:46.038574 4733 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-tnhcx" event={"ID":"d91f4edc-9389-432b-981b-a73c0f375d45","Type":"ContainerDied","Data":"24ac35d19f7e5c27ea35c0d68abbf889556dbe7ed10acd3ce14debd9fba8ab95"} Dec 06 06:19:46 crc kubenswrapper[4733]: I1206 06:19:46.038594 4733 scope.go:117] "RemoveContainer" containerID="930902282f8aa4ac85aebc0367f0981c815180c07e64dcec459f90a592e0b3c2" Dec 06 06:19:46 crc kubenswrapper[4733]: I1206 06:19:46.039431 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tnhcx" Dec 06 06:19:46 crc kubenswrapper[4733]: I1206 06:19:46.067897 4733 scope.go:117] "RemoveContainer" containerID="0c334c327b2dc6342b012edb4db27e829e2bbbef2ed4f6f97d63bafbeaa56255" Dec 06 06:19:46 crc kubenswrapper[4733]: I1206 06:19:46.083824 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tnhcx"] Dec 06 06:19:46 crc kubenswrapper[4733]: I1206 06:19:46.090495 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tnhcx"] Dec 06 06:19:46 crc kubenswrapper[4733]: I1206 06:19:46.101074 4733 scope.go:117] "RemoveContainer" containerID="1eafd9698be623af170e6d3886a1232287a71aa988a350e47b607ec0bf5dd6cf" Dec 06 06:19:46 crc kubenswrapper[4733]: I1206 06:19:46.124152 4733 scope.go:117] "RemoveContainer" containerID="930902282f8aa4ac85aebc0367f0981c815180c07e64dcec459f90a592e0b3c2" Dec 06 06:19:46 crc kubenswrapper[4733]: E1206 06:19:46.124638 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"930902282f8aa4ac85aebc0367f0981c815180c07e64dcec459f90a592e0b3c2\": container with ID starting with 930902282f8aa4ac85aebc0367f0981c815180c07e64dcec459f90a592e0b3c2 not found: ID does not exist" containerID="930902282f8aa4ac85aebc0367f0981c815180c07e64dcec459f90a592e0b3c2" Dec 06 06:19:46 crc kubenswrapper[4733]: I1206 06:19:46.124681 4733 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"930902282f8aa4ac85aebc0367f0981c815180c07e64dcec459f90a592e0b3c2"} err="failed to get container status \"930902282f8aa4ac85aebc0367f0981c815180c07e64dcec459f90a592e0b3c2\": rpc error: code = NotFound desc = could not find container \"930902282f8aa4ac85aebc0367f0981c815180c07e64dcec459f90a592e0b3c2\": container with ID starting with 930902282f8aa4ac85aebc0367f0981c815180c07e64dcec459f90a592e0b3c2 not found: ID does not exist" Dec 06 06:19:46 crc kubenswrapper[4733]: I1206 06:19:46.124709 4733 scope.go:117] "RemoveContainer" containerID="0c334c327b2dc6342b012edb4db27e829e2bbbef2ed4f6f97d63bafbeaa56255" Dec 06 06:19:46 crc kubenswrapper[4733]: E1206 06:19:46.125140 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c334c327b2dc6342b012edb4db27e829e2bbbef2ed4f6f97d63bafbeaa56255\": container with ID starting with 0c334c327b2dc6342b012edb4db27e829e2bbbef2ed4f6f97d63bafbeaa56255 not found: ID does not exist" containerID="0c334c327b2dc6342b012edb4db27e829e2bbbef2ed4f6f97d63bafbeaa56255" Dec 06 06:19:46 crc kubenswrapper[4733]: I1206 06:19:46.125171 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c334c327b2dc6342b012edb4db27e829e2bbbef2ed4f6f97d63bafbeaa56255"} err="failed to get container status \"0c334c327b2dc6342b012edb4db27e829e2bbbef2ed4f6f97d63bafbeaa56255\": rpc error: code = NotFound desc = could not find container \"0c334c327b2dc6342b012edb4db27e829e2bbbef2ed4f6f97d63bafbeaa56255\": container with ID starting with 0c334c327b2dc6342b012edb4db27e829e2bbbef2ed4f6f97d63bafbeaa56255 not found: ID does not exist" Dec 06 06:19:46 crc kubenswrapper[4733]: I1206 06:19:46.125193 4733 scope.go:117] "RemoveContainer" containerID="1eafd9698be623af170e6d3886a1232287a71aa988a350e47b607ec0bf5dd6cf" Dec 06 06:19:46 crc kubenswrapper[4733]: E1206 
06:19:46.125488 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1eafd9698be623af170e6d3886a1232287a71aa988a350e47b607ec0bf5dd6cf\": container with ID starting with 1eafd9698be623af170e6d3886a1232287a71aa988a350e47b607ec0bf5dd6cf not found: ID does not exist" containerID="1eafd9698be623af170e6d3886a1232287a71aa988a350e47b607ec0bf5dd6cf" Dec 06 06:19:46 crc kubenswrapper[4733]: I1206 06:19:46.125510 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1eafd9698be623af170e6d3886a1232287a71aa988a350e47b607ec0bf5dd6cf"} err="failed to get container status \"1eafd9698be623af170e6d3886a1232287a71aa988a350e47b607ec0bf5dd6cf\": rpc error: code = NotFound desc = could not find container \"1eafd9698be623af170e6d3886a1232287a71aa988a350e47b607ec0bf5dd6cf\": container with ID starting with 1eafd9698be623af170e6d3886a1232287a71aa988a350e47b607ec0bf5dd6cf not found: ID does not exist" Dec 06 06:19:46 crc kubenswrapper[4733]: I1206 06:19:46.494568 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d91f4edc-9389-432b-981b-a73c0f375d45" path="/var/lib/kubelet/pods/d91f4edc-9389-432b-981b-a73c0f375d45/volumes" Dec 06 06:20:12 crc kubenswrapper[4733]: I1206 06:20:12.989358 4733 patch_prober.go:28] interesting pod/machine-config-daemon-g7qjx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 06:20:12 crc kubenswrapper[4733]: I1206 06:20:12.990230 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Dec 06 06:20:12 crc kubenswrapper[4733]: I1206 06:20:12.990315 4733 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" Dec 06 06:20:12 crc kubenswrapper[4733]: I1206 06:20:12.991817 4733 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e8732ef17439b6776bf46c0bcb254c37c40bb5859ffecc34e09d39650c0b7d3e"} pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 06:20:12 crc kubenswrapper[4733]: I1206 06:20:12.991902 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerName="machine-config-daemon" containerID="cri-o://e8732ef17439b6776bf46c0bcb254c37c40bb5859ffecc34e09d39650c0b7d3e" gracePeriod=600 Dec 06 06:20:13 crc kubenswrapper[4733]: I1206 06:20:13.250169 4733 generic.go:334] "Generic (PLEG): container finished" podID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerID="e8732ef17439b6776bf46c0bcb254c37c40bb5859ffecc34e09d39650c0b7d3e" exitCode=0 Dec 06 06:20:13 crc kubenswrapper[4733]: I1206 06:20:13.250249 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" event={"ID":"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448","Type":"ContainerDied","Data":"e8732ef17439b6776bf46c0bcb254c37c40bb5859ffecc34e09d39650c0b7d3e"} Dec 06 06:20:13 crc kubenswrapper[4733]: I1206 06:20:13.250488 4733 scope.go:117] "RemoveContainer" containerID="95a5206d8977bdf896771b7495437ec92f1082c61e752b99e7ba75dda3bd2a35" Dec 06 06:20:14 crc kubenswrapper[4733]: I1206 06:20:14.261200 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" 
event={"ID":"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448","Type":"ContainerStarted","Data":"9ce999f28ba3ed207346dfc647ce3d354abbc70fd027614f9498a4b9f7a14f84"} Dec 06 06:20:17 crc kubenswrapper[4733]: I1206 06:20:17.256591 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-b2bd8"] Dec 06 06:20:17 crc kubenswrapper[4733]: E1206 06:20:17.257759 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d91f4edc-9389-432b-981b-a73c0f375d45" containerName="extract-content" Dec 06 06:20:17 crc kubenswrapper[4733]: I1206 06:20:17.257773 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="d91f4edc-9389-432b-981b-a73c0f375d45" containerName="extract-content" Dec 06 06:20:17 crc kubenswrapper[4733]: E1206 06:20:17.257791 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d91f4edc-9389-432b-981b-a73c0f375d45" containerName="extract-utilities" Dec 06 06:20:17 crc kubenswrapper[4733]: I1206 06:20:17.257797 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="d91f4edc-9389-432b-981b-a73c0f375d45" containerName="extract-utilities" Dec 06 06:20:17 crc kubenswrapper[4733]: E1206 06:20:17.257812 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d91f4edc-9389-432b-981b-a73c0f375d45" containerName="registry-server" Dec 06 06:20:17 crc kubenswrapper[4733]: I1206 06:20:17.257818 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="d91f4edc-9389-432b-981b-a73c0f375d45" containerName="registry-server" Dec 06 06:20:17 crc kubenswrapper[4733]: I1206 06:20:17.258015 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="d91f4edc-9389-432b-981b-a73c0f375d45" containerName="registry-server" Dec 06 06:20:17 crc kubenswrapper[4733]: I1206 06:20:17.259665 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b2bd8" Dec 06 06:20:17 crc kubenswrapper[4733]: I1206 06:20:17.263453 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b2bd8"] Dec 06 06:20:17 crc kubenswrapper[4733]: I1206 06:20:17.341038 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/268a0cc6-850b-4f97-9265-ff0b9c9ae930-utilities\") pod \"certified-operators-b2bd8\" (UID: \"268a0cc6-850b-4f97-9265-ff0b9c9ae930\") " pod="openshift-marketplace/certified-operators-b2bd8" Dec 06 06:20:17 crc kubenswrapper[4733]: I1206 06:20:17.341102 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqtvs\" (UniqueName: \"kubernetes.io/projected/268a0cc6-850b-4f97-9265-ff0b9c9ae930-kube-api-access-cqtvs\") pod \"certified-operators-b2bd8\" (UID: \"268a0cc6-850b-4f97-9265-ff0b9c9ae930\") " pod="openshift-marketplace/certified-operators-b2bd8" Dec 06 06:20:17 crc kubenswrapper[4733]: I1206 06:20:17.341129 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/268a0cc6-850b-4f97-9265-ff0b9c9ae930-catalog-content\") pod \"certified-operators-b2bd8\" (UID: \"268a0cc6-850b-4f97-9265-ff0b9c9ae930\") " pod="openshift-marketplace/certified-operators-b2bd8" Dec 06 06:20:17 crc kubenswrapper[4733]: I1206 06:20:17.442861 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/268a0cc6-850b-4f97-9265-ff0b9c9ae930-utilities\") pod \"certified-operators-b2bd8\" (UID: \"268a0cc6-850b-4f97-9265-ff0b9c9ae930\") " pod="openshift-marketplace/certified-operators-b2bd8" Dec 06 06:20:17 crc kubenswrapper[4733]: I1206 06:20:17.442927 4733 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-cqtvs\" (UniqueName: \"kubernetes.io/projected/268a0cc6-850b-4f97-9265-ff0b9c9ae930-kube-api-access-cqtvs\") pod \"certified-operators-b2bd8\" (UID: \"268a0cc6-850b-4f97-9265-ff0b9c9ae930\") " pod="openshift-marketplace/certified-operators-b2bd8" Dec 06 06:20:17 crc kubenswrapper[4733]: I1206 06:20:17.442953 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/268a0cc6-850b-4f97-9265-ff0b9c9ae930-catalog-content\") pod \"certified-operators-b2bd8\" (UID: \"268a0cc6-850b-4f97-9265-ff0b9c9ae930\") " pod="openshift-marketplace/certified-operators-b2bd8" Dec 06 06:20:17 crc kubenswrapper[4733]: I1206 06:20:17.443341 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/268a0cc6-850b-4f97-9265-ff0b9c9ae930-utilities\") pod \"certified-operators-b2bd8\" (UID: \"268a0cc6-850b-4f97-9265-ff0b9c9ae930\") " pod="openshift-marketplace/certified-operators-b2bd8" Dec 06 06:20:17 crc kubenswrapper[4733]: I1206 06:20:17.443348 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/268a0cc6-850b-4f97-9265-ff0b9c9ae930-catalog-content\") pod \"certified-operators-b2bd8\" (UID: \"268a0cc6-850b-4f97-9265-ff0b9c9ae930\") " pod="openshift-marketplace/certified-operators-b2bd8" Dec 06 06:20:17 crc kubenswrapper[4733]: I1206 06:20:17.463591 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqtvs\" (UniqueName: \"kubernetes.io/projected/268a0cc6-850b-4f97-9265-ff0b9c9ae930-kube-api-access-cqtvs\") pod \"certified-operators-b2bd8\" (UID: \"268a0cc6-850b-4f97-9265-ff0b9c9ae930\") " pod="openshift-marketplace/certified-operators-b2bd8" Dec 06 06:20:17 crc kubenswrapper[4733]: I1206 06:20:17.577416 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b2bd8" Dec 06 06:20:17 crc kubenswrapper[4733]: I1206 06:20:17.849065 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b2bd8"] Dec 06 06:20:18 crc kubenswrapper[4733]: I1206 06:20:18.294908 4733 generic.go:334] "Generic (PLEG): container finished" podID="268a0cc6-850b-4f97-9265-ff0b9c9ae930" containerID="b619e0fa81c51d50550c415617d1466a09656fdcc2c57b0f12e66d78537689c2" exitCode=0 Dec 06 06:20:18 crc kubenswrapper[4733]: I1206 06:20:18.295011 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b2bd8" event={"ID":"268a0cc6-850b-4f97-9265-ff0b9c9ae930","Type":"ContainerDied","Data":"b619e0fa81c51d50550c415617d1466a09656fdcc2c57b0f12e66d78537689c2"} Dec 06 06:20:18 crc kubenswrapper[4733]: I1206 06:20:18.295220 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b2bd8" event={"ID":"268a0cc6-850b-4f97-9265-ff0b9c9ae930","Type":"ContainerStarted","Data":"fd38b921b42352536edcf3e1704ff4030346c5bbf920f3e6d72b7580da7c07af"} Dec 06 06:20:19 crc kubenswrapper[4733]: I1206 06:20:19.310838 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b2bd8" event={"ID":"268a0cc6-850b-4f97-9265-ff0b9c9ae930","Type":"ContainerStarted","Data":"1fc2a95f6bf08f6331609c5a1c1e9600cdfd98ebf6f8fad36cc9060e2e370ad1"} Dec 06 06:20:20 crc kubenswrapper[4733]: I1206 06:20:20.322033 4733 generic.go:334] "Generic (PLEG): container finished" podID="268a0cc6-850b-4f97-9265-ff0b9c9ae930" containerID="1fc2a95f6bf08f6331609c5a1c1e9600cdfd98ebf6f8fad36cc9060e2e370ad1" exitCode=0 Dec 06 06:20:20 crc kubenswrapper[4733]: I1206 06:20:20.322084 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b2bd8" 
event={"ID":"268a0cc6-850b-4f97-9265-ff0b9c9ae930","Type":"ContainerDied","Data":"1fc2a95f6bf08f6331609c5a1c1e9600cdfd98ebf6f8fad36cc9060e2e370ad1"} Dec 06 06:20:21 crc kubenswrapper[4733]: I1206 06:20:21.332857 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b2bd8" event={"ID":"268a0cc6-850b-4f97-9265-ff0b9c9ae930","Type":"ContainerStarted","Data":"ae8d583f0878f88807bcfbc7c9a5b25187bd91c2c96a8dcb456f5cf623a96750"} Dec 06 06:20:21 crc kubenswrapper[4733]: I1206 06:20:21.350399 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-b2bd8" podStartSLOduration=1.740382387 podStartE2EDuration="4.350381347s" podCreationTimestamp="2025-12-06 06:20:17 +0000 UTC" firstStartedPulling="2025-12-06 06:20:18.296773428 +0000 UTC m=+2202.161984540" lastFinishedPulling="2025-12-06 06:20:20.906772389 +0000 UTC m=+2204.771983500" observedRunningTime="2025-12-06 06:20:21.346567958 +0000 UTC m=+2205.211779068" watchObservedRunningTime="2025-12-06 06:20:21.350381347 +0000 UTC m=+2205.215592459" Dec 06 06:20:27 crc kubenswrapper[4733]: I1206 06:20:27.577686 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-b2bd8" Dec 06 06:20:27 crc kubenswrapper[4733]: I1206 06:20:27.578294 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-b2bd8" Dec 06 06:20:27 crc kubenswrapper[4733]: I1206 06:20:27.678334 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-b2bd8" Dec 06 06:20:28 crc kubenswrapper[4733]: I1206 06:20:28.429013 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-b2bd8" Dec 06 06:20:28 crc kubenswrapper[4733]: I1206 06:20:28.480082 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-b2bd8"] Dec 06 06:20:30 crc kubenswrapper[4733]: I1206 06:20:30.412563 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-b2bd8" podUID="268a0cc6-850b-4f97-9265-ff0b9c9ae930" containerName="registry-server" containerID="cri-o://ae8d583f0878f88807bcfbc7c9a5b25187bd91c2c96a8dcb456f5cf623a96750" gracePeriod=2 Dec 06 06:20:30 crc kubenswrapper[4733]: I1206 06:20:30.818747 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b2bd8" Dec 06 06:20:30 crc kubenswrapper[4733]: I1206 06:20:30.848901 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/268a0cc6-850b-4f97-9265-ff0b9c9ae930-catalog-content\") pod \"268a0cc6-850b-4f97-9265-ff0b9c9ae930\" (UID: \"268a0cc6-850b-4f97-9265-ff0b9c9ae930\") " Dec 06 06:20:30 crc kubenswrapper[4733]: I1206 06:20:30.849176 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqtvs\" (UniqueName: \"kubernetes.io/projected/268a0cc6-850b-4f97-9265-ff0b9c9ae930-kube-api-access-cqtvs\") pod \"268a0cc6-850b-4f97-9265-ff0b9c9ae930\" (UID: \"268a0cc6-850b-4f97-9265-ff0b9c9ae930\") " Dec 06 06:20:30 crc kubenswrapper[4733]: I1206 06:20:30.849466 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/268a0cc6-850b-4f97-9265-ff0b9c9ae930-utilities\") pod \"268a0cc6-850b-4f97-9265-ff0b9c9ae930\" (UID: \"268a0cc6-850b-4f97-9265-ff0b9c9ae930\") " Dec 06 06:20:30 crc kubenswrapper[4733]: I1206 06:20:30.850176 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/268a0cc6-850b-4f97-9265-ff0b9c9ae930-utilities" (OuterVolumeSpecName: "utilities") pod "268a0cc6-850b-4f97-9265-ff0b9c9ae930" (UID: 
"268a0cc6-850b-4f97-9265-ff0b9c9ae930"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:20:30 crc kubenswrapper[4733]: I1206 06:20:30.850599 4733 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/268a0cc6-850b-4f97-9265-ff0b9c9ae930-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 06:20:30 crc kubenswrapper[4733]: I1206 06:20:30.859203 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/268a0cc6-850b-4f97-9265-ff0b9c9ae930-kube-api-access-cqtvs" (OuterVolumeSpecName: "kube-api-access-cqtvs") pod "268a0cc6-850b-4f97-9265-ff0b9c9ae930" (UID: "268a0cc6-850b-4f97-9265-ff0b9c9ae930"). InnerVolumeSpecName "kube-api-access-cqtvs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:20:30 crc kubenswrapper[4733]: I1206 06:20:30.889579 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/268a0cc6-850b-4f97-9265-ff0b9c9ae930-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "268a0cc6-850b-4f97-9265-ff0b9c9ae930" (UID: "268a0cc6-850b-4f97-9265-ff0b9c9ae930"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:20:30 crc kubenswrapper[4733]: I1206 06:20:30.952737 4733 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/268a0cc6-850b-4f97-9265-ff0b9c9ae930-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 06:20:30 crc kubenswrapper[4733]: I1206 06:20:30.952775 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqtvs\" (UniqueName: \"kubernetes.io/projected/268a0cc6-850b-4f97-9265-ff0b9c9ae930-kube-api-access-cqtvs\") on node \"crc\" DevicePath \"\"" Dec 06 06:20:31 crc kubenswrapper[4733]: I1206 06:20:31.424166 4733 generic.go:334] "Generic (PLEG): container finished" podID="268a0cc6-850b-4f97-9265-ff0b9c9ae930" containerID="ae8d583f0878f88807bcfbc7c9a5b25187bd91c2c96a8dcb456f5cf623a96750" exitCode=0 Dec 06 06:20:31 crc kubenswrapper[4733]: I1206 06:20:31.424256 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b2bd8" Dec 06 06:20:31 crc kubenswrapper[4733]: I1206 06:20:31.424278 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b2bd8" event={"ID":"268a0cc6-850b-4f97-9265-ff0b9c9ae930","Type":"ContainerDied","Data":"ae8d583f0878f88807bcfbc7c9a5b25187bd91c2c96a8dcb456f5cf623a96750"} Dec 06 06:20:31 crc kubenswrapper[4733]: I1206 06:20:31.425838 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b2bd8" event={"ID":"268a0cc6-850b-4f97-9265-ff0b9c9ae930","Type":"ContainerDied","Data":"fd38b921b42352536edcf3e1704ff4030346c5bbf920f3e6d72b7580da7c07af"} Dec 06 06:20:31 crc kubenswrapper[4733]: I1206 06:20:31.425871 4733 scope.go:117] "RemoveContainer" containerID="ae8d583f0878f88807bcfbc7c9a5b25187bd91c2c96a8dcb456f5cf623a96750" Dec 06 06:20:31 crc kubenswrapper[4733]: I1206 06:20:31.445646 4733 scope.go:117] "RemoveContainer" 
containerID="1fc2a95f6bf08f6331609c5a1c1e9600cdfd98ebf6f8fad36cc9060e2e370ad1" Dec 06 06:20:31 crc kubenswrapper[4733]: I1206 06:20:31.458112 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b2bd8"] Dec 06 06:20:31 crc kubenswrapper[4733]: I1206 06:20:31.464940 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-b2bd8"] Dec 06 06:20:31 crc kubenswrapper[4733]: I1206 06:20:31.479678 4733 scope.go:117] "RemoveContainer" containerID="b619e0fa81c51d50550c415617d1466a09656fdcc2c57b0f12e66d78537689c2" Dec 06 06:20:31 crc kubenswrapper[4733]: I1206 06:20:31.510928 4733 scope.go:117] "RemoveContainer" containerID="ae8d583f0878f88807bcfbc7c9a5b25187bd91c2c96a8dcb456f5cf623a96750" Dec 06 06:20:31 crc kubenswrapper[4733]: E1206 06:20:31.511419 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae8d583f0878f88807bcfbc7c9a5b25187bd91c2c96a8dcb456f5cf623a96750\": container with ID starting with ae8d583f0878f88807bcfbc7c9a5b25187bd91c2c96a8dcb456f5cf623a96750 not found: ID does not exist" containerID="ae8d583f0878f88807bcfbc7c9a5b25187bd91c2c96a8dcb456f5cf623a96750" Dec 06 06:20:31 crc kubenswrapper[4733]: I1206 06:20:31.511454 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae8d583f0878f88807bcfbc7c9a5b25187bd91c2c96a8dcb456f5cf623a96750"} err="failed to get container status \"ae8d583f0878f88807bcfbc7c9a5b25187bd91c2c96a8dcb456f5cf623a96750\": rpc error: code = NotFound desc = could not find container \"ae8d583f0878f88807bcfbc7c9a5b25187bd91c2c96a8dcb456f5cf623a96750\": container with ID starting with ae8d583f0878f88807bcfbc7c9a5b25187bd91c2c96a8dcb456f5cf623a96750 not found: ID does not exist" Dec 06 06:20:31 crc kubenswrapper[4733]: I1206 06:20:31.511490 4733 scope.go:117] "RemoveContainer" 
containerID="1fc2a95f6bf08f6331609c5a1c1e9600cdfd98ebf6f8fad36cc9060e2e370ad1" Dec 06 06:20:31 crc kubenswrapper[4733]: E1206 06:20:31.511848 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fc2a95f6bf08f6331609c5a1c1e9600cdfd98ebf6f8fad36cc9060e2e370ad1\": container with ID starting with 1fc2a95f6bf08f6331609c5a1c1e9600cdfd98ebf6f8fad36cc9060e2e370ad1 not found: ID does not exist" containerID="1fc2a95f6bf08f6331609c5a1c1e9600cdfd98ebf6f8fad36cc9060e2e370ad1" Dec 06 06:20:31 crc kubenswrapper[4733]: I1206 06:20:31.511889 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fc2a95f6bf08f6331609c5a1c1e9600cdfd98ebf6f8fad36cc9060e2e370ad1"} err="failed to get container status \"1fc2a95f6bf08f6331609c5a1c1e9600cdfd98ebf6f8fad36cc9060e2e370ad1\": rpc error: code = NotFound desc = could not find container \"1fc2a95f6bf08f6331609c5a1c1e9600cdfd98ebf6f8fad36cc9060e2e370ad1\": container with ID starting with 1fc2a95f6bf08f6331609c5a1c1e9600cdfd98ebf6f8fad36cc9060e2e370ad1 not found: ID does not exist" Dec 06 06:20:31 crc kubenswrapper[4733]: I1206 06:20:31.511917 4733 scope.go:117] "RemoveContainer" containerID="b619e0fa81c51d50550c415617d1466a09656fdcc2c57b0f12e66d78537689c2" Dec 06 06:20:31 crc kubenswrapper[4733]: E1206 06:20:31.512279 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b619e0fa81c51d50550c415617d1466a09656fdcc2c57b0f12e66d78537689c2\": container with ID starting with b619e0fa81c51d50550c415617d1466a09656fdcc2c57b0f12e66d78537689c2 not found: ID does not exist" containerID="b619e0fa81c51d50550c415617d1466a09656fdcc2c57b0f12e66d78537689c2" Dec 06 06:20:31 crc kubenswrapper[4733]: I1206 06:20:31.512365 4733 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b619e0fa81c51d50550c415617d1466a09656fdcc2c57b0f12e66d78537689c2"} err="failed to get container status \"b619e0fa81c51d50550c415617d1466a09656fdcc2c57b0f12e66d78537689c2\": rpc error: code = NotFound desc = could not find container \"b619e0fa81c51d50550c415617d1466a09656fdcc2c57b0f12e66d78537689c2\": container with ID starting with b619e0fa81c51d50550c415617d1466a09656fdcc2c57b0f12e66d78537689c2 not found: ID does not exist" Dec 06 06:20:32 crc kubenswrapper[4733]: I1206 06:20:32.495397 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="268a0cc6-850b-4f97-9265-ff0b9c9ae930" path="/var/lib/kubelet/pods/268a0cc6-850b-4f97-9265-ff0b9c9ae930/volumes" Dec 06 06:20:56 crc kubenswrapper[4733]: I1206 06:20:56.634339 4733 generic.go:334] "Generic (PLEG): container finished" podID="a0e43d41-4e58-4467-99d0-b782a2f2d65a" containerID="603291c8206cf6bcc8b3028269c9f18015c5416352b291e57edc65520e5ab60e" exitCode=0 Dec 06 06:20:56 crc kubenswrapper[4733]: I1206 06:20:56.634402 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7zdf6" event={"ID":"a0e43d41-4e58-4467-99d0-b782a2f2d65a","Type":"ContainerDied","Data":"603291c8206cf6bcc8b3028269c9f18015c5416352b291e57edc65520e5ab60e"} Dec 06 06:20:57 crc kubenswrapper[4733]: I1206 06:20:57.960516 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7zdf6" Dec 06 06:20:58 crc kubenswrapper[4733]: I1206 06:20:58.143648 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6phh\" (UniqueName: \"kubernetes.io/projected/a0e43d41-4e58-4467-99d0-b782a2f2d65a-kube-api-access-l6phh\") pod \"a0e43d41-4e58-4467-99d0-b782a2f2d65a\" (UID: \"a0e43d41-4e58-4467-99d0-b782a2f2d65a\") " Dec 06 06:20:58 crc kubenswrapper[4733]: I1206 06:20:58.143705 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a0e43d41-4e58-4467-99d0-b782a2f2d65a-ssh-key\") pod \"a0e43d41-4e58-4467-99d0-b782a2f2d65a\" (UID: \"a0e43d41-4e58-4467-99d0-b782a2f2d65a\") " Dec 06 06:20:58 crc kubenswrapper[4733]: I1206 06:20:58.143746 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a0e43d41-4e58-4467-99d0-b782a2f2d65a-ceilometer-compute-config-data-2\") pod \"a0e43d41-4e58-4467-99d0-b782a2f2d65a\" (UID: \"a0e43d41-4e58-4467-99d0-b782a2f2d65a\") " Dec 06 06:20:58 crc kubenswrapper[4733]: I1206 06:20:58.143780 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a0e43d41-4e58-4467-99d0-b782a2f2d65a-ceilometer-compute-config-data-0\") pod \"a0e43d41-4e58-4467-99d0-b782a2f2d65a\" (UID: \"a0e43d41-4e58-4467-99d0-b782a2f2d65a\") " Dec 06 06:20:58 crc kubenswrapper[4733]: I1206 06:20:58.143854 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0e43d41-4e58-4467-99d0-b782a2f2d65a-inventory\") pod \"a0e43d41-4e58-4467-99d0-b782a2f2d65a\" (UID: \"a0e43d41-4e58-4467-99d0-b782a2f2d65a\") " Dec 06 06:20:58 crc kubenswrapper[4733]: I1206 06:20:58.143875 4733 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0e43d41-4e58-4467-99d0-b782a2f2d65a-telemetry-combined-ca-bundle\") pod \"a0e43d41-4e58-4467-99d0-b782a2f2d65a\" (UID: \"a0e43d41-4e58-4467-99d0-b782a2f2d65a\") " Dec 06 06:20:58 crc kubenswrapper[4733]: I1206 06:20:58.143899 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a0e43d41-4e58-4467-99d0-b782a2f2d65a-ceilometer-compute-config-data-1\") pod \"a0e43d41-4e58-4467-99d0-b782a2f2d65a\" (UID: \"a0e43d41-4e58-4467-99d0-b782a2f2d65a\") " Dec 06 06:20:58 crc kubenswrapper[4733]: I1206 06:20:58.149580 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0e43d41-4e58-4467-99d0-b782a2f2d65a-kube-api-access-l6phh" (OuterVolumeSpecName: "kube-api-access-l6phh") pod "a0e43d41-4e58-4467-99d0-b782a2f2d65a" (UID: "a0e43d41-4e58-4467-99d0-b782a2f2d65a"). InnerVolumeSpecName "kube-api-access-l6phh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:20:58 crc kubenswrapper[4733]: I1206 06:20:58.150156 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0e43d41-4e58-4467-99d0-b782a2f2d65a-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "a0e43d41-4e58-4467-99d0-b782a2f2d65a" (UID: "a0e43d41-4e58-4467-99d0-b782a2f2d65a"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:20:58 crc kubenswrapper[4733]: I1206 06:20:58.170419 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0e43d41-4e58-4467-99d0-b782a2f2d65a-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "a0e43d41-4e58-4467-99d0-b782a2f2d65a" (UID: "a0e43d41-4e58-4467-99d0-b782a2f2d65a"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:20:58 crc kubenswrapper[4733]: I1206 06:20:58.170496 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0e43d41-4e58-4467-99d0-b782a2f2d65a-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "a0e43d41-4e58-4467-99d0-b782a2f2d65a" (UID: "a0e43d41-4e58-4467-99d0-b782a2f2d65a"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:20:58 crc kubenswrapper[4733]: I1206 06:20:58.171094 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0e43d41-4e58-4467-99d0-b782a2f2d65a-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "a0e43d41-4e58-4467-99d0-b782a2f2d65a" (UID: "a0e43d41-4e58-4467-99d0-b782a2f2d65a"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:20:58 crc kubenswrapper[4733]: I1206 06:20:58.171443 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0e43d41-4e58-4467-99d0-b782a2f2d65a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a0e43d41-4e58-4467-99d0-b782a2f2d65a" (UID: "a0e43d41-4e58-4467-99d0-b782a2f2d65a"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:20:58 crc kubenswrapper[4733]: I1206 06:20:58.171600 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0e43d41-4e58-4467-99d0-b782a2f2d65a-inventory" (OuterVolumeSpecName: "inventory") pod "a0e43d41-4e58-4467-99d0-b782a2f2d65a" (UID: "a0e43d41-4e58-4467-99d0-b782a2f2d65a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:20:58 crc kubenswrapper[4733]: I1206 06:20:58.245814 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6phh\" (UniqueName: \"kubernetes.io/projected/a0e43d41-4e58-4467-99d0-b782a2f2d65a-kube-api-access-l6phh\") on node \"crc\" DevicePath \"\"" Dec 06 06:20:58 crc kubenswrapper[4733]: I1206 06:20:58.245908 4733 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a0e43d41-4e58-4467-99d0-b782a2f2d65a-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 06:20:58 crc kubenswrapper[4733]: I1206 06:20:58.245966 4733 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a0e43d41-4e58-4467-99d0-b782a2f2d65a-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Dec 06 06:20:58 crc kubenswrapper[4733]: I1206 06:20:58.246043 4733 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a0e43d41-4e58-4467-99d0-b782a2f2d65a-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Dec 06 06:20:58 crc kubenswrapper[4733]: I1206 06:20:58.246108 4733 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0e43d41-4e58-4467-99d0-b782a2f2d65a-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 06:20:58 crc kubenswrapper[4733]: I1206 06:20:58.246160 4733 reconciler_common.go:293] "Volume detached for volume 
\"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0e43d41-4e58-4467-99d0-b782a2f2d65a-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 06:20:58 crc kubenswrapper[4733]: I1206 06:20:58.246209 4733 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a0e43d41-4e58-4467-99d0-b782a2f2d65a-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Dec 06 06:20:58 crc kubenswrapper[4733]: I1206 06:20:58.653472 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7zdf6" event={"ID":"a0e43d41-4e58-4467-99d0-b782a2f2d65a","Type":"ContainerDied","Data":"16963935e39607dfb8f9a817c1507d9d19280a3e2fb362ac76c2582720c0bdfb"} Dec 06 06:20:58 crc kubenswrapper[4733]: I1206 06:20:58.653521 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16963935e39607dfb8f9a817c1507d9d19280a3e2fb362ac76c2582720c0bdfb" Dec 06 06:20:58 crc kubenswrapper[4733]: I1206 06:20:58.653535 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7zdf6" Dec 06 06:21:00 crc kubenswrapper[4733]: I1206 06:21:00.190746 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nn22m"] Dec 06 06:21:00 crc kubenswrapper[4733]: E1206 06:21:00.192461 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0e43d41-4e58-4467-99d0-b782a2f2d65a" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 06 06:21:00 crc kubenswrapper[4733]: I1206 06:21:00.192486 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0e43d41-4e58-4467-99d0-b782a2f2d65a" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 06 06:21:00 crc kubenswrapper[4733]: E1206 06:21:00.192508 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="268a0cc6-850b-4f97-9265-ff0b9c9ae930" containerName="registry-server" Dec 06 06:21:00 crc kubenswrapper[4733]: I1206 06:21:00.192516 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="268a0cc6-850b-4f97-9265-ff0b9c9ae930" containerName="registry-server" Dec 06 06:21:00 crc kubenswrapper[4733]: E1206 06:21:00.192534 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="268a0cc6-850b-4f97-9265-ff0b9c9ae930" containerName="extract-content" Dec 06 06:21:00 crc kubenswrapper[4733]: I1206 06:21:00.192540 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="268a0cc6-850b-4f97-9265-ff0b9c9ae930" containerName="extract-content" Dec 06 06:21:00 crc kubenswrapper[4733]: E1206 06:21:00.192556 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="268a0cc6-850b-4f97-9265-ff0b9c9ae930" containerName="extract-utilities" Dec 06 06:21:00 crc kubenswrapper[4733]: I1206 06:21:00.192565 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="268a0cc6-850b-4f97-9265-ff0b9c9ae930" containerName="extract-utilities" Dec 06 06:21:00 crc kubenswrapper[4733]: I1206 06:21:00.192887 4733 
memory_manager.go:354] "RemoveStaleState removing state" podUID="268a0cc6-850b-4f97-9265-ff0b9c9ae930" containerName="registry-server" Dec 06 06:21:00 crc kubenswrapper[4733]: I1206 06:21:00.192929 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0e43d41-4e58-4467-99d0-b782a2f2d65a" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 06 06:21:00 crc kubenswrapper[4733]: I1206 06:21:00.196569 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nn22m" Dec 06 06:21:00 crc kubenswrapper[4733]: I1206 06:21:00.201688 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nn22m"] Dec 06 06:21:00 crc kubenswrapper[4733]: I1206 06:21:00.276813 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fz6p\" (UniqueName: \"kubernetes.io/projected/e5f7c1b2-b396-4f63-8333-e7db6e8d9f41-kube-api-access-8fz6p\") pod \"redhat-operators-nn22m\" (UID: \"e5f7c1b2-b396-4f63-8333-e7db6e8d9f41\") " pod="openshift-marketplace/redhat-operators-nn22m" Dec 06 06:21:00 crc kubenswrapper[4733]: I1206 06:21:00.277009 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5f7c1b2-b396-4f63-8333-e7db6e8d9f41-utilities\") pod \"redhat-operators-nn22m\" (UID: \"e5f7c1b2-b396-4f63-8333-e7db6e8d9f41\") " pod="openshift-marketplace/redhat-operators-nn22m" Dec 06 06:21:00 crc kubenswrapper[4733]: I1206 06:21:00.277049 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5f7c1b2-b396-4f63-8333-e7db6e8d9f41-catalog-content\") pod \"redhat-operators-nn22m\" (UID: \"e5f7c1b2-b396-4f63-8333-e7db6e8d9f41\") " pod="openshift-marketplace/redhat-operators-nn22m" Dec 06 06:21:00 crc kubenswrapper[4733]: 
I1206 06:21:00.378710 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5f7c1b2-b396-4f63-8333-e7db6e8d9f41-utilities\") pod \"redhat-operators-nn22m\" (UID: \"e5f7c1b2-b396-4f63-8333-e7db6e8d9f41\") " pod="openshift-marketplace/redhat-operators-nn22m" Dec 06 06:21:00 crc kubenswrapper[4733]: I1206 06:21:00.378765 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5f7c1b2-b396-4f63-8333-e7db6e8d9f41-catalog-content\") pod \"redhat-operators-nn22m\" (UID: \"e5f7c1b2-b396-4f63-8333-e7db6e8d9f41\") " pod="openshift-marketplace/redhat-operators-nn22m" Dec 06 06:21:00 crc kubenswrapper[4733]: I1206 06:21:00.378867 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fz6p\" (UniqueName: \"kubernetes.io/projected/e5f7c1b2-b396-4f63-8333-e7db6e8d9f41-kube-api-access-8fz6p\") pod \"redhat-operators-nn22m\" (UID: \"e5f7c1b2-b396-4f63-8333-e7db6e8d9f41\") " pod="openshift-marketplace/redhat-operators-nn22m" Dec 06 06:21:00 crc kubenswrapper[4733]: I1206 06:21:00.379246 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5f7c1b2-b396-4f63-8333-e7db6e8d9f41-utilities\") pod \"redhat-operators-nn22m\" (UID: \"e5f7c1b2-b396-4f63-8333-e7db6e8d9f41\") " pod="openshift-marketplace/redhat-operators-nn22m" Dec 06 06:21:00 crc kubenswrapper[4733]: I1206 06:21:00.379255 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5f7c1b2-b396-4f63-8333-e7db6e8d9f41-catalog-content\") pod \"redhat-operators-nn22m\" (UID: \"e5f7c1b2-b396-4f63-8333-e7db6e8d9f41\") " pod="openshift-marketplace/redhat-operators-nn22m" Dec 06 06:21:00 crc kubenswrapper[4733]: I1206 06:21:00.396585 4733 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-8fz6p\" (UniqueName: \"kubernetes.io/projected/e5f7c1b2-b396-4f63-8333-e7db6e8d9f41-kube-api-access-8fz6p\") pod \"redhat-operators-nn22m\" (UID: \"e5f7c1b2-b396-4f63-8333-e7db6e8d9f41\") " pod="openshift-marketplace/redhat-operators-nn22m" Dec 06 06:21:00 crc kubenswrapper[4733]: I1206 06:21:00.518539 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nn22m" Dec 06 06:21:00 crc kubenswrapper[4733]: I1206 06:21:00.921175 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nn22m"] Dec 06 06:21:01 crc kubenswrapper[4733]: I1206 06:21:01.681556 4733 generic.go:334] "Generic (PLEG): container finished" podID="e5f7c1b2-b396-4f63-8333-e7db6e8d9f41" containerID="e8d68529c4a4ed97d84d4e974190a732571d9a73c4974c1d22b0adac40729ee7" exitCode=0 Dec 06 06:21:01 crc kubenswrapper[4733]: I1206 06:21:01.681672 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nn22m" event={"ID":"e5f7c1b2-b396-4f63-8333-e7db6e8d9f41","Type":"ContainerDied","Data":"e8d68529c4a4ed97d84d4e974190a732571d9a73c4974c1d22b0adac40729ee7"} Dec 06 06:21:01 crc kubenswrapper[4733]: I1206 06:21:01.681889 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nn22m" event={"ID":"e5f7c1b2-b396-4f63-8333-e7db6e8d9f41","Type":"ContainerStarted","Data":"8ab30fee19b82c8baa3afdfd1934c53ab27dbcc668997a3dbfbed7a4ed42c20d"} Dec 06 06:21:02 crc kubenswrapper[4733]: I1206 06:21:02.693788 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nn22m" event={"ID":"e5f7c1b2-b396-4f63-8333-e7db6e8d9f41","Type":"ContainerStarted","Data":"66591cf9f3cc865ac98d3e6b5cba8e1e7b37305a97a61668357a20d780c3baae"} Dec 06 06:21:03 crc kubenswrapper[4733]: I1206 06:21:03.702824 4733 generic.go:334] "Generic (PLEG): container finished" 
podID="e5f7c1b2-b396-4f63-8333-e7db6e8d9f41" containerID="66591cf9f3cc865ac98d3e6b5cba8e1e7b37305a97a61668357a20d780c3baae" exitCode=0 Dec 06 06:21:03 crc kubenswrapper[4733]: I1206 06:21:03.702919 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nn22m" event={"ID":"e5f7c1b2-b396-4f63-8333-e7db6e8d9f41","Type":"ContainerDied","Data":"66591cf9f3cc865ac98d3e6b5cba8e1e7b37305a97a61668357a20d780c3baae"} Dec 06 06:21:04 crc kubenswrapper[4733]: I1206 06:21:04.715569 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nn22m" event={"ID":"e5f7c1b2-b396-4f63-8333-e7db6e8d9f41","Type":"ContainerStarted","Data":"4762c9673e0d30466089322b7081fe58f1015adb59fd8ee88195d13e64776a1f"} Dec 06 06:21:04 crc kubenswrapper[4733]: I1206 06:21:04.738995 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nn22m" podStartSLOduration=2.2676252359999998 podStartE2EDuration="4.738983234s" podCreationTimestamp="2025-12-06 06:21:00 +0000 UTC" firstStartedPulling="2025-12-06 06:21:01.683519807 +0000 UTC m=+2245.548730919" lastFinishedPulling="2025-12-06 06:21:04.154877805 +0000 UTC m=+2248.020088917" observedRunningTime="2025-12-06 06:21:04.733590955 +0000 UTC m=+2248.598802067" watchObservedRunningTime="2025-12-06 06:21:04.738983234 +0000 UTC m=+2248.604194345" Dec 06 06:21:10 crc kubenswrapper[4733]: I1206 06:21:10.519713 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nn22m" Dec 06 06:21:10 crc kubenswrapper[4733]: I1206 06:21:10.520057 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nn22m" Dec 06 06:21:10 crc kubenswrapper[4733]: I1206 06:21:10.559341 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nn22m" Dec 06 06:21:10 crc 
kubenswrapper[4733]: I1206 06:21:10.807163 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nn22m" Dec 06 06:21:10 crc kubenswrapper[4733]: I1206 06:21:10.855892 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nn22m"] Dec 06 06:21:12 crc kubenswrapper[4733]: I1206 06:21:12.789711 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nn22m" podUID="e5f7c1b2-b396-4f63-8333-e7db6e8d9f41" containerName="registry-server" containerID="cri-o://4762c9673e0d30466089322b7081fe58f1015adb59fd8ee88195d13e64776a1f" gracePeriod=2 Dec 06 06:21:13 crc kubenswrapper[4733]: I1206 06:21:13.191961 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nn22m" Dec 06 06:21:13 crc kubenswrapper[4733]: I1206 06:21:13.236462 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fz6p\" (UniqueName: \"kubernetes.io/projected/e5f7c1b2-b396-4f63-8333-e7db6e8d9f41-kube-api-access-8fz6p\") pod \"e5f7c1b2-b396-4f63-8333-e7db6e8d9f41\" (UID: \"e5f7c1b2-b396-4f63-8333-e7db6e8d9f41\") " Dec 06 06:21:13 crc kubenswrapper[4733]: I1206 06:21:13.236724 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5f7c1b2-b396-4f63-8333-e7db6e8d9f41-catalog-content\") pod \"e5f7c1b2-b396-4f63-8333-e7db6e8d9f41\" (UID: \"e5f7c1b2-b396-4f63-8333-e7db6e8d9f41\") " Dec 06 06:21:13 crc kubenswrapper[4733]: I1206 06:21:13.236927 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5f7c1b2-b396-4f63-8333-e7db6e8d9f41-utilities\") pod \"e5f7c1b2-b396-4f63-8333-e7db6e8d9f41\" (UID: \"e5f7c1b2-b396-4f63-8333-e7db6e8d9f41\") " Dec 06 06:21:13 crc 
kubenswrapper[4733]: I1206 06:21:13.237758 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5f7c1b2-b396-4f63-8333-e7db6e8d9f41-utilities" (OuterVolumeSpecName: "utilities") pod "e5f7c1b2-b396-4f63-8333-e7db6e8d9f41" (UID: "e5f7c1b2-b396-4f63-8333-e7db6e8d9f41"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:21:13 crc kubenswrapper[4733]: I1206 06:21:13.246411 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5f7c1b2-b396-4f63-8333-e7db6e8d9f41-kube-api-access-8fz6p" (OuterVolumeSpecName: "kube-api-access-8fz6p") pod "e5f7c1b2-b396-4f63-8333-e7db6e8d9f41" (UID: "e5f7c1b2-b396-4f63-8333-e7db6e8d9f41"). InnerVolumeSpecName "kube-api-access-8fz6p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:21:13 crc kubenswrapper[4733]: I1206 06:21:13.328997 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5f7c1b2-b396-4f63-8333-e7db6e8d9f41-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e5f7c1b2-b396-4f63-8333-e7db6e8d9f41" (UID: "e5f7c1b2-b396-4f63-8333-e7db6e8d9f41"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:21:13 crc kubenswrapper[4733]: I1206 06:21:13.339150 4733 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5f7c1b2-b396-4f63-8333-e7db6e8d9f41-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 06:21:13 crc kubenswrapper[4733]: I1206 06:21:13.339186 4733 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5f7c1b2-b396-4f63-8333-e7db6e8d9f41-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 06:21:13 crc kubenswrapper[4733]: I1206 06:21:13.339199 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fz6p\" (UniqueName: \"kubernetes.io/projected/e5f7c1b2-b396-4f63-8333-e7db6e8d9f41-kube-api-access-8fz6p\") on node \"crc\" DevicePath \"\"" Dec 06 06:21:13 crc kubenswrapper[4733]: I1206 06:21:13.802792 4733 generic.go:334] "Generic (PLEG): container finished" podID="e5f7c1b2-b396-4f63-8333-e7db6e8d9f41" containerID="4762c9673e0d30466089322b7081fe58f1015adb59fd8ee88195d13e64776a1f" exitCode=0 Dec 06 06:21:13 crc kubenswrapper[4733]: I1206 06:21:13.802861 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nn22m" event={"ID":"e5f7c1b2-b396-4f63-8333-e7db6e8d9f41","Type":"ContainerDied","Data":"4762c9673e0d30466089322b7081fe58f1015adb59fd8ee88195d13e64776a1f"} Dec 06 06:21:13 crc kubenswrapper[4733]: I1206 06:21:13.803194 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nn22m" event={"ID":"e5f7c1b2-b396-4f63-8333-e7db6e8d9f41","Type":"ContainerDied","Data":"8ab30fee19b82c8baa3afdfd1934c53ab27dbcc668997a3dbfbed7a4ed42c20d"} Dec 06 06:21:13 crc kubenswrapper[4733]: I1206 06:21:13.803229 4733 scope.go:117] "RemoveContainer" containerID="4762c9673e0d30466089322b7081fe58f1015adb59fd8ee88195d13e64776a1f" Dec 06 06:21:13 crc kubenswrapper[4733]: I1206 06:21:13.802894 
4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nn22m" Dec 06 06:21:13 crc kubenswrapper[4733]: I1206 06:21:13.828903 4733 scope.go:117] "RemoveContainer" containerID="66591cf9f3cc865ac98d3e6b5cba8e1e7b37305a97a61668357a20d780c3baae" Dec 06 06:21:13 crc kubenswrapper[4733]: I1206 06:21:13.837051 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nn22m"] Dec 06 06:21:13 crc kubenswrapper[4733]: I1206 06:21:13.845060 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nn22m"] Dec 06 06:21:13 crc kubenswrapper[4733]: I1206 06:21:13.862096 4733 scope.go:117] "RemoveContainer" containerID="e8d68529c4a4ed97d84d4e974190a732571d9a73c4974c1d22b0adac40729ee7" Dec 06 06:21:13 crc kubenswrapper[4733]: I1206 06:21:13.879979 4733 scope.go:117] "RemoveContainer" containerID="4762c9673e0d30466089322b7081fe58f1015adb59fd8ee88195d13e64776a1f" Dec 06 06:21:13 crc kubenswrapper[4733]: E1206 06:21:13.880462 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4762c9673e0d30466089322b7081fe58f1015adb59fd8ee88195d13e64776a1f\": container with ID starting with 4762c9673e0d30466089322b7081fe58f1015adb59fd8ee88195d13e64776a1f not found: ID does not exist" containerID="4762c9673e0d30466089322b7081fe58f1015adb59fd8ee88195d13e64776a1f" Dec 06 06:21:13 crc kubenswrapper[4733]: I1206 06:21:13.880513 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4762c9673e0d30466089322b7081fe58f1015adb59fd8ee88195d13e64776a1f"} err="failed to get container status \"4762c9673e0d30466089322b7081fe58f1015adb59fd8ee88195d13e64776a1f\": rpc error: code = NotFound desc = could not find container \"4762c9673e0d30466089322b7081fe58f1015adb59fd8ee88195d13e64776a1f\": container with ID starting with 
4762c9673e0d30466089322b7081fe58f1015adb59fd8ee88195d13e64776a1f not found: ID does not exist" Dec 06 06:21:13 crc kubenswrapper[4733]: I1206 06:21:13.880545 4733 scope.go:117] "RemoveContainer" containerID="66591cf9f3cc865ac98d3e6b5cba8e1e7b37305a97a61668357a20d780c3baae" Dec 06 06:21:13 crc kubenswrapper[4733]: E1206 06:21:13.881061 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66591cf9f3cc865ac98d3e6b5cba8e1e7b37305a97a61668357a20d780c3baae\": container with ID starting with 66591cf9f3cc865ac98d3e6b5cba8e1e7b37305a97a61668357a20d780c3baae not found: ID does not exist" containerID="66591cf9f3cc865ac98d3e6b5cba8e1e7b37305a97a61668357a20d780c3baae" Dec 06 06:21:13 crc kubenswrapper[4733]: I1206 06:21:13.881168 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66591cf9f3cc865ac98d3e6b5cba8e1e7b37305a97a61668357a20d780c3baae"} err="failed to get container status \"66591cf9f3cc865ac98d3e6b5cba8e1e7b37305a97a61668357a20d780c3baae\": rpc error: code = NotFound desc = could not find container \"66591cf9f3cc865ac98d3e6b5cba8e1e7b37305a97a61668357a20d780c3baae\": container with ID starting with 66591cf9f3cc865ac98d3e6b5cba8e1e7b37305a97a61668357a20d780c3baae not found: ID does not exist" Dec 06 06:21:13 crc kubenswrapper[4733]: I1206 06:21:13.881249 4733 scope.go:117] "RemoveContainer" containerID="e8d68529c4a4ed97d84d4e974190a732571d9a73c4974c1d22b0adac40729ee7" Dec 06 06:21:13 crc kubenswrapper[4733]: E1206 06:21:13.881645 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8d68529c4a4ed97d84d4e974190a732571d9a73c4974c1d22b0adac40729ee7\": container with ID starting with e8d68529c4a4ed97d84d4e974190a732571d9a73c4974c1d22b0adac40729ee7 not found: ID does not exist" containerID="e8d68529c4a4ed97d84d4e974190a732571d9a73c4974c1d22b0adac40729ee7" Dec 06 06:21:13 crc 
kubenswrapper[4733]: I1206 06:21:13.881739 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8d68529c4a4ed97d84d4e974190a732571d9a73c4974c1d22b0adac40729ee7"} err="failed to get container status \"e8d68529c4a4ed97d84d4e974190a732571d9a73c4974c1d22b0adac40729ee7\": rpc error: code = NotFound desc = could not find container \"e8d68529c4a4ed97d84d4e974190a732571d9a73c4974c1d22b0adac40729ee7\": container with ID starting with e8d68529c4a4ed97d84d4e974190a732571d9a73c4974c1d22b0adac40729ee7 not found: ID does not exist" Dec 06 06:21:14 crc kubenswrapper[4733]: I1206 06:21:14.494139 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5f7c1b2-b396-4f63-8333-e7db6e8d9f41" path="/var/lib/kubelet/pods/e5f7c1b2-b396-4f63-8333-e7db6e8d9f41/volumes" Dec 06 06:21:35 crc kubenswrapper[4733]: I1206 06:21:35.746487 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Dec 06 06:21:35 crc kubenswrapper[4733]: E1206 06:21:35.748075 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5f7c1b2-b396-4f63-8333-e7db6e8d9f41" containerName="registry-server" Dec 06 06:21:35 crc kubenswrapper[4733]: I1206 06:21:35.748098 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5f7c1b2-b396-4f63-8333-e7db6e8d9f41" containerName="registry-server" Dec 06 06:21:35 crc kubenswrapper[4733]: E1206 06:21:35.748146 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5f7c1b2-b396-4f63-8333-e7db6e8d9f41" containerName="extract-utilities" Dec 06 06:21:35 crc kubenswrapper[4733]: I1206 06:21:35.748154 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5f7c1b2-b396-4f63-8333-e7db6e8d9f41" containerName="extract-utilities" Dec 06 06:21:35 crc kubenswrapper[4733]: E1206 06:21:35.748172 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5f7c1b2-b396-4f63-8333-e7db6e8d9f41" containerName="extract-content" Dec 06 06:21:35 crc 
kubenswrapper[4733]: I1206 06:21:35.748179 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5f7c1b2-b396-4f63-8333-e7db6e8d9f41" containerName="extract-content" Dec 06 06:21:35 crc kubenswrapper[4733]: I1206 06:21:35.748461 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5f7c1b2-b396-4f63-8333-e7db6e8d9f41" containerName="registry-server" Dec 06 06:21:35 crc kubenswrapper[4733]: I1206 06:21:35.749586 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 06 06:21:35 crc kubenswrapper[4733]: I1206 06:21:35.752438 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Dec 06 06:21:35 crc kubenswrapper[4733]: I1206 06:21:35.752504 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-gz4r5" Dec 06 06:21:35 crc kubenswrapper[4733]: I1206 06:21:35.752924 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Dec 06 06:21:35 crc kubenswrapper[4733]: I1206 06:21:35.754203 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 06 06:21:35 crc kubenswrapper[4733]: I1206 06:21:35.758127 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 06 06:21:35 crc kubenswrapper[4733]: I1206 06:21:35.882314 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/677c0cf0-716e-467c-ac8b-0cd446fb11ed-config-data\") pod \"tempest-tests-tempest\" (UID: \"677c0cf0-716e-467c-ac8b-0cd446fb11ed\") " pod="openstack/tempest-tests-tempest" Dec 06 06:21:35 crc kubenswrapper[4733]: I1206 06:21:35.882366 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/677c0cf0-716e-467c-ac8b-0cd446fb11ed-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"677c0cf0-716e-467c-ac8b-0cd446fb11ed\") " pod="openstack/tempest-tests-tempest" Dec 06 06:21:35 crc kubenswrapper[4733]: I1206 06:21:35.882443 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6svrh\" (UniqueName: \"kubernetes.io/projected/677c0cf0-716e-467c-ac8b-0cd446fb11ed-kube-api-access-6svrh\") pod \"tempest-tests-tempest\" (UID: \"677c0cf0-716e-467c-ac8b-0cd446fb11ed\") " pod="openstack/tempest-tests-tempest" Dec 06 06:21:35 crc kubenswrapper[4733]: I1206 06:21:35.882506 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/677c0cf0-716e-467c-ac8b-0cd446fb11ed-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"677c0cf0-716e-467c-ac8b-0cd446fb11ed\") " pod="openstack/tempest-tests-tempest" Dec 06 06:21:35 crc kubenswrapper[4733]: I1206 06:21:35.882603 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/677c0cf0-716e-467c-ac8b-0cd446fb11ed-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"677c0cf0-716e-467c-ac8b-0cd446fb11ed\") " pod="openstack/tempest-tests-tempest" Dec 06 06:21:35 crc kubenswrapper[4733]: I1206 06:21:35.882668 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/677c0cf0-716e-467c-ac8b-0cd446fb11ed-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"677c0cf0-716e-467c-ac8b-0cd446fb11ed\") " pod="openstack/tempest-tests-tempest" Dec 06 06:21:35 crc kubenswrapper[4733]: I1206 06:21:35.882735 4733 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/677c0cf0-716e-467c-ac8b-0cd446fb11ed-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"677c0cf0-716e-467c-ac8b-0cd446fb11ed\") " pod="openstack/tempest-tests-tempest" Dec 06 06:21:35 crc kubenswrapper[4733]: I1206 06:21:35.882837 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tempest-tests-tempest\" (UID: \"677c0cf0-716e-467c-ac8b-0cd446fb11ed\") " pod="openstack/tempest-tests-tempest" Dec 06 06:21:35 crc kubenswrapper[4733]: I1206 06:21:35.882894 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/677c0cf0-716e-467c-ac8b-0cd446fb11ed-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"677c0cf0-716e-467c-ac8b-0cd446fb11ed\") " pod="openstack/tempest-tests-tempest" Dec 06 06:21:35 crc kubenswrapper[4733]: I1206 06:21:35.983777 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/677c0cf0-716e-467c-ac8b-0cd446fb11ed-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"677c0cf0-716e-467c-ac8b-0cd446fb11ed\") " pod="openstack/tempest-tests-tempest" Dec 06 06:21:35 crc kubenswrapper[4733]: I1206 06:21:35.984083 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/677c0cf0-716e-467c-ac8b-0cd446fb11ed-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"677c0cf0-716e-467c-ac8b-0cd446fb11ed\") " pod="openstack/tempest-tests-tempest" Dec 06 06:21:35 crc kubenswrapper[4733]: I1206 06:21:35.984132 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/677c0cf0-716e-467c-ac8b-0cd446fb11ed-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"677c0cf0-716e-467c-ac8b-0cd446fb11ed\") " pod="openstack/tempest-tests-tempest" Dec 06 06:21:35 crc kubenswrapper[4733]: I1206 06:21:35.984212 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tempest-tests-tempest\" (UID: \"677c0cf0-716e-467c-ac8b-0cd446fb11ed\") " pod="openstack/tempest-tests-tempest" Dec 06 06:21:35 crc kubenswrapper[4733]: I1206 06:21:35.984236 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/677c0cf0-716e-467c-ac8b-0cd446fb11ed-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"677c0cf0-716e-467c-ac8b-0cd446fb11ed\") " pod="openstack/tempest-tests-tempest" Dec 06 06:21:35 crc kubenswrapper[4733]: I1206 06:21:35.984269 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/677c0cf0-716e-467c-ac8b-0cd446fb11ed-config-data\") pod \"tempest-tests-tempest\" (UID: \"677c0cf0-716e-467c-ac8b-0cd446fb11ed\") " pod="openstack/tempest-tests-tempest" Dec 06 06:21:35 crc kubenswrapper[4733]: I1206 06:21:35.984293 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/677c0cf0-716e-467c-ac8b-0cd446fb11ed-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"677c0cf0-716e-467c-ac8b-0cd446fb11ed\") " pod="openstack/tempest-tests-tempest" Dec 06 06:21:35 crc kubenswrapper[4733]: I1206 06:21:35.984346 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6svrh\" (UniqueName: 
\"kubernetes.io/projected/677c0cf0-716e-467c-ac8b-0cd446fb11ed-kube-api-access-6svrh\") pod \"tempest-tests-tempest\" (UID: \"677c0cf0-716e-467c-ac8b-0cd446fb11ed\") " pod="openstack/tempest-tests-tempest" Dec 06 06:21:35 crc kubenswrapper[4733]: I1206 06:21:35.984401 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/677c0cf0-716e-467c-ac8b-0cd446fb11ed-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"677c0cf0-716e-467c-ac8b-0cd446fb11ed\") " pod="openstack/tempest-tests-tempest" Dec 06 06:21:35 crc kubenswrapper[4733]: I1206 06:21:35.984977 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/677c0cf0-716e-467c-ac8b-0cd446fb11ed-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"677c0cf0-716e-467c-ac8b-0cd446fb11ed\") " pod="openstack/tempest-tests-tempest" Dec 06 06:21:35 crc kubenswrapper[4733]: I1206 06:21:35.985787 4733 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tempest-tests-tempest\" (UID: \"677c0cf0-716e-467c-ac8b-0cd446fb11ed\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/tempest-tests-tempest" Dec 06 06:21:35 crc kubenswrapper[4733]: I1206 06:21:35.985874 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/677c0cf0-716e-467c-ac8b-0cd446fb11ed-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"677c0cf0-716e-467c-ac8b-0cd446fb11ed\") " pod="openstack/tempest-tests-tempest" Dec 06 06:21:35 crc kubenswrapper[4733]: I1206 06:21:35.986231 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: 
\"kubernetes.io/empty-dir/677c0cf0-716e-467c-ac8b-0cd446fb11ed-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"677c0cf0-716e-467c-ac8b-0cd446fb11ed\") " pod="openstack/tempest-tests-tempest" Dec 06 06:21:35 crc kubenswrapper[4733]: I1206 06:21:35.986808 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/677c0cf0-716e-467c-ac8b-0cd446fb11ed-config-data\") pod \"tempest-tests-tempest\" (UID: \"677c0cf0-716e-467c-ac8b-0cd446fb11ed\") " pod="openstack/tempest-tests-tempest" Dec 06 06:21:35 crc kubenswrapper[4733]: I1206 06:21:35.991798 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/677c0cf0-716e-467c-ac8b-0cd446fb11ed-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"677c0cf0-716e-467c-ac8b-0cd446fb11ed\") " pod="openstack/tempest-tests-tempest" Dec 06 06:21:35 crc kubenswrapper[4733]: I1206 06:21:35.992422 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/677c0cf0-716e-467c-ac8b-0cd446fb11ed-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"677c0cf0-716e-467c-ac8b-0cd446fb11ed\") " pod="openstack/tempest-tests-tempest" Dec 06 06:21:35 crc kubenswrapper[4733]: I1206 06:21:35.993083 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/677c0cf0-716e-467c-ac8b-0cd446fb11ed-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"677c0cf0-716e-467c-ac8b-0cd446fb11ed\") " pod="openstack/tempest-tests-tempest" Dec 06 06:21:36 crc kubenswrapper[4733]: I1206 06:21:36.001501 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6svrh\" (UniqueName: \"kubernetes.io/projected/677c0cf0-716e-467c-ac8b-0cd446fb11ed-kube-api-access-6svrh\") pod \"tempest-tests-tempest\" (UID: 
\"677c0cf0-716e-467c-ac8b-0cd446fb11ed\") " pod="openstack/tempest-tests-tempest" Dec 06 06:21:36 crc kubenswrapper[4733]: I1206 06:21:36.013334 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tempest-tests-tempest\" (UID: \"677c0cf0-716e-467c-ac8b-0cd446fb11ed\") " pod="openstack/tempest-tests-tempest" Dec 06 06:21:36 crc kubenswrapper[4733]: I1206 06:21:36.069239 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 06 06:21:36 crc kubenswrapper[4733]: I1206 06:21:36.484553 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 06 06:21:37 crc kubenswrapper[4733]: I1206 06:21:37.030668 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"677c0cf0-716e-467c-ac8b-0cd446fb11ed","Type":"ContainerStarted","Data":"c6c4dcc05a635a0f9866008f7ae0a82008896c96885962263ff750ebc6af521e"} Dec 06 06:21:52 crc kubenswrapper[4733]: I1206 06:21:52.720519 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 06 06:21:54 crc kubenswrapper[4733]: I1206 06:21:54.216540 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"677c0cf0-716e-467c-ac8b-0cd446fb11ed","Type":"ContainerStarted","Data":"0f5bdb5ba39a3245104201cf3960d47cc685b5696f3fd7b9ea7807a9d03656e8"} Dec 06 06:21:54 crc kubenswrapper[4733]: I1206 06:21:54.244334 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.020371093 podStartE2EDuration="20.244292804s" podCreationTimestamp="2025-12-06 06:21:34 +0000 UTC" firstStartedPulling="2025-12-06 06:21:36.494406107 +0000 UTC m=+2280.359617217" lastFinishedPulling="2025-12-06 06:21:52.718327817 +0000 UTC 
m=+2296.583538928" observedRunningTime="2025-12-06 06:21:54.235144193 +0000 UTC m=+2298.100355305" watchObservedRunningTime="2025-12-06 06:21:54.244292804 +0000 UTC m=+2298.109503916" Dec 06 06:22:42 crc kubenswrapper[4733]: I1206 06:22:42.989035 4733 patch_prober.go:28] interesting pod/machine-config-daemon-g7qjx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 06:22:42 crc kubenswrapper[4733]: I1206 06:22:42.989738 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 06:23:12 crc kubenswrapper[4733]: I1206 06:23:12.989249 4733 patch_prober.go:28] interesting pod/machine-config-daemon-g7qjx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 06:23:12 crc kubenswrapper[4733]: I1206 06:23:12.990026 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 06:23:42 crc kubenswrapper[4733]: I1206 06:23:42.989125 4733 patch_prober.go:28] interesting pod/machine-config-daemon-g7qjx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Dec 06 06:23:42 crc kubenswrapper[4733]: I1206 06:23:42.989993 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 06:23:42 crc kubenswrapper[4733]: I1206 06:23:42.990055 4733 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" Dec 06 06:23:42 crc kubenswrapper[4733]: I1206 06:23:42.990876 4733 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9ce999f28ba3ed207346dfc647ce3d354abbc70fd027614f9498a4b9f7a14f84"} pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 06:23:42 crc kubenswrapper[4733]: I1206 06:23:42.990935 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerName="machine-config-daemon" containerID="cri-o://9ce999f28ba3ed207346dfc647ce3d354abbc70fd027614f9498a4b9f7a14f84" gracePeriod=600 Dec 06 06:23:43 crc kubenswrapper[4733]: E1206 06:23:43.112521 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:23:43 crc kubenswrapper[4733]: 
I1206 06:23:43.245815 4733 generic.go:334] "Generic (PLEG): container finished" podID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerID="9ce999f28ba3ed207346dfc647ce3d354abbc70fd027614f9498a4b9f7a14f84" exitCode=0 Dec 06 06:23:43 crc kubenswrapper[4733]: I1206 06:23:43.245891 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" event={"ID":"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448","Type":"ContainerDied","Data":"9ce999f28ba3ed207346dfc647ce3d354abbc70fd027614f9498a4b9f7a14f84"} Dec 06 06:23:43 crc kubenswrapper[4733]: I1206 06:23:43.246139 4733 scope.go:117] "RemoveContainer" containerID="e8732ef17439b6776bf46c0bcb254c37c40bb5859ffecc34e09d39650c0b7d3e" Dec 06 06:23:43 crc kubenswrapper[4733]: I1206 06:23:43.246908 4733 scope.go:117] "RemoveContainer" containerID="9ce999f28ba3ed207346dfc647ce3d354abbc70fd027614f9498a4b9f7a14f84" Dec 06 06:23:43 crc kubenswrapper[4733]: E1206 06:23:43.247208 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:23:54 crc kubenswrapper[4733]: I1206 06:23:54.485362 4733 scope.go:117] "RemoveContainer" containerID="9ce999f28ba3ed207346dfc647ce3d354abbc70fd027614f9498a4b9f7a14f84" Dec 06 06:23:54 crc kubenswrapper[4733]: E1206 06:23:54.487416 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:24:05 crc kubenswrapper[4733]: I1206 06:24:05.485578 4733 scope.go:117] "RemoveContainer" containerID="9ce999f28ba3ed207346dfc647ce3d354abbc70fd027614f9498a4b9f7a14f84" Dec 06 06:24:05 crc kubenswrapper[4733]: E1206 06:24:05.486606 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:24:20 crc kubenswrapper[4733]: I1206 06:24:20.486025 4733 scope.go:117] "RemoveContainer" containerID="9ce999f28ba3ed207346dfc647ce3d354abbc70fd027614f9498a4b9f7a14f84" Dec 06 06:24:20 crc kubenswrapper[4733]: E1206 06:24:20.487214 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:24:35 crc kubenswrapper[4733]: I1206 06:24:35.484647 4733 scope.go:117] "RemoveContainer" containerID="9ce999f28ba3ed207346dfc647ce3d354abbc70fd027614f9498a4b9f7a14f84" Dec 06 06:24:35 crc kubenswrapper[4733]: E1206 06:24:35.485537 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:24:49 crc kubenswrapper[4733]: I1206 06:24:49.484952 4733 scope.go:117] "RemoveContainer" containerID="9ce999f28ba3ed207346dfc647ce3d354abbc70fd027614f9498a4b9f7a14f84" Dec 06 06:24:49 crc kubenswrapper[4733]: E1206 06:24:49.486127 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:25:01 crc kubenswrapper[4733]: I1206 06:25:01.484482 4733 scope.go:117] "RemoveContainer" containerID="9ce999f28ba3ed207346dfc647ce3d354abbc70fd027614f9498a4b9f7a14f84" Dec 06 06:25:01 crc kubenswrapper[4733]: E1206 06:25:01.485445 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:25:15 crc kubenswrapper[4733]: I1206 06:25:15.485252 4733 scope.go:117] "RemoveContainer" containerID="9ce999f28ba3ed207346dfc647ce3d354abbc70fd027614f9498a4b9f7a14f84" Dec 06 06:25:15 crc kubenswrapper[4733]: E1206 06:25:15.486094 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:25:30 crc kubenswrapper[4733]: I1206 06:25:30.485230 4733 scope.go:117] "RemoveContainer" containerID="9ce999f28ba3ed207346dfc647ce3d354abbc70fd027614f9498a4b9f7a14f84" Dec 06 06:25:30 crc kubenswrapper[4733]: E1206 06:25:30.486346 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:25:44 crc kubenswrapper[4733]: I1206 06:25:44.485929 4733 scope.go:117] "RemoveContainer" containerID="9ce999f28ba3ed207346dfc647ce3d354abbc70fd027614f9498a4b9f7a14f84" Dec 06 06:25:44 crc kubenswrapper[4733]: E1206 06:25:44.488183 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:25:57 crc kubenswrapper[4733]: I1206 06:25:57.485202 4733 scope.go:117] "RemoveContainer" containerID="9ce999f28ba3ed207346dfc647ce3d354abbc70fd027614f9498a4b9f7a14f84" Dec 06 06:25:57 crc kubenswrapper[4733]: E1206 06:25:57.485990 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:26:10 crc kubenswrapper[4733]: I1206 06:26:10.487515 4733 scope.go:117] "RemoveContainer" containerID="9ce999f28ba3ed207346dfc647ce3d354abbc70fd027614f9498a4b9f7a14f84" Dec 06 06:26:10 crc kubenswrapper[4733]: E1206 06:26:10.490853 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:26:25 crc kubenswrapper[4733]: I1206 06:26:25.485470 4733 scope.go:117] "RemoveContainer" containerID="9ce999f28ba3ed207346dfc647ce3d354abbc70fd027614f9498a4b9f7a14f84" Dec 06 06:26:25 crc kubenswrapper[4733]: E1206 06:26:25.486569 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:26:36 crc kubenswrapper[4733]: I1206 06:26:36.490866 4733 scope.go:117] "RemoveContainer" containerID="9ce999f28ba3ed207346dfc647ce3d354abbc70fd027614f9498a4b9f7a14f84" Dec 06 06:26:36 crc kubenswrapper[4733]: E1206 06:26:36.491993 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:26:48 crc kubenswrapper[4733]: I1206 06:26:48.484584 4733 scope.go:117] "RemoveContainer" containerID="9ce999f28ba3ed207346dfc647ce3d354abbc70fd027614f9498a4b9f7a14f84" Dec 06 06:26:48 crc kubenswrapper[4733]: E1206 06:26:48.486944 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:27:00 crc kubenswrapper[4733]: I1206 06:27:00.486356 4733 scope.go:117] "RemoveContainer" containerID="9ce999f28ba3ed207346dfc647ce3d354abbc70fd027614f9498a4b9f7a14f84" Dec 06 06:27:00 crc kubenswrapper[4733]: E1206 06:27:00.487442 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:27:15 crc kubenswrapper[4733]: I1206 06:27:15.485010 4733 scope.go:117] "RemoveContainer" containerID="9ce999f28ba3ed207346dfc647ce3d354abbc70fd027614f9498a4b9f7a14f84" Dec 06 06:27:15 crc kubenswrapper[4733]: E1206 06:27:15.485878 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:27:30 crc kubenswrapper[4733]: I1206 06:27:30.484828 4733 scope.go:117] "RemoveContainer" containerID="9ce999f28ba3ed207346dfc647ce3d354abbc70fd027614f9498a4b9f7a14f84" Dec 06 06:27:30 crc kubenswrapper[4733]: E1206 06:27:30.485782 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:27:41 crc kubenswrapper[4733]: I1206 06:27:41.485465 4733 scope.go:117] "RemoveContainer" containerID="9ce999f28ba3ed207346dfc647ce3d354abbc70fd027614f9498a4b9f7a14f84" Dec 06 06:27:41 crc kubenswrapper[4733]: E1206 06:27:41.486350 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:27:53 crc kubenswrapper[4733]: I1206 06:27:53.485504 4733 scope.go:117] "RemoveContainer" containerID="9ce999f28ba3ed207346dfc647ce3d354abbc70fd027614f9498a4b9f7a14f84" Dec 06 06:27:53 crc kubenswrapper[4733]: E1206 06:27:53.486443 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:28:07 crc kubenswrapper[4733]: I1206 06:28:07.486061 4733 scope.go:117] "RemoveContainer" containerID="9ce999f28ba3ed207346dfc647ce3d354abbc70fd027614f9498a4b9f7a14f84" Dec 06 06:28:07 crc kubenswrapper[4733]: E1206 06:28:07.487164 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:28:18 crc kubenswrapper[4733]: I1206 06:28:18.484423 4733 scope.go:117] "RemoveContainer" containerID="9ce999f28ba3ed207346dfc647ce3d354abbc70fd027614f9498a4b9f7a14f84" Dec 06 06:28:18 crc kubenswrapper[4733]: E1206 06:28:18.485385 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:28:31 crc kubenswrapper[4733]: I1206 06:28:31.484950 4733 scope.go:117] "RemoveContainer" containerID="9ce999f28ba3ed207346dfc647ce3d354abbc70fd027614f9498a4b9f7a14f84" Dec 06 06:28:31 crc kubenswrapper[4733]: E1206 06:28:31.485899 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:28:45 crc kubenswrapper[4733]: I1206 06:28:45.484470 4733 scope.go:117] "RemoveContainer" containerID="9ce999f28ba3ed207346dfc647ce3d354abbc70fd027614f9498a4b9f7a14f84" Dec 06 06:28:46 crc kubenswrapper[4733]: I1206 06:28:46.063320 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" event={"ID":"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448","Type":"ContainerStarted","Data":"89fe3837db40c5ded85d4685d49c79313ccdd8984a9c4dfbcd4fc13fbc674215"} Dec 06 06:29:35 crc kubenswrapper[4733]: I1206 06:29:35.845701 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xmw44"] Dec 06 06:29:35 crc kubenswrapper[4733]: I1206 06:29:35.848325 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xmw44" Dec 06 06:29:35 crc kubenswrapper[4733]: I1206 06:29:35.856901 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xmw44"] Dec 06 06:29:36 crc kubenswrapper[4733]: I1206 06:29:36.044484 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/463a1797-46a8-4a46-b14a-4f36f5708a60-catalog-content\") pod \"community-operators-xmw44\" (UID: \"463a1797-46a8-4a46-b14a-4f36f5708a60\") " pod="openshift-marketplace/community-operators-xmw44" Dec 06 06:29:36 crc kubenswrapper[4733]: I1206 06:29:36.044560 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/463a1797-46a8-4a46-b14a-4f36f5708a60-utilities\") pod \"community-operators-xmw44\" (UID: \"463a1797-46a8-4a46-b14a-4f36f5708a60\") " pod="openshift-marketplace/community-operators-xmw44" Dec 06 06:29:36 crc kubenswrapper[4733]: I1206 06:29:36.044781 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dz2h2\" (UniqueName: \"kubernetes.io/projected/463a1797-46a8-4a46-b14a-4f36f5708a60-kube-api-access-dz2h2\") pod \"community-operators-xmw44\" (UID: \"463a1797-46a8-4a46-b14a-4f36f5708a60\") " pod="openshift-marketplace/community-operators-xmw44" Dec 06 06:29:36 crc kubenswrapper[4733]: I1206 06:29:36.146671 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dz2h2\" (UniqueName: \"kubernetes.io/projected/463a1797-46a8-4a46-b14a-4f36f5708a60-kube-api-access-dz2h2\") pod \"community-operators-xmw44\" (UID: \"463a1797-46a8-4a46-b14a-4f36f5708a60\") " pod="openshift-marketplace/community-operators-xmw44" Dec 06 06:29:36 crc kubenswrapper[4733]: I1206 06:29:36.147027 4733 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/463a1797-46a8-4a46-b14a-4f36f5708a60-catalog-content\") pod \"community-operators-xmw44\" (UID: \"463a1797-46a8-4a46-b14a-4f36f5708a60\") " pod="openshift-marketplace/community-operators-xmw44" Dec 06 06:29:36 crc kubenswrapper[4733]: I1206 06:29:36.147063 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/463a1797-46a8-4a46-b14a-4f36f5708a60-utilities\") pod \"community-operators-xmw44\" (UID: \"463a1797-46a8-4a46-b14a-4f36f5708a60\") " pod="openshift-marketplace/community-operators-xmw44" Dec 06 06:29:36 crc kubenswrapper[4733]: I1206 06:29:36.147548 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/463a1797-46a8-4a46-b14a-4f36f5708a60-catalog-content\") pod \"community-operators-xmw44\" (UID: \"463a1797-46a8-4a46-b14a-4f36f5708a60\") " pod="openshift-marketplace/community-operators-xmw44" Dec 06 06:29:36 crc kubenswrapper[4733]: I1206 06:29:36.150352 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/463a1797-46a8-4a46-b14a-4f36f5708a60-utilities\") pod \"community-operators-xmw44\" (UID: \"463a1797-46a8-4a46-b14a-4f36f5708a60\") " pod="openshift-marketplace/community-operators-xmw44" Dec 06 06:29:36 crc kubenswrapper[4733]: I1206 06:29:36.165599 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dz2h2\" (UniqueName: \"kubernetes.io/projected/463a1797-46a8-4a46-b14a-4f36f5708a60-kube-api-access-dz2h2\") pod \"community-operators-xmw44\" (UID: \"463a1797-46a8-4a46-b14a-4f36f5708a60\") " pod="openshift-marketplace/community-operators-xmw44" Dec 06 06:29:36 crc kubenswrapper[4733]: I1206 06:29:36.464276 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xmw44" Dec 06 06:29:36 crc kubenswrapper[4733]: I1206 06:29:36.884347 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xmw44"] Dec 06 06:29:37 crc kubenswrapper[4733]: I1206 06:29:37.559119 4733 generic.go:334] "Generic (PLEG): container finished" podID="463a1797-46a8-4a46-b14a-4f36f5708a60" containerID="b214f662f7c8366b2b7edf7ae27bbfeb28424a605d2cf542cbf53c364c17331b" exitCode=0 Dec 06 06:29:37 crc kubenswrapper[4733]: I1206 06:29:37.559434 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xmw44" event={"ID":"463a1797-46a8-4a46-b14a-4f36f5708a60","Type":"ContainerDied","Data":"b214f662f7c8366b2b7edf7ae27bbfeb28424a605d2cf542cbf53c364c17331b"} Dec 06 06:29:37 crc kubenswrapper[4733]: I1206 06:29:37.559569 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xmw44" event={"ID":"463a1797-46a8-4a46-b14a-4f36f5708a60","Type":"ContainerStarted","Data":"6f07b42e27fd3e01baa0e403f6640210d783b3bc5fb3bc990edf32dc628464cc"} Dec 06 06:29:37 crc kubenswrapper[4733]: I1206 06:29:37.561743 4733 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 06 06:29:38 crc kubenswrapper[4733]: I1206 06:29:38.569783 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xmw44" event={"ID":"463a1797-46a8-4a46-b14a-4f36f5708a60","Type":"ContainerStarted","Data":"97c3969f8100e7e44814ccd254a0ff03f1deaa78bad41a688d002cabc5eff547"} Dec 06 06:29:39 crc kubenswrapper[4733]: I1206 06:29:39.581790 4733 generic.go:334] "Generic (PLEG): container finished" podID="463a1797-46a8-4a46-b14a-4f36f5708a60" containerID="97c3969f8100e7e44814ccd254a0ff03f1deaa78bad41a688d002cabc5eff547" exitCode=0 Dec 06 06:29:39 crc kubenswrapper[4733]: I1206 06:29:39.581848 4733 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-xmw44" event={"ID":"463a1797-46a8-4a46-b14a-4f36f5708a60","Type":"ContainerDied","Data":"97c3969f8100e7e44814ccd254a0ff03f1deaa78bad41a688d002cabc5eff547"} Dec 06 06:29:40 crc kubenswrapper[4733]: I1206 06:29:40.596550 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xmw44" event={"ID":"463a1797-46a8-4a46-b14a-4f36f5708a60","Type":"ContainerStarted","Data":"44e84cf636bb86240b560b21f70e618f1411e967618b90651d093924d4b06c11"} Dec 06 06:29:40 crc kubenswrapper[4733]: I1206 06:29:40.618805 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xmw44" podStartSLOduration=3.113269311 podStartE2EDuration="5.618786814s" podCreationTimestamp="2025-12-06 06:29:35 +0000 UTC" firstStartedPulling="2025-12-06 06:29:37.561429443 +0000 UTC m=+2761.426640555" lastFinishedPulling="2025-12-06 06:29:40.066946948 +0000 UTC m=+2763.932158058" observedRunningTime="2025-12-06 06:29:40.614937646 +0000 UTC m=+2764.480148758" watchObservedRunningTime="2025-12-06 06:29:40.618786814 +0000 UTC m=+2764.483997924" Dec 06 06:29:46 crc kubenswrapper[4733]: I1206 06:29:46.465210 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xmw44" Dec 06 06:29:46 crc kubenswrapper[4733]: I1206 06:29:46.466015 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xmw44" Dec 06 06:29:46 crc kubenswrapper[4733]: I1206 06:29:46.508706 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xmw44" Dec 06 06:29:46 crc kubenswrapper[4733]: I1206 06:29:46.695212 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xmw44" Dec 06 06:29:46 crc kubenswrapper[4733]: I1206 
06:29:46.741446 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xmw44"] Dec 06 06:29:48 crc kubenswrapper[4733]: I1206 06:29:48.675642 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xmw44" podUID="463a1797-46a8-4a46-b14a-4f36f5708a60" containerName="registry-server" containerID="cri-o://44e84cf636bb86240b560b21f70e618f1411e967618b90651d093924d4b06c11" gracePeriod=2 Dec 06 06:29:49 crc kubenswrapper[4733]: I1206 06:29:49.082220 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xmw44" Dec 06 06:29:49 crc kubenswrapper[4733]: I1206 06:29:49.113036 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dz2h2\" (UniqueName: \"kubernetes.io/projected/463a1797-46a8-4a46-b14a-4f36f5708a60-kube-api-access-dz2h2\") pod \"463a1797-46a8-4a46-b14a-4f36f5708a60\" (UID: \"463a1797-46a8-4a46-b14a-4f36f5708a60\") " Dec 06 06:29:49 crc kubenswrapper[4733]: I1206 06:29:49.113089 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/463a1797-46a8-4a46-b14a-4f36f5708a60-utilities\") pod \"463a1797-46a8-4a46-b14a-4f36f5708a60\" (UID: \"463a1797-46a8-4a46-b14a-4f36f5708a60\") " Dec 06 06:29:49 crc kubenswrapper[4733]: I1206 06:29:49.113136 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/463a1797-46a8-4a46-b14a-4f36f5708a60-catalog-content\") pod \"463a1797-46a8-4a46-b14a-4f36f5708a60\" (UID: \"463a1797-46a8-4a46-b14a-4f36f5708a60\") " Dec 06 06:29:49 crc kubenswrapper[4733]: I1206 06:29:49.117415 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/463a1797-46a8-4a46-b14a-4f36f5708a60-utilities" (OuterVolumeSpecName: 
"utilities") pod "463a1797-46a8-4a46-b14a-4f36f5708a60" (UID: "463a1797-46a8-4a46-b14a-4f36f5708a60"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:29:49 crc kubenswrapper[4733]: I1206 06:29:49.123711 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/463a1797-46a8-4a46-b14a-4f36f5708a60-kube-api-access-dz2h2" (OuterVolumeSpecName: "kube-api-access-dz2h2") pod "463a1797-46a8-4a46-b14a-4f36f5708a60" (UID: "463a1797-46a8-4a46-b14a-4f36f5708a60"). InnerVolumeSpecName "kube-api-access-dz2h2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:29:49 crc kubenswrapper[4733]: I1206 06:29:49.158812 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/463a1797-46a8-4a46-b14a-4f36f5708a60-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "463a1797-46a8-4a46-b14a-4f36f5708a60" (UID: "463a1797-46a8-4a46-b14a-4f36f5708a60"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:29:49 crc kubenswrapper[4733]: I1206 06:29:49.216421 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dz2h2\" (UniqueName: \"kubernetes.io/projected/463a1797-46a8-4a46-b14a-4f36f5708a60-kube-api-access-dz2h2\") on node \"crc\" DevicePath \"\"" Dec 06 06:29:49 crc kubenswrapper[4733]: I1206 06:29:49.216454 4733 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/463a1797-46a8-4a46-b14a-4f36f5708a60-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 06:29:49 crc kubenswrapper[4733]: I1206 06:29:49.216466 4733 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/463a1797-46a8-4a46-b14a-4f36f5708a60-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 06:29:49 crc kubenswrapper[4733]: I1206 06:29:49.689905 4733 generic.go:334] "Generic (PLEG): container finished" podID="463a1797-46a8-4a46-b14a-4f36f5708a60" containerID="44e84cf636bb86240b560b21f70e618f1411e967618b90651d093924d4b06c11" exitCode=0 Dec 06 06:29:49 crc kubenswrapper[4733]: I1206 06:29:49.689961 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xmw44" event={"ID":"463a1797-46a8-4a46-b14a-4f36f5708a60","Type":"ContainerDied","Data":"44e84cf636bb86240b560b21f70e618f1411e967618b90651d093924d4b06c11"} Dec 06 06:29:49 crc kubenswrapper[4733]: I1206 06:29:49.689975 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xmw44" Dec 06 06:29:49 crc kubenswrapper[4733]: I1206 06:29:49.690011 4733 scope.go:117] "RemoveContainer" containerID="44e84cf636bb86240b560b21f70e618f1411e967618b90651d093924d4b06c11" Dec 06 06:29:49 crc kubenswrapper[4733]: I1206 06:29:49.689997 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xmw44" event={"ID":"463a1797-46a8-4a46-b14a-4f36f5708a60","Type":"ContainerDied","Data":"6f07b42e27fd3e01baa0e403f6640210d783b3bc5fb3bc990edf32dc628464cc"} Dec 06 06:29:49 crc kubenswrapper[4733]: I1206 06:29:49.724565 4733 scope.go:117] "RemoveContainer" containerID="97c3969f8100e7e44814ccd254a0ff03f1deaa78bad41a688d002cabc5eff547" Dec 06 06:29:49 crc kubenswrapper[4733]: I1206 06:29:49.724776 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xmw44"] Dec 06 06:29:49 crc kubenswrapper[4733]: I1206 06:29:49.735767 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xmw44"] Dec 06 06:29:49 crc kubenswrapper[4733]: I1206 06:29:49.763167 4733 scope.go:117] "RemoveContainer" containerID="b214f662f7c8366b2b7edf7ae27bbfeb28424a605d2cf542cbf53c364c17331b" Dec 06 06:29:49 crc kubenswrapper[4733]: I1206 06:29:49.783646 4733 scope.go:117] "RemoveContainer" containerID="44e84cf636bb86240b560b21f70e618f1411e967618b90651d093924d4b06c11" Dec 06 06:29:49 crc kubenswrapper[4733]: E1206 06:29:49.784340 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44e84cf636bb86240b560b21f70e618f1411e967618b90651d093924d4b06c11\": container with ID starting with 44e84cf636bb86240b560b21f70e618f1411e967618b90651d093924d4b06c11 not found: ID does not exist" containerID="44e84cf636bb86240b560b21f70e618f1411e967618b90651d093924d4b06c11" Dec 06 06:29:49 crc kubenswrapper[4733]: I1206 06:29:49.784390 4733 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44e84cf636bb86240b560b21f70e618f1411e967618b90651d093924d4b06c11"} err="failed to get container status \"44e84cf636bb86240b560b21f70e618f1411e967618b90651d093924d4b06c11\": rpc error: code = NotFound desc = could not find container \"44e84cf636bb86240b560b21f70e618f1411e967618b90651d093924d4b06c11\": container with ID starting with 44e84cf636bb86240b560b21f70e618f1411e967618b90651d093924d4b06c11 not found: ID does not exist" Dec 06 06:29:49 crc kubenswrapper[4733]: I1206 06:29:49.784449 4733 scope.go:117] "RemoveContainer" containerID="97c3969f8100e7e44814ccd254a0ff03f1deaa78bad41a688d002cabc5eff547" Dec 06 06:29:49 crc kubenswrapper[4733]: E1206 06:29:49.784899 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97c3969f8100e7e44814ccd254a0ff03f1deaa78bad41a688d002cabc5eff547\": container with ID starting with 97c3969f8100e7e44814ccd254a0ff03f1deaa78bad41a688d002cabc5eff547 not found: ID does not exist" containerID="97c3969f8100e7e44814ccd254a0ff03f1deaa78bad41a688d002cabc5eff547" Dec 06 06:29:49 crc kubenswrapper[4733]: I1206 06:29:49.784936 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97c3969f8100e7e44814ccd254a0ff03f1deaa78bad41a688d002cabc5eff547"} err="failed to get container status \"97c3969f8100e7e44814ccd254a0ff03f1deaa78bad41a688d002cabc5eff547\": rpc error: code = NotFound desc = could not find container \"97c3969f8100e7e44814ccd254a0ff03f1deaa78bad41a688d002cabc5eff547\": container with ID starting with 97c3969f8100e7e44814ccd254a0ff03f1deaa78bad41a688d002cabc5eff547 not found: ID does not exist" Dec 06 06:29:49 crc kubenswrapper[4733]: I1206 06:29:49.784964 4733 scope.go:117] "RemoveContainer" containerID="b214f662f7c8366b2b7edf7ae27bbfeb28424a605d2cf542cbf53c364c17331b" Dec 06 06:29:49 crc kubenswrapper[4733]: E1206 
06:29:49.785430 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b214f662f7c8366b2b7edf7ae27bbfeb28424a605d2cf542cbf53c364c17331b\": container with ID starting with b214f662f7c8366b2b7edf7ae27bbfeb28424a605d2cf542cbf53c364c17331b not found: ID does not exist" containerID="b214f662f7c8366b2b7edf7ae27bbfeb28424a605d2cf542cbf53c364c17331b" Dec 06 06:29:49 crc kubenswrapper[4733]: I1206 06:29:49.785455 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b214f662f7c8366b2b7edf7ae27bbfeb28424a605d2cf542cbf53c364c17331b"} err="failed to get container status \"b214f662f7c8366b2b7edf7ae27bbfeb28424a605d2cf542cbf53c364c17331b\": rpc error: code = NotFound desc = could not find container \"b214f662f7c8366b2b7edf7ae27bbfeb28424a605d2cf542cbf53c364c17331b\": container with ID starting with b214f662f7c8366b2b7edf7ae27bbfeb28424a605d2cf542cbf53c364c17331b not found: ID does not exist" Dec 06 06:29:50 crc kubenswrapper[4733]: I1206 06:29:50.495819 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="463a1797-46a8-4a46-b14a-4f36f5708a60" path="/var/lib/kubelet/pods/463a1797-46a8-4a46-b14a-4f36f5708a60/volumes" Dec 06 06:30:00 crc kubenswrapper[4733]: I1206 06:30:00.135525 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416710-4gkfx"] Dec 06 06:30:00 crc kubenswrapper[4733]: E1206 06:30:00.136473 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="463a1797-46a8-4a46-b14a-4f36f5708a60" containerName="extract-content" Dec 06 06:30:00 crc kubenswrapper[4733]: I1206 06:30:00.136488 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="463a1797-46a8-4a46-b14a-4f36f5708a60" containerName="extract-content" Dec 06 06:30:00 crc kubenswrapper[4733]: E1206 06:30:00.136497 4733 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="463a1797-46a8-4a46-b14a-4f36f5708a60" containerName="registry-server" Dec 06 06:30:00 crc kubenswrapper[4733]: I1206 06:30:00.136503 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="463a1797-46a8-4a46-b14a-4f36f5708a60" containerName="registry-server" Dec 06 06:30:00 crc kubenswrapper[4733]: E1206 06:30:00.136511 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="463a1797-46a8-4a46-b14a-4f36f5708a60" containerName="extract-utilities" Dec 06 06:30:00 crc kubenswrapper[4733]: I1206 06:30:00.136518 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="463a1797-46a8-4a46-b14a-4f36f5708a60" containerName="extract-utilities" Dec 06 06:30:00 crc kubenswrapper[4733]: I1206 06:30:00.136680 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="463a1797-46a8-4a46-b14a-4f36f5708a60" containerName="registry-server" Dec 06 06:30:00 crc kubenswrapper[4733]: I1206 06:30:00.137288 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416710-4gkfx" Dec 06 06:30:00 crc kubenswrapper[4733]: I1206 06:30:00.139089 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 06 06:30:00 crc kubenswrapper[4733]: I1206 06:30:00.139408 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 06 06:30:00 crc kubenswrapper[4733]: I1206 06:30:00.146870 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416710-4gkfx"] Dec 06 06:30:00 crc kubenswrapper[4733]: I1206 06:30:00.234632 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8652z\" (UniqueName: \"kubernetes.io/projected/3aa872cc-c7bd-47a0-8d9d-d7d4cf1d7fec-kube-api-access-8652z\") pod 
\"collect-profiles-29416710-4gkfx\" (UID: \"3aa872cc-c7bd-47a0-8d9d-d7d4cf1d7fec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416710-4gkfx" Dec 06 06:30:00 crc kubenswrapper[4733]: I1206 06:30:00.234714 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3aa872cc-c7bd-47a0-8d9d-d7d4cf1d7fec-config-volume\") pod \"collect-profiles-29416710-4gkfx\" (UID: \"3aa872cc-c7bd-47a0-8d9d-d7d4cf1d7fec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416710-4gkfx" Dec 06 06:30:00 crc kubenswrapper[4733]: I1206 06:30:00.234915 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3aa872cc-c7bd-47a0-8d9d-d7d4cf1d7fec-secret-volume\") pod \"collect-profiles-29416710-4gkfx\" (UID: \"3aa872cc-c7bd-47a0-8d9d-d7d4cf1d7fec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416710-4gkfx" Dec 06 06:30:00 crc kubenswrapper[4733]: I1206 06:30:00.336781 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8652z\" (UniqueName: \"kubernetes.io/projected/3aa872cc-c7bd-47a0-8d9d-d7d4cf1d7fec-kube-api-access-8652z\") pod \"collect-profiles-29416710-4gkfx\" (UID: \"3aa872cc-c7bd-47a0-8d9d-d7d4cf1d7fec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416710-4gkfx" Dec 06 06:30:00 crc kubenswrapper[4733]: I1206 06:30:00.337085 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3aa872cc-c7bd-47a0-8d9d-d7d4cf1d7fec-config-volume\") pod \"collect-profiles-29416710-4gkfx\" (UID: \"3aa872cc-c7bd-47a0-8d9d-d7d4cf1d7fec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416710-4gkfx" Dec 06 06:30:00 crc kubenswrapper[4733]: I1206 06:30:00.337168 4733 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3aa872cc-c7bd-47a0-8d9d-d7d4cf1d7fec-secret-volume\") pod \"collect-profiles-29416710-4gkfx\" (UID: \"3aa872cc-c7bd-47a0-8d9d-d7d4cf1d7fec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416710-4gkfx" Dec 06 06:30:00 crc kubenswrapper[4733]: I1206 06:30:00.338043 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3aa872cc-c7bd-47a0-8d9d-d7d4cf1d7fec-config-volume\") pod \"collect-profiles-29416710-4gkfx\" (UID: \"3aa872cc-c7bd-47a0-8d9d-d7d4cf1d7fec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416710-4gkfx" Dec 06 06:30:00 crc kubenswrapper[4733]: I1206 06:30:00.342867 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3aa872cc-c7bd-47a0-8d9d-d7d4cf1d7fec-secret-volume\") pod \"collect-profiles-29416710-4gkfx\" (UID: \"3aa872cc-c7bd-47a0-8d9d-d7d4cf1d7fec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416710-4gkfx" Dec 06 06:30:00 crc kubenswrapper[4733]: I1206 06:30:00.351650 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8652z\" (UniqueName: \"kubernetes.io/projected/3aa872cc-c7bd-47a0-8d9d-d7d4cf1d7fec-kube-api-access-8652z\") pod \"collect-profiles-29416710-4gkfx\" (UID: \"3aa872cc-c7bd-47a0-8d9d-d7d4cf1d7fec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416710-4gkfx" Dec 06 06:30:00 crc kubenswrapper[4733]: I1206 06:30:00.456361 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416710-4gkfx" Dec 06 06:30:00 crc kubenswrapper[4733]: I1206 06:30:00.860123 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416710-4gkfx"] Dec 06 06:30:01 crc kubenswrapper[4733]: I1206 06:30:01.805424 4733 generic.go:334] "Generic (PLEG): container finished" podID="3aa872cc-c7bd-47a0-8d9d-d7d4cf1d7fec" containerID="7b8774e6764b1f1ec16e11f20ee607039726fdc5d6644f2f980a52504b2b9df9" exitCode=0 Dec 06 06:30:01 crc kubenswrapper[4733]: I1206 06:30:01.805529 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416710-4gkfx" event={"ID":"3aa872cc-c7bd-47a0-8d9d-d7d4cf1d7fec","Type":"ContainerDied","Data":"7b8774e6764b1f1ec16e11f20ee607039726fdc5d6644f2f980a52504b2b9df9"} Dec 06 06:30:01 crc kubenswrapper[4733]: I1206 06:30:01.805828 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416710-4gkfx" event={"ID":"3aa872cc-c7bd-47a0-8d9d-d7d4cf1d7fec","Type":"ContainerStarted","Data":"07f732650331fb137a9ab60bc1984b782efa473fe90c7f8a9c096d81ac3ce37e"} Dec 06 06:30:03 crc kubenswrapper[4733]: I1206 06:30:03.092889 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416710-4gkfx" Dec 06 06:30:03 crc kubenswrapper[4733]: I1206 06:30:03.202108 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3aa872cc-c7bd-47a0-8d9d-d7d4cf1d7fec-config-volume\") pod \"3aa872cc-c7bd-47a0-8d9d-d7d4cf1d7fec\" (UID: \"3aa872cc-c7bd-47a0-8d9d-d7d4cf1d7fec\") " Dec 06 06:30:03 crc kubenswrapper[4733]: I1206 06:30:03.202165 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8652z\" (UniqueName: \"kubernetes.io/projected/3aa872cc-c7bd-47a0-8d9d-d7d4cf1d7fec-kube-api-access-8652z\") pod \"3aa872cc-c7bd-47a0-8d9d-d7d4cf1d7fec\" (UID: \"3aa872cc-c7bd-47a0-8d9d-d7d4cf1d7fec\") " Dec 06 06:30:03 crc kubenswrapper[4733]: I1206 06:30:03.202225 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3aa872cc-c7bd-47a0-8d9d-d7d4cf1d7fec-secret-volume\") pod \"3aa872cc-c7bd-47a0-8d9d-d7d4cf1d7fec\" (UID: \"3aa872cc-c7bd-47a0-8d9d-d7d4cf1d7fec\") " Dec 06 06:30:03 crc kubenswrapper[4733]: I1206 06:30:03.203120 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3aa872cc-c7bd-47a0-8d9d-d7d4cf1d7fec-config-volume" (OuterVolumeSpecName: "config-volume") pod "3aa872cc-c7bd-47a0-8d9d-d7d4cf1d7fec" (UID: "3aa872cc-c7bd-47a0-8d9d-d7d4cf1d7fec"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:30:03 crc kubenswrapper[4733]: I1206 06:30:03.210085 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3aa872cc-c7bd-47a0-8d9d-d7d4cf1d7fec-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3aa872cc-c7bd-47a0-8d9d-d7d4cf1d7fec" (UID: "3aa872cc-c7bd-47a0-8d9d-d7d4cf1d7fec"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:30:03 crc kubenswrapper[4733]: I1206 06:30:03.210157 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3aa872cc-c7bd-47a0-8d9d-d7d4cf1d7fec-kube-api-access-8652z" (OuterVolumeSpecName: "kube-api-access-8652z") pod "3aa872cc-c7bd-47a0-8d9d-d7d4cf1d7fec" (UID: "3aa872cc-c7bd-47a0-8d9d-d7d4cf1d7fec"). InnerVolumeSpecName "kube-api-access-8652z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:30:03 crc kubenswrapper[4733]: I1206 06:30:03.306388 4733 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3aa872cc-c7bd-47a0-8d9d-d7d4cf1d7fec-config-volume\") on node \"crc\" DevicePath \"\"" Dec 06 06:30:03 crc kubenswrapper[4733]: I1206 06:30:03.306438 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8652z\" (UniqueName: \"kubernetes.io/projected/3aa872cc-c7bd-47a0-8d9d-d7d4cf1d7fec-kube-api-access-8652z\") on node \"crc\" DevicePath \"\"" Dec 06 06:30:03 crc kubenswrapper[4733]: I1206 06:30:03.306453 4733 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3aa872cc-c7bd-47a0-8d9d-d7d4cf1d7fec-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 06 06:30:03 crc kubenswrapper[4733]: I1206 06:30:03.824205 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416710-4gkfx" event={"ID":"3aa872cc-c7bd-47a0-8d9d-d7d4cf1d7fec","Type":"ContainerDied","Data":"07f732650331fb137a9ab60bc1984b782efa473fe90c7f8a9c096d81ac3ce37e"} Dec 06 06:30:03 crc kubenswrapper[4733]: I1206 06:30:03.824266 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07f732650331fb137a9ab60bc1984b782efa473fe90c7f8a9c096d81ac3ce37e" Dec 06 06:30:03 crc kubenswrapper[4733]: I1206 06:30:03.824290 4733 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416710-4gkfx" Dec 06 06:30:04 crc kubenswrapper[4733]: I1206 06:30:04.162762 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416665-87xkl"] Dec 06 06:30:04 crc kubenswrapper[4733]: I1206 06:30:04.173773 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416665-87xkl"] Dec 06 06:30:04 crc kubenswrapper[4733]: I1206 06:30:04.496653 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ed0db75-d198-42d8-ac27-91145205f42c" path="/var/lib/kubelet/pods/1ed0db75-d198-42d8-ac27-91145205f42c/volumes" Dec 06 06:30:12 crc kubenswrapper[4733]: I1206 06:30:12.620558 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dvvmx"] Dec 06 06:30:12 crc kubenswrapper[4733]: E1206 06:30:12.621538 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3aa872cc-c7bd-47a0-8d9d-d7d4cf1d7fec" containerName="collect-profiles" Dec 06 06:30:12 crc kubenswrapper[4733]: I1206 06:30:12.621551 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="3aa872cc-c7bd-47a0-8d9d-d7d4cf1d7fec" containerName="collect-profiles" Dec 06 06:30:12 crc kubenswrapper[4733]: I1206 06:30:12.621742 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="3aa872cc-c7bd-47a0-8d9d-d7d4cf1d7fec" containerName="collect-profiles" Dec 06 06:30:12 crc kubenswrapper[4733]: I1206 06:30:12.623006 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dvvmx" Dec 06 06:30:12 crc kubenswrapper[4733]: I1206 06:30:12.631743 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dvvmx"] Dec 06 06:30:12 crc kubenswrapper[4733]: I1206 06:30:12.788802 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a533cf3-b991-4ed8-97a1-1e58d7006064-catalog-content\") pod \"redhat-marketplace-dvvmx\" (UID: \"0a533cf3-b991-4ed8-97a1-1e58d7006064\") " pod="openshift-marketplace/redhat-marketplace-dvvmx" Dec 06 06:30:12 crc kubenswrapper[4733]: I1206 06:30:12.788844 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vchb\" (UniqueName: \"kubernetes.io/projected/0a533cf3-b991-4ed8-97a1-1e58d7006064-kube-api-access-5vchb\") pod \"redhat-marketplace-dvvmx\" (UID: \"0a533cf3-b991-4ed8-97a1-1e58d7006064\") " pod="openshift-marketplace/redhat-marketplace-dvvmx" Dec 06 06:30:12 crc kubenswrapper[4733]: I1206 06:30:12.789202 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a533cf3-b991-4ed8-97a1-1e58d7006064-utilities\") pod \"redhat-marketplace-dvvmx\" (UID: \"0a533cf3-b991-4ed8-97a1-1e58d7006064\") " pod="openshift-marketplace/redhat-marketplace-dvvmx" Dec 06 06:30:12 crc kubenswrapper[4733]: I1206 06:30:12.891056 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a533cf3-b991-4ed8-97a1-1e58d7006064-utilities\") pod \"redhat-marketplace-dvvmx\" (UID: \"0a533cf3-b991-4ed8-97a1-1e58d7006064\") " pod="openshift-marketplace/redhat-marketplace-dvvmx" Dec 06 06:30:12 crc kubenswrapper[4733]: I1206 06:30:12.891206 4733 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a533cf3-b991-4ed8-97a1-1e58d7006064-catalog-content\") pod \"redhat-marketplace-dvvmx\" (UID: \"0a533cf3-b991-4ed8-97a1-1e58d7006064\") " pod="openshift-marketplace/redhat-marketplace-dvvmx" Dec 06 06:30:12 crc kubenswrapper[4733]: I1206 06:30:12.891231 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vchb\" (UniqueName: \"kubernetes.io/projected/0a533cf3-b991-4ed8-97a1-1e58d7006064-kube-api-access-5vchb\") pod \"redhat-marketplace-dvvmx\" (UID: \"0a533cf3-b991-4ed8-97a1-1e58d7006064\") " pod="openshift-marketplace/redhat-marketplace-dvvmx" Dec 06 06:30:12 crc kubenswrapper[4733]: I1206 06:30:12.891546 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a533cf3-b991-4ed8-97a1-1e58d7006064-utilities\") pod \"redhat-marketplace-dvvmx\" (UID: \"0a533cf3-b991-4ed8-97a1-1e58d7006064\") " pod="openshift-marketplace/redhat-marketplace-dvvmx" Dec 06 06:30:12 crc kubenswrapper[4733]: I1206 06:30:12.891718 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a533cf3-b991-4ed8-97a1-1e58d7006064-catalog-content\") pod \"redhat-marketplace-dvvmx\" (UID: \"0a533cf3-b991-4ed8-97a1-1e58d7006064\") " pod="openshift-marketplace/redhat-marketplace-dvvmx" Dec 06 06:30:12 crc kubenswrapper[4733]: I1206 06:30:12.908916 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vchb\" (UniqueName: \"kubernetes.io/projected/0a533cf3-b991-4ed8-97a1-1e58d7006064-kube-api-access-5vchb\") pod \"redhat-marketplace-dvvmx\" (UID: \"0a533cf3-b991-4ed8-97a1-1e58d7006064\") " pod="openshift-marketplace/redhat-marketplace-dvvmx" Dec 06 06:30:12 crc kubenswrapper[4733]: I1206 06:30:12.940815 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dvvmx" Dec 06 06:30:13 crc kubenswrapper[4733]: I1206 06:30:13.361987 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dvvmx"] Dec 06 06:30:13 crc kubenswrapper[4733]: I1206 06:30:13.929168 4733 generic.go:334] "Generic (PLEG): container finished" podID="0a533cf3-b991-4ed8-97a1-1e58d7006064" containerID="8a35ece2aa4c3802da494cfadb038c6f896db4558cf280b217a550c7c7b3b08f" exitCode=0 Dec 06 06:30:13 crc kubenswrapper[4733]: I1206 06:30:13.929278 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dvvmx" event={"ID":"0a533cf3-b991-4ed8-97a1-1e58d7006064","Type":"ContainerDied","Data":"8a35ece2aa4c3802da494cfadb038c6f896db4558cf280b217a550c7c7b3b08f"} Dec 06 06:30:13 crc kubenswrapper[4733]: I1206 06:30:13.929528 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dvvmx" event={"ID":"0a533cf3-b991-4ed8-97a1-1e58d7006064","Type":"ContainerStarted","Data":"265ffebc9f3a896cce269feba57b175385d5a2cf5254f319c28139f7a9191061"} Dec 06 06:30:14 crc kubenswrapper[4733]: I1206 06:30:14.942879 4733 generic.go:334] "Generic (PLEG): container finished" podID="0a533cf3-b991-4ed8-97a1-1e58d7006064" containerID="0d4a5e72a417af9ff61107b9cc7102a2959acde4af818792a81325f551be8dd5" exitCode=0 Dec 06 06:30:14 crc kubenswrapper[4733]: I1206 06:30:14.942974 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dvvmx" event={"ID":"0a533cf3-b991-4ed8-97a1-1e58d7006064","Type":"ContainerDied","Data":"0d4a5e72a417af9ff61107b9cc7102a2959acde4af818792a81325f551be8dd5"} Dec 06 06:30:15 crc kubenswrapper[4733]: I1206 06:30:15.956276 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dvvmx" 
event={"ID":"0a533cf3-b991-4ed8-97a1-1e58d7006064","Type":"ContainerStarted","Data":"0b6497dd36e08b5e31e32949cb3fe1fdec58505ffca9be6e46b1d7cde035720d"} Dec 06 06:30:15 crc kubenswrapper[4733]: I1206 06:30:15.980842 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dvvmx" podStartSLOduration=2.461586709 podStartE2EDuration="3.980824369s" podCreationTimestamp="2025-12-06 06:30:12 +0000 UTC" firstStartedPulling="2025-12-06 06:30:13.930868726 +0000 UTC m=+2797.796079836" lastFinishedPulling="2025-12-06 06:30:15.450106385 +0000 UTC m=+2799.315317496" observedRunningTime="2025-12-06 06:30:15.971970403 +0000 UTC m=+2799.837181513" watchObservedRunningTime="2025-12-06 06:30:15.980824369 +0000 UTC m=+2799.846035480" Dec 06 06:30:22 crc kubenswrapper[4733]: I1206 06:30:22.941185 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dvvmx" Dec 06 06:30:22 crc kubenswrapper[4733]: I1206 06:30:22.941858 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dvvmx" Dec 06 06:30:22 crc kubenswrapper[4733]: I1206 06:30:22.978838 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dvvmx" Dec 06 06:30:23 crc kubenswrapper[4733]: I1206 06:30:23.052211 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dvvmx" Dec 06 06:30:23 crc kubenswrapper[4733]: I1206 06:30:23.214416 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dvvmx"] Dec 06 06:30:25 crc kubenswrapper[4733]: I1206 06:30:25.034642 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dvvmx" podUID="0a533cf3-b991-4ed8-97a1-1e58d7006064" containerName="registry-server" 
containerID="cri-o://0b6497dd36e08b5e31e32949cb3fe1fdec58505ffca9be6e46b1d7cde035720d" gracePeriod=2 Dec 06 06:30:25 crc kubenswrapper[4733]: I1206 06:30:25.440478 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dvvmx" Dec 06 06:30:25 crc kubenswrapper[4733]: I1206 06:30:25.548099 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vchb\" (UniqueName: \"kubernetes.io/projected/0a533cf3-b991-4ed8-97a1-1e58d7006064-kube-api-access-5vchb\") pod \"0a533cf3-b991-4ed8-97a1-1e58d7006064\" (UID: \"0a533cf3-b991-4ed8-97a1-1e58d7006064\") " Dec 06 06:30:25 crc kubenswrapper[4733]: I1206 06:30:25.548475 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a533cf3-b991-4ed8-97a1-1e58d7006064-utilities\") pod \"0a533cf3-b991-4ed8-97a1-1e58d7006064\" (UID: \"0a533cf3-b991-4ed8-97a1-1e58d7006064\") " Dec 06 06:30:25 crc kubenswrapper[4733]: I1206 06:30:25.548558 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a533cf3-b991-4ed8-97a1-1e58d7006064-catalog-content\") pod \"0a533cf3-b991-4ed8-97a1-1e58d7006064\" (UID: \"0a533cf3-b991-4ed8-97a1-1e58d7006064\") " Dec 06 06:30:25 crc kubenswrapper[4733]: I1206 06:30:25.549174 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a533cf3-b991-4ed8-97a1-1e58d7006064-utilities" (OuterVolumeSpecName: "utilities") pod "0a533cf3-b991-4ed8-97a1-1e58d7006064" (UID: "0a533cf3-b991-4ed8-97a1-1e58d7006064"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:30:25 crc kubenswrapper[4733]: I1206 06:30:25.549420 4733 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a533cf3-b991-4ed8-97a1-1e58d7006064-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 06:30:25 crc kubenswrapper[4733]: I1206 06:30:25.556442 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a533cf3-b991-4ed8-97a1-1e58d7006064-kube-api-access-5vchb" (OuterVolumeSpecName: "kube-api-access-5vchb") pod "0a533cf3-b991-4ed8-97a1-1e58d7006064" (UID: "0a533cf3-b991-4ed8-97a1-1e58d7006064"). InnerVolumeSpecName "kube-api-access-5vchb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:30:25 crc kubenswrapper[4733]: I1206 06:30:25.566676 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a533cf3-b991-4ed8-97a1-1e58d7006064-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0a533cf3-b991-4ed8-97a1-1e58d7006064" (UID: "0a533cf3-b991-4ed8-97a1-1e58d7006064"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:30:25 crc kubenswrapper[4733]: I1206 06:30:25.651891 4733 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a533cf3-b991-4ed8-97a1-1e58d7006064-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 06:30:25 crc kubenswrapper[4733]: I1206 06:30:25.651935 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vchb\" (UniqueName: \"kubernetes.io/projected/0a533cf3-b991-4ed8-97a1-1e58d7006064-kube-api-access-5vchb\") on node \"crc\" DevicePath \"\"" Dec 06 06:30:26 crc kubenswrapper[4733]: I1206 06:30:26.046370 4733 generic.go:334] "Generic (PLEG): container finished" podID="0a533cf3-b991-4ed8-97a1-1e58d7006064" containerID="0b6497dd36e08b5e31e32949cb3fe1fdec58505ffca9be6e46b1d7cde035720d" exitCode=0 Dec 06 06:30:26 crc kubenswrapper[4733]: I1206 06:30:26.046430 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dvvmx" event={"ID":"0a533cf3-b991-4ed8-97a1-1e58d7006064","Type":"ContainerDied","Data":"0b6497dd36e08b5e31e32949cb3fe1fdec58505ffca9be6e46b1d7cde035720d"} Dec 06 06:30:26 crc kubenswrapper[4733]: I1206 06:30:26.046493 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dvvmx" event={"ID":"0a533cf3-b991-4ed8-97a1-1e58d7006064","Type":"ContainerDied","Data":"265ffebc9f3a896cce269feba57b175385d5a2cf5254f319c28139f7a9191061"} Dec 06 06:30:26 crc kubenswrapper[4733]: I1206 06:30:26.046517 4733 scope.go:117] "RemoveContainer" containerID="0b6497dd36e08b5e31e32949cb3fe1fdec58505ffca9be6e46b1d7cde035720d" Dec 06 06:30:26 crc kubenswrapper[4733]: I1206 06:30:26.046512 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dvvmx" Dec 06 06:30:26 crc kubenswrapper[4733]: I1206 06:30:26.067594 4733 scope.go:117] "RemoveContainer" containerID="0d4a5e72a417af9ff61107b9cc7102a2959acde4af818792a81325f551be8dd5" Dec 06 06:30:26 crc kubenswrapper[4733]: I1206 06:30:26.079688 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dvvmx"] Dec 06 06:30:26 crc kubenswrapper[4733]: I1206 06:30:26.088338 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dvvmx"] Dec 06 06:30:26 crc kubenswrapper[4733]: I1206 06:30:26.097236 4733 scope.go:117] "RemoveContainer" containerID="8a35ece2aa4c3802da494cfadb038c6f896db4558cf280b217a550c7c7b3b08f" Dec 06 06:30:26 crc kubenswrapper[4733]: I1206 06:30:26.130171 4733 scope.go:117] "RemoveContainer" containerID="0b6497dd36e08b5e31e32949cb3fe1fdec58505ffca9be6e46b1d7cde035720d" Dec 06 06:30:26 crc kubenswrapper[4733]: E1206 06:30:26.130748 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b6497dd36e08b5e31e32949cb3fe1fdec58505ffca9be6e46b1d7cde035720d\": container with ID starting with 0b6497dd36e08b5e31e32949cb3fe1fdec58505ffca9be6e46b1d7cde035720d not found: ID does not exist" containerID="0b6497dd36e08b5e31e32949cb3fe1fdec58505ffca9be6e46b1d7cde035720d" Dec 06 06:30:26 crc kubenswrapper[4733]: I1206 06:30:26.130809 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b6497dd36e08b5e31e32949cb3fe1fdec58505ffca9be6e46b1d7cde035720d"} err="failed to get container status \"0b6497dd36e08b5e31e32949cb3fe1fdec58505ffca9be6e46b1d7cde035720d\": rpc error: code = NotFound desc = could not find container \"0b6497dd36e08b5e31e32949cb3fe1fdec58505ffca9be6e46b1d7cde035720d\": container with ID starting with 0b6497dd36e08b5e31e32949cb3fe1fdec58505ffca9be6e46b1d7cde035720d not found: 
ID does not exist" Dec 06 06:30:26 crc kubenswrapper[4733]: I1206 06:30:26.130846 4733 scope.go:117] "RemoveContainer" containerID="0d4a5e72a417af9ff61107b9cc7102a2959acde4af818792a81325f551be8dd5" Dec 06 06:30:26 crc kubenswrapper[4733]: E1206 06:30:26.131177 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d4a5e72a417af9ff61107b9cc7102a2959acde4af818792a81325f551be8dd5\": container with ID starting with 0d4a5e72a417af9ff61107b9cc7102a2959acde4af818792a81325f551be8dd5 not found: ID does not exist" containerID="0d4a5e72a417af9ff61107b9cc7102a2959acde4af818792a81325f551be8dd5" Dec 06 06:30:26 crc kubenswrapper[4733]: I1206 06:30:26.131197 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d4a5e72a417af9ff61107b9cc7102a2959acde4af818792a81325f551be8dd5"} err="failed to get container status \"0d4a5e72a417af9ff61107b9cc7102a2959acde4af818792a81325f551be8dd5\": rpc error: code = NotFound desc = could not find container \"0d4a5e72a417af9ff61107b9cc7102a2959acde4af818792a81325f551be8dd5\": container with ID starting with 0d4a5e72a417af9ff61107b9cc7102a2959acde4af818792a81325f551be8dd5 not found: ID does not exist" Dec 06 06:30:26 crc kubenswrapper[4733]: I1206 06:30:26.131211 4733 scope.go:117] "RemoveContainer" containerID="8a35ece2aa4c3802da494cfadb038c6f896db4558cf280b217a550c7c7b3b08f" Dec 06 06:30:26 crc kubenswrapper[4733]: E1206 06:30:26.131567 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a35ece2aa4c3802da494cfadb038c6f896db4558cf280b217a550c7c7b3b08f\": container with ID starting with 8a35ece2aa4c3802da494cfadb038c6f896db4558cf280b217a550c7c7b3b08f not found: ID does not exist" containerID="8a35ece2aa4c3802da494cfadb038c6f896db4558cf280b217a550c7c7b3b08f" Dec 06 06:30:26 crc kubenswrapper[4733]: I1206 06:30:26.131590 4733 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a35ece2aa4c3802da494cfadb038c6f896db4558cf280b217a550c7c7b3b08f"} err="failed to get container status \"8a35ece2aa4c3802da494cfadb038c6f896db4558cf280b217a550c7c7b3b08f\": rpc error: code = NotFound desc = could not find container \"8a35ece2aa4c3802da494cfadb038c6f896db4558cf280b217a550c7c7b3b08f\": container with ID starting with 8a35ece2aa4c3802da494cfadb038c6f896db4558cf280b217a550c7c7b3b08f not found: ID does not exist" Dec 06 06:30:26 crc kubenswrapper[4733]: I1206 06:30:26.494359 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a533cf3-b991-4ed8-97a1-1e58d7006064" path="/var/lib/kubelet/pods/0a533cf3-b991-4ed8-97a1-1e58d7006064/volumes" Dec 06 06:30:52 crc kubenswrapper[4733]: I1206 06:30:52.839123 4733 scope.go:117] "RemoveContainer" containerID="281433015c50230243bb4612f9bf0ac2baf19caea7ec2458143f93de4f94a72a" Dec 06 06:31:12 crc kubenswrapper[4733]: I1206 06:31:12.989049 4733 patch_prober.go:28] interesting pod/machine-config-daemon-g7qjx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 06:31:12 crc kubenswrapper[4733]: I1206 06:31:12.989800 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 06:31:42 crc kubenswrapper[4733]: I1206 06:31:42.989780 4733 patch_prober.go:28] interesting pod/machine-config-daemon-g7qjx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Dec 06 06:31:42 crc kubenswrapper[4733]: I1206 06:31:42.990450 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 06:32:07 crc kubenswrapper[4733]: I1206 06:32:07.361994 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pgm57"] Dec 06 06:32:07 crc kubenswrapper[4733]: E1206 06:32:07.362897 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a533cf3-b991-4ed8-97a1-1e58d7006064" containerName="extract-utilities" Dec 06 06:32:07 crc kubenswrapper[4733]: I1206 06:32:07.362912 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a533cf3-b991-4ed8-97a1-1e58d7006064" containerName="extract-utilities" Dec 06 06:32:07 crc kubenswrapper[4733]: E1206 06:32:07.362931 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a533cf3-b991-4ed8-97a1-1e58d7006064" containerName="extract-content" Dec 06 06:32:07 crc kubenswrapper[4733]: I1206 06:32:07.362936 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a533cf3-b991-4ed8-97a1-1e58d7006064" containerName="extract-content" Dec 06 06:32:07 crc kubenswrapper[4733]: E1206 06:32:07.362971 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a533cf3-b991-4ed8-97a1-1e58d7006064" containerName="registry-server" Dec 06 06:32:07 crc kubenswrapper[4733]: I1206 06:32:07.362978 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a533cf3-b991-4ed8-97a1-1e58d7006064" containerName="registry-server" Dec 06 06:32:07 crc kubenswrapper[4733]: I1206 06:32:07.363169 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a533cf3-b991-4ed8-97a1-1e58d7006064" containerName="registry-server" Dec 06 
06:32:07 crc kubenswrapper[4733]: I1206 06:32:07.364508 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pgm57" Dec 06 06:32:07 crc kubenswrapper[4733]: I1206 06:32:07.378168 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pgm57"] Dec 06 06:32:07 crc kubenswrapper[4733]: I1206 06:32:07.428518 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/229e887b-d4b7-4c4b-bc32-7bf11c23ef4b-catalog-content\") pod \"redhat-operators-pgm57\" (UID: \"229e887b-d4b7-4c4b-bc32-7bf11c23ef4b\") " pod="openshift-marketplace/redhat-operators-pgm57" Dec 06 06:32:07 crc kubenswrapper[4733]: I1206 06:32:07.428629 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/229e887b-d4b7-4c4b-bc32-7bf11c23ef4b-utilities\") pod \"redhat-operators-pgm57\" (UID: \"229e887b-d4b7-4c4b-bc32-7bf11c23ef4b\") " pod="openshift-marketplace/redhat-operators-pgm57" Dec 06 06:32:07 crc kubenswrapper[4733]: I1206 06:32:07.428679 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wr4m\" (UniqueName: \"kubernetes.io/projected/229e887b-d4b7-4c4b-bc32-7bf11c23ef4b-kube-api-access-7wr4m\") pod \"redhat-operators-pgm57\" (UID: \"229e887b-d4b7-4c4b-bc32-7bf11c23ef4b\") " pod="openshift-marketplace/redhat-operators-pgm57" Dec 06 06:32:07 crc kubenswrapper[4733]: I1206 06:32:07.530460 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/229e887b-d4b7-4c4b-bc32-7bf11c23ef4b-utilities\") pod \"redhat-operators-pgm57\" (UID: \"229e887b-d4b7-4c4b-bc32-7bf11c23ef4b\") " pod="openshift-marketplace/redhat-operators-pgm57" Dec 06 06:32:07 crc 
kubenswrapper[4733]: I1206 06:32:07.530522 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wr4m\" (UniqueName: \"kubernetes.io/projected/229e887b-d4b7-4c4b-bc32-7bf11c23ef4b-kube-api-access-7wr4m\") pod \"redhat-operators-pgm57\" (UID: \"229e887b-d4b7-4c4b-bc32-7bf11c23ef4b\") " pod="openshift-marketplace/redhat-operators-pgm57" Dec 06 06:32:07 crc kubenswrapper[4733]: I1206 06:32:07.530657 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/229e887b-d4b7-4c4b-bc32-7bf11c23ef4b-catalog-content\") pod \"redhat-operators-pgm57\" (UID: \"229e887b-d4b7-4c4b-bc32-7bf11c23ef4b\") " pod="openshift-marketplace/redhat-operators-pgm57" Dec 06 06:32:07 crc kubenswrapper[4733]: I1206 06:32:07.531426 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/229e887b-d4b7-4c4b-bc32-7bf11c23ef4b-catalog-content\") pod \"redhat-operators-pgm57\" (UID: \"229e887b-d4b7-4c4b-bc32-7bf11c23ef4b\") " pod="openshift-marketplace/redhat-operators-pgm57" Dec 06 06:32:07 crc kubenswrapper[4733]: I1206 06:32:07.531472 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/229e887b-d4b7-4c4b-bc32-7bf11c23ef4b-utilities\") pod \"redhat-operators-pgm57\" (UID: \"229e887b-d4b7-4c4b-bc32-7bf11c23ef4b\") " pod="openshift-marketplace/redhat-operators-pgm57" Dec 06 06:32:07 crc kubenswrapper[4733]: I1206 06:32:07.553196 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wr4m\" (UniqueName: \"kubernetes.io/projected/229e887b-d4b7-4c4b-bc32-7bf11c23ef4b-kube-api-access-7wr4m\") pod \"redhat-operators-pgm57\" (UID: \"229e887b-d4b7-4c4b-bc32-7bf11c23ef4b\") " pod="openshift-marketplace/redhat-operators-pgm57" Dec 06 06:32:07 crc kubenswrapper[4733]: I1206 06:32:07.682026 4733 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pgm57" Dec 06 06:32:08 crc kubenswrapper[4733]: I1206 06:32:08.115835 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pgm57"] Dec 06 06:32:08 crc kubenswrapper[4733]: I1206 06:32:08.994231 4733 generic.go:334] "Generic (PLEG): container finished" podID="229e887b-d4b7-4c4b-bc32-7bf11c23ef4b" containerID="2cc88b46554bb25c97d443fe5a0fa9bdfce419bc393e66dca99f6d9e8b1e12d2" exitCode=0 Dec 06 06:32:08 crc kubenswrapper[4733]: I1206 06:32:08.994366 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pgm57" event={"ID":"229e887b-d4b7-4c4b-bc32-7bf11c23ef4b","Type":"ContainerDied","Data":"2cc88b46554bb25c97d443fe5a0fa9bdfce419bc393e66dca99f6d9e8b1e12d2"} Dec 06 06:32:08 crc kubenswrapper[4733]: I1206 06:32:08.994724 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pgm57" event={"ID":"229e887b-d4b7-4c4b-bc32-7bf11c23ef4b","Type":"ContainerStarted","Data":"f3985a189e731cf9e6e621b4c3c86ba15e72564e2ae7649ae0bc977b5b1bc7ca"} Dec 06 06:32:10 crc kubenswrapper[4733]: I1206 06:32:10.003128 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pgm57" event={"ID":"229e887b-d4b7-4c4b-bc32-7bf11c23ef4b","Type":"ContainerStarted","Data":"6410aa032e0a6263a8845e3cf4d895a9f11308b599a2527613f763dbc9a3b359"} Dec 06 06:32:12 crc kubenswrapper[4733]: I1206 06:32:12.024090 4733 generic.go:334] "Generic (PLEG): container finished" podID="229e887b-d4b7-4c4b-bc32-7bf11c23ef4b" containerID="6410aa032e0a6263a8845e3cf4d895a9f11308b599a2527613f763dbc9a3b359" exitCode=0 Dec 06 06:32:12 crc kubenswrapper[4733]: I1206 06:32:12.024199 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pgm57" 
event={"ID":"229e887b-d4b7-4c4b-bc32-7bf11c23ef4b","Type":"ContainerDied","Data":"6410aa032e0a6263a8845e3cf4d895a9f11308b599a2527613f763dbc9a3b359"} Dec 06 06:32:12 crc kubenswrapper[4733]: I1206 06:32:12.989850 4733 patch_prober.go:28] interesting pod/machine-config-daemon-g7qjx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 06:32:12 crc kubenswrapper[4733]: I1206 06:32:12.990431 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 06:32:12 crc kubenswrapper[4733]: I1206 06:32:12.990485 4733 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" Dec 06 06:32:12 crc kubenswrapper[4733]: I1206 06:32:12.991187 4733 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"89fe3837db40c5ded85d4685d49c79313ccdd8984a9c4dfbcd4fc13fbc674215"} pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 06:32:12 crc kubenswrapper[4733]: I1206 06:32:12.991241 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerName="machine-config-daemon" containerID="cri-o://89fe3837db40c5ded85d4685d49c79313ccdd8984a9c4dfbcd4fc13fbc674215" gracePeriod=600 Dec 06 06:32:13 crc kubenswrapper[4733]: I1206 06:32:13.034603 4733 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pgm57" event={"ID":"229e887b-d4b7-4c4b-bc32-7bf11c23ef4b","Type":"ContainerStarted","Data":"4977edfd8d30a1e9ec61debc0d391c1009e4596c2539679526ff14a5a604c794"} Dec 06 06:32:13 crc kubenswrapper[4733]: I1206 06:32:13.059323 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pgm57" podStartSLOduration=2.563496573 podStartE2EDuration="6.059288011s" podCreationTimestamp="2025-12-06 06:32:07 +0000 UTC" firstStartedPulling="2025-12-06 06:32:08.996390156 +0000 UTC m=+2912.861601266" lastFinishedPulling="2025-12-06 06:32:12.492181594 +0000 UTC m=+2916.357392704" observedRunningTime="2025-12-06 06:32:13.053271019 +0000 UTC m=+2916.918482129" watchObservedRunningTime="2025-12-06 06:32:13.059288011 +0000 UTC m=+2916.924499121" Dec 06 06:32:14 crc kubenswrapper[4733]: I1206 06:32:14.049060 4733 generic.go:334] "Generic (PLEG): container finished" podID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerID="89fe3837db40c5ded85d4685d49c79313ccdd8984a9c4dfbcd4fc13fbc674215" exitCode=0 Dec 06 06:32:14 crc kubenswrapper[4733]: I1206 06:32:14.049140 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" event={"ID":"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448","Type":"ContainerDied","Data":"89fe3837db40c5ded85d4685d49c79313ccdd8984a9c4dfbcd4fc13fbc674215"} Dec 06 06:32:14 crc kubenswrapper[4733]: I1206 06:32:14.049689 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" event={"ID":"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448","Type":"ContainerStarted","Data":"0e2824d80f32f689599ffbc13b3712ce74df00c0a5b8b8663ecaa737779a27f1"} Dec 06 06:32:14 crc kubenswrapper[4733]: I1206 06:32:14.049716 4733 scope.go:117] "RemoveContainer" containerID="9ce999f28ba3ed207346dfc647ce3d354abbc70fd027614f9498a4b9f7a14f84" Dec 06 
06:32:17 crc kubenswrapper[4733]: I1206 06:32:17.682153 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pgm57" Dec 06 06:32:17 crc kubenswrapper[4733]: I1206 06:32:17.682811 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pgm57" Dec 06 06:32:18 crc kubenswrapper[4733]: I1206 06:32:18.725794 4733 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-pgm57" podUID="229e887b-d4b7-4c4b-bc32-7bf11c23ef4b" containerName="registry-server" probeResult="failure" output=< Dec 06 06:32:18 crc kubenswrapper[4733]: timeout: failed to connect service ":50051" within 1s Dec 06 06:32:18 crc kubenswrapper[4733]: > Dec 06 06:32:27 crc kubenswrapper[4733]: I1206 06:32:27.726807 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pgm57" Dec 06 06:32:27 crc kubenswrapper[4733]: I1206 06:32:27.770930 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pgm57" Dec 06 06:32:27 crc kubenswrapper[4733]: I1206 06:32:27.964192 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pgm57"] Dec 06 06:32:29 crc kubenswrapper[4733]: I1206 06:32:29.202450 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pgm57" podUID="229e887b-d4b7-4c4b-bc32-7bf11c23ef4b" containerName="registry-server" containerID="cri-o://4977edfd8d30a1e9ec61debc0d391c1009e4596c2539679526ff14a5a604c794" gracePeriod=2 Dec 06 06:32:29 crc kubenswrapper[4733]: I1206 06:32:29.621831 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pgm57" Dec 06 06:32:29 crc kubenswrapper[4733]: I1206 06:32:29.783326 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/229e887b-d4b7-4c4b-bc32-7bf11c23ef4b-utilities\") pod \"229e887b-d4b7-4c4b-bc32-7bf11c23ef4b\" (UID: \"229e887b-d4b7-4c4b-bc32-7bf11c23ef4b\") " Dec 06 06:32:29 crc kubenswrapper[4733]: I1206 06:32:29.783637 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wr4m\" (UniqueName: \"kubernetes.io/projected/229e887b-d4b7-4c4b-bc32-7bf11c23ef4b-kube-api-access-7wr4m\") pod \"229e887b-d4b7-4c4b-bc32-7bf11c23ef4b\" (UID: \"229e887b-d4b7-4c4b-bc32-7bf11c23ef4b\") " Dec 06 06:32:29 crc kubenswrapper[4733]: I1206 06:32:29.783821 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/229e887b-d4b7-4c4b-bc32-7bf11c23ef4b-catalog-content\") pod \"229e887b-d4b7-4c4b-bc32-7bf11c23ef4b\" (UID: \"229e887b-d4b7-4c4b-bc32-7bf11c23ef4b\") " Dec 06 06:32:29 crc kubenswrapper[4733]: I1206 06:32:29.784296 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/229e887b-d4b7-4c4b-bc32-7bf11c23ef4b-utilities" (OuterVolumeSpecName: "utilities") pod "229e887b-d4b7-4c4b-bc32-7bf11c23ef4b" (UID: "229e887b-d4b7-4c4b-bc32-7bf11c23ef4b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:32:29 crc kubenswrapper[4733]: I1206 06:32:29.785077 4733 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/229e887b-d4b7-4c4b-bc32-7bf11c23ef4b-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 06:32:29 crc kubenswrapper[4733]: I1206 06:32:29.790860 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/229e887b-d4b7-4c4b-bc32-7bf11c23ef4b-kube-api-access-7wr4m" (OuterVolumeSpecName: "kube-api-access-7wr4m") pod "229e887b-d4b7-4c4b-bc32-7bf11c23ef4b" (UID: "229e887b-d4b7-4c4b-bc32-7bf11c23ef4b"). InnerVolumeSpecName "kube-api-access-7wr4m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:32:29 crc kubenswrapper[4733]: I1206 06:32:29.876432 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/229e887b-d4b7-4c4b-bc32-7bf11c23ef4b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "229e887b-d4b7-4c4b-bc32-7bf11c23ef4b" (UID: "229e887b-d4b7-4c4b-bc32-7bf11c23ef4b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:32:29 crc kubenswrapper[4733]: I1206 06:32:29.887253 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wr4m\" (UniqueName: \"kubernetes.io/projected/229e887b-d4b7-4c4b-bc32-7bf11c23ef4b-kube-api-access-7wr4m\") on node \"crc\" DevicePath \"\"" Dec 06 06:32:29 crc kubenswrapper[4733]: I1206 06:32:29.887286 4733 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/229e887b-d4b7-4c4b-bc32-7bf11c23ef4b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 06:32:30 crc kubenswrapper[4733]: I1206 06:32:30.216351 4733 generic.go:334] "Generic (PLEG): container finished" podID="229e887b-d4b7-4c4b-bc32-7bf11c23ef4b" containerID="4977edfd8d30a1e9ec61debc0d391c1009e4596c2539679526ff14a5a604c794" exitCode=0 Dec 06 06:32:30 crc kubenswrapper[4733]: I1206 06:32:30.216531 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pgm57" Dec 06 06:32:30 crc kubenswrapper[4733]: I1206 06:32:30.216586 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pgm57" event={"ID":"229e887b-d4b7-4c4b-bc32-7bf11c23ef4b","Type":"ContainerDied","Data":"4977edfd8d30a1e9ec61debc0d391c1009e4596c2539679526ff14a5a604c794"} Dec 06 06:32:30 crc kubenswrapper[4733]: I1206 06:32:30.217417 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pgm57" event={"ID":"229e887b-d4b7-4c4b-bc32-7bf11c23ef4b","Type":"ContainerDied","Data":"f3985a189e731cf9e6e621b4c3c86ba15e72564e2ae7649ae0bc977b5b1bc7ca"} Dec 06 06:32:30 crc kubenswrapper[4733]: I1206 06:32:30.217448 4733 scope.go:117] "RemoveContainer" containerID="4977edfd8d30a1e9ec61debc0d391c1009e4596c2539679526ff14a5a604c794" Dec 06 06:32:30 crc kubenswrapper[4733]: I1206 06:32:30.250982 4733 scope.go:117] "RemoveContainer" 
containerID="6410aa032e0a6263a8845e3cf4d895a9f11308b599a2527613f763dbc9a3b359" Dec 06 06:32:30 crc kubenswrapper[4733]: I1206 06:32:30.251172 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pgm57"] Dec 06 06:32:30 crc kubenswrapper[4733]: I1206 06:32:30.257454 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pgm57"] Dec 06 06:32:30 crc kubenswrapper[4733]: I1206 06:32:30.268377 4733 scope.go:117] "RemoveContainer" containerID="2cc88b46554bb25c97d443fe5a0fa9bdfce419bc393e66dca99f6d9e8b1e12d2" Dec 06 06:32:30 crc kubenswrapper[4733]: I1206 06:32:30.301351 4733 scope.go:117] "RemoveContainer" containerID="4977edfd8d30a1e9ec61debc0d391c1009e4596c2539679526ff14a5a604c794" Dec 06 06:32:30 crc kubenswrapper[4733]: E1206 06:32:30.301680 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4977edfd8d30a1e9ec61debc0d391c1009e4596c2539679526ff14a5a604c794\": container with ID starting with 4977edfd8d30a1e9ec61debc0d391c1009e4596c2539679526ff14a5a604c794 not found: ID does not exist" containerID="4977edfd8d30a1e9ec61debc0d391c1009e4596c2539679526ff14a5a604c794" Dec 06 06:32:30 crc kubenswrapper[4733]: I1206 06:32:30.301723 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4977edfd8d30a1e9ec61debc0d391c1009e4596c2539679526ff14a5a604c794"} err="failed to get container status \"4977edfd8d30a1e9ec61debc0d391c1009e4596c2539679526ff14a5a604c794\": rpc error: code = NotFound desc = could not find container \"4977edfd8d30a1e9ec61debc0d391c1009e4596c2539679526ff14a5a604c794\": container with ID starting with 4977edfd8d30a1e9ec61debc0d391c1009e4596c2539679526ff14a5a604c794 not found: ID does not exist" Dec 06 06:32:30 crc kubenswrapper[4733]: I1206 06:32:30.301751 4733 scope.go:117] "RemoveContainer" 
containerID="6410aa032e0a6263a8845e3cf4d895a9f11308b599a2527613f763dbc9a3b359" Dec 06 06:32:30 crc kubenswrapper[4733]: E1206 06:32:30.302036 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6410aa032e0a6263a8845e3cf4d895a9f11308b599a2527613f763dbc9a3b359\": container with ID starting with 6410aa032e0a6263a8845e3cf4d895a9f11308b599a2527613f763dbc9a3b359 not found: ID does not exist" containerID="6410aa032e0a6263a8845e3cf4d895a9f11308b599a2527613f763dbc9a3b359" Dec 06 06:32:30 crc kubenswrapper[4733]: I1206 06:32:30.302127 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6410aa032e0a6263a8845e3cf4d895a9f11308b599a2527613f763dbc9a3b359"} err="failed to get container status \"6410aa032e0a6263a8845e3cf4d895a9f11308b599a2527613f763dbc9a3b359\": rpc error: code = NotFound desc = could not find container \"6410aa032e0a6263a8845e3cf4d895a9f11308b599a2527613f763dbc9a3b359\": container with ID starting with 6410aa032e0a6263a8845e3cf4d895a9f11308b599a2527613f763dbc9a3b359 not found: ID does not exist" Dec 06 06:32:30 crc kubenswrapper[4733]: I1206 06:32:30.302200 4733 scope.go:117] "RemoveContainer" containerID="2cc88b46554bb25c97d443fe5a0fa9bdfce419bc393e66dca99f6d9e8b1e12d2" Dec 06 06:32:30 crc kubenswrapper[4733]: E1206 06:32:30.302528 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cc88b46554bb25c97d443fe5a0fa9bdfce419bc393e66dca99f6d9e8b1e12d2\": container with ID starting with 2cc88b46554bb25c97d443fe5a0fa9bdfce419bc393e66dca99f6d9e8b1e12d2 not found: ID does not exist" containerID="2cc88b46554bb25c97d443fe5a0fa9bdfce419bc393e66dca99f6d9e8b1e12d2" Dec 06 06:32:30 crc kubenswrapper[4733]: I1206 06:32:30.302556 4733 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2cc88b46554bb25c97d443fe5a0fa9bdfce419bc393e66dca99f6d9e8b1e12d2"} err="failed to get container status \"2cc88b46554bb25c97d443fe5a0fa9bdfce419bc393e66dca99f6d9e8b1e12d2\": rpc error: code = NotFound desc = could not find container \"2cc88b46554bb25c97d443fe5a0fa9bdfce419bc393e66dca99f6d9e8b1e12d2\": container with ID starting with 2cc88b46554bb25c97d443fe5a0fa9bdfce419bc393e66dca99f6d9e8b1e12d2 not found: ID does not exist" Dec 06 06:32:30 crc kubenswrapper[4733]: I1206 06:32:30.513180 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="229e887b-d4b7-4c4b-bc32-7bf11c23ef4b" path="/var/lib/kubelet/pods/229e887b-d4b7-4c4b-bc32-7bf11c23ef4b/volumes" Dec 06 06:34:42 crc kubenswrapper[4733]: I1206 06:34:42.989191 4733 patch_prober.go:28] interesting pod/machine-config-daemon-g7qjx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 06:34:42 crc kubenswrapper[4733]: I1206 06:34:42.989968 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 06:35:12 crc kubenswrapper[4733]: I1206 06:35:12.989596 4733 patch_prober.go:28] interesting pod/machine-config-daemon-g7qjx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 06:35:12 crc kubenswrapper[4733]: I1206 06:35:12.990291 4733 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 06:35:42 crc kubenswrapper[4733]: I1206 06:35:42.989050 4733 patch_prober.go:28] interesting pod/machine-config-daemon-g7qjx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 06:35:42 crc kubenswrapper[4733]: I1206 06:35:42.989821 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 06:35:42 crc kubenswrapper[4733]: I1206 06:35:42.989883 4733 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" Dec 06 06:35:42 crc kubenswrapper[4733]: I1206 06:35:42.991057 4733 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0e2824d80f32f689599ffbc13b3712ce74df00c0a5b8b8663ecaa737779a27f1"} pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 06:35:42 crc kubenswrapper[4733]: I1206 06:35:42.991117 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerName="machine-config-daemon" 
containerID="cri-o://0e2824d80f32f689599ffbc13b3712ce74df00c0a5b8b8663ecaa737779a27f1" gracePeriod=600 Dec 06 06:35:43 crc kubenswrapper[4733]: E1206 06:35:43.110817 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:35:44 crc kubenswrapper[4733]: I1206 06:35:44.056094 4733 generic.go:334] "Generic (PLEG): container finished" podID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerID="0e2824d80f32f689599ffbc13b3712ce74df00c0a5b8b8663ecaa737779a27f1" exitCode=0 Dec 06 06:35:44 crc kubenswrapper[4733]: I1206 06:35:44.056147 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" event={"ID":"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448","Type":"ContainerDied","Data":"0e2824d80f32f689599ffbc13b3712ce74df00c0a5b8b8663ecaa737779a27f1"} Dec 06 06:35:44 crc kubenswrapper[4733]: I1206 06:35:44.056576 4733 scope.go:117] "RemoveContainer" containerID="89fe3837db40c5ded85d4685d49c79313ccdd8984a9c4dfbcd4fc13fbc674215" Dec 06 06:35:44 crc kubenswrapper[4733]: I1206 06:35:44.057201 4733 scope.go:117] "RemoveContainer" containerID="0e2824d80f32f689599ffbc13b3712ce74df00c0a5b8b8663ecaa737779a27f1" Dec 06 06:35:44 crc kubenswrapper[4733]: E1206 06:35:44.057677 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" 
podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:35:53 crc kubenswrapper[4733]: I1206 06:35:53.098113 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qpjrr"] Dec 06 06:35:53 crc kubenswrapper[4733]: E1206 06:35:53.099119 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="229e887b-d4b7-4c4b-bc32-7bf11c23ef4b" containerName="registry-server" Dec 06 06:35:53 crc kubenswrapper[4733]: I1206 06:35:53.099135 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="229e887b-d4b7-4c4b-bc32-7bf11c23ef4b" containerName="registry-server" Dec 06 06:35:53 crc kubenswrapper[4733]: E1206 06:35:53.099166 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="229e887b-d4b7-4c4b-bc32-7bf11c23ef4b" containerName="extract-utilities" Dec 06 06:35:53 crc kubenswrapper[4733]: I1206 06:35:53.099172 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="229e887b-d4b7-4c4b-bc32-7bf11c23ef4b" containerName="extract-utilities" Dec 06 06:35:53 crc kubenswrapper[4733]: E1206 06:35:53.099186 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="229e887b-d4b7-4c4b-bc32-7bf11c23ef4b" containerName="extract-content" Dec 06 06:35:53 crc kubenswrapper[4733]: I1206 06:35:53.099193 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="229e887b-d4b7-4c4b-bc32-7bf11c23ef4b" containerName="extract-content" Dec 06 06:35:53 crc kubenswrapper[4733]: I1206 06:35:53.099435 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="229e887b-d4b7-4c4b-bc32-7bf11c23ef4b" containerName="registry-server" Dec 06 06:35:53 crc kubenswrapper[4733]: I1206 06:35:53.100871 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qpjrr" Dec 06 06:35:53 crc kubenswrapper[4733]: I1206 06:35:53.106610 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qpjrr"] Dec 06 06:35:53 crc kubenswrapper[4733]: I1206 06:35:53.216209 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsr4m\" (UniqueName: \"kubernetes.io/projected/2236d7ba-77d3-475f-ba67-715a76b09f28-kube-api-access-gsr4m\") pod \"certified-operators-qpjrr\" (UID: \"2236d7ba-77d3-475f-ba67-715a76b09f28\") " pod="openshift-marketplace/certified-operators-qpjrr" Dec 06 06:35:53 crc kubenswrapper[4733]: I1206 06:35:53.216373 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2236d7ba-77d3-475f-ba67-715a76b09f28-catalog-content\") pod \"certified-operators-qpjrr\" (UID: \"2236d7ba-77d3-475f-ba67-715a76b09f28\") " pod="openshift-marketplace/certified-operators-qpjrr" Dec 06 06:35:53 crc kubenswrapper[4733]: I1206 06:35:53.216451 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2236d7ba-77d3-475f-ba67-715a76b09f28-utilities\") pod \"certified-operators-qpjrr\" (UID: \"2236d7ba-77d3-475f-ba67-715a76b09f28\") " pod="openshift-marketplace/certified-operators-qpjrr" Dec 06 06:35:53 crc kubenswrapper[4733]: I1206 06:35:53.318426 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2236d7ba-77d3-475f-ba67-715a76b09f28-catalog-content\") pod \"certified-operators-qpjrr\" (UID: \"2236d7ba-77d3-475f-ba67-715a76b09f28\") " pod="openshift-marketplace/certified-operators-qpjrr" Dec 06 06:35:53 crc kubenswrapper[4733]: I1206 06:35:53.318754 4733 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2236d7ba-77d3-475f-ba67-715a76b09f28-utilities\") pod \"certified-operators-qpjrr\" (UID: \"2236d7ba-77d3-475f-ba67-715a76b09f28\") " pod="openshift-marketplace/certified-operators-qpjrr" Dec 06 06:35:53 crc kubenswrapper[4733]: I1206 06:35:53.318954 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsr4m\" (UniqueName: \"kubernetes.io/projected/2236d7ba-77d3-475f-ba67-715a76b09f28-kube-api-access-gsr4m\") pod \"certified-operators-qpjrr\" (UID: \"2236d7ba-77d3-475f-ba67-715a76b09f28\") " pod="openshift-marketplace/certified-operators-qpjrr" Dec 06 06:35:53 crc kubenswrapper[4733]: I1206 06:35:53.318950 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2236d7ba-77d3-475f-ba67-715a76b09f28-catalog-content\") pod \"certified-operators-qpjrr\" (UID: \"2236d7ba-77d3-475f-ba67-715a76b09f28\") " pod="openshift-marketplace/certified-operators-qpjrr" Dec 06 06:35:53 crc kubenswrapper[4733]: I1206 06:35:53.319179 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2236d7ba-77d3-475f-ba67-715a76b09f28-utilities\") pod \"certified-operators-qpjrr\" (UID: \"2236d7ba-77d3-475f-ba67-715a76b09f28\") " pod="openshift-marketplace/certified-operators-qpjrr" Dec 06 06:35:53 crc kubenswrapper[4733]: I1206 06:35:53.340243 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsr4m\" (UniqueName: \"kubernetes.io/projected/2236d7ba-77d3-475f-ba67-715a76b09f28-kube-api-access-gsr4m\") pod \"certified-operators-qpjrr\" (UID: \"2236d7ba-77d3-475f-ba67-715a76b09f28\") " pod="openshift-marketplace/certified-operators-qpjrr" Dec 06 06:35:53 crc kubenswrapper[4733]: I1206 06:35:53.419430 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qpjrr" Dec 06 06:35:53 crc kubenswrapper[4733]: I1206 06:35:53.828864 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qpjrr"] Dec 06 06:35:54 crc kubenswrapper[4733]: I1206 06:35:54.154735 4733 generic.go:334] "Generic (PLEG): container finished" podID="2236d7ba-77d3-475f-ba67-715a76b09f28" containerID="5db4adeb3150d07589194b8b6a705922e52a8d9b46d6a92b9a8a216a121e0544" exitCode=0 Dec 06 06:35:54 crc kubenswrapper[4733]: I1206 06:35:54.154836 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qpjrr" event={"ID":"2236d7ba-77d3-475f-ba67-715a76b09f28","Type":"ContainerDied","Data":"5db4adeb3150d07589194b8b6a705922e52a8d9b46d6a92b9a8a216a121e0544"} Dec 06 06:35:54 crc kubenswrapper[4733]: I1206 06:35:54.155081 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qpjrr" event={"ID":"2236d7ba-77d3-475f-ba67-715a76b09f28","Type":"ContainerStarted","Data":"cbfd6ea7911c57c7c0d2d6dfab53ef571483365c5222a48444672d618be685f9"} Dec 06 06:35:54 crc kubenswrapper[4733]: I1206 06:35:54.156361 4733 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 06 06:35:55 crc kubenswrapper[4733]: I1206 06:35:55.165074 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qpjrr" event={"ID":"2236d7ba-77d3-475f-ba67-715a76b09f28","Type":"ContainerStarted","Data":"efe654ffd91729927a72b62d93f2b68459c33ee8dd02d90aa3dec0170de63a7a"} Dec 06 06:35:56 crc kubenswrapper[4733]: I1206 06:35:56.178518 4733 generic.go:334] "Generic (PLEG): container finished" podID="2236d7ba-77d3-475f-ba67-715a76b09f28" containerID="efe654ffd91729927a72b62d93f2b68459c33ee8dd02d90aa3dec0170de63a7a" exitCode=0 Dec 06 06:35:56 crc kubenswrapper[4733]: I1206 06:35:56.178639 4733 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-qpjrr" event={"ID":"2236d7ba-77d3-475f-ba67-715a76b09f28","Type":"ContainerDied","Data":"efe654ffd91729927a72b62d93f2b68459c33ee8dd02d90aa3dec0170de63a7a"} Dec 06 06:35:57 crc kubenswrapper[4733]: I1206 06:35:57.196085 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qpjrr" event={"ID":"2236d7ba-77d3-475f-ba67-715a76b09f28","Type":"ContainerStarted","Data":"af8921057e3dbd92d5090f2e8919bd7dd2bdbf97557e2aa46fce12f70723fd93"} Dec 06 06:35:57 crc kubenswrapper[4733]: I1206 06:35:57.224042 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qpjrr" podStartSLOduration=1.725606692 podStartE2EDuration="4.224021273s" podCreationTimestamp="2025-12-06 06:35:53 +0000 UTC" firstStartedPulling="2025-12-06 06:35:54.156094202 +0000 UTC m=+3138.021305313" lastFinishedPulling="2025-12-06 06:35:56.654508783 +0000 UTC m=+3140.519719894" observedRunningTime="2025-12-06 06:35:57.216432936 +0000 UTC m=+3141.081644047" watchObservedRunningTime="2025-12-06 06:35:57.224021273 +0000 UTC m=+3141.089232385" Dec 06 06:35:59 crc kubenswrapper[4733]: I1206 06:35:59.485627 4733 scope.go:117] "RemoveContainer" containerID="0e2824d80f32f689599ffbc13b3712ce74df00c0a5b8b8663ecaa737779a27f1" Dec 06 06:35:59 crc kubenswrapper[4733]: E1206 06:35:59.486939 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:36:03 crc kubenswrapper[4733]: I1206 06:36:03.419488 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-qpjrr" Dec 06 06:36:03 crc kubenswrapper[4733]: I1206 06:36:03.420019 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qpjrr" Dec 06 06:36:03 crc kubenswrapper[4733]: I1206 06:36:03.458707 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qpjrr" Dec 06 06:36:04 crc kubenswrapper[4733]: I1206 06:36:04.304722 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qpjrr" Dec 06 06:36:04 crc kubenswrapper[4733]: I1206 06:36:04.351588 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qpjrr"] Dec 06 06:36:06 crc kubenswrapper[4733]: I1206 06:36:06.281637 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qpjrr" podUID="2236d7ba-77d3-475f-ba67-715a76b09f28" containerName="registry-server" containerID="cri-o://af8921057e3dbd92d5090f2e8919bd7dd2bdbf97557e2aa46fce12f70723fd93" gracePeriod=2 Dec 06 06:36:06 crc kubenswrapper[4733]: I1206 06:36:06.668203 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qpjrr" Dec 06 06:36:06 crc kubenswrapper[4733]: I1206 06:36:06.723402 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gsr4m\" (UniqueName: \"kubernetes.io/projected/2236d7ba-77d3-475f-ba67-715a76b09f28-kube-api-access-gsr4m\") pod \"2236d7ba-77d3-475f-ba67-715a76b09f28\" (UID: \"2236d7ba-77d3-475f-ba67-715a76b09f28\") " Dec 06 06:36:06 crc kubenswrapper[4733]: I1206 06:36:06.723568 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2236d7ba-77d3-475f-ba67-715a76b09f28-catalog-content\") pod \"2236d7ba-77d3-475f-ba67-715a76b09f28\" (UID: \"2236d7ba-77d3-475f-ba67-715a76b09f28\") " Dec 06 06:36:06 crc kubenswrapper[4733]: I1206 06:36:06.723604 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2236d7ba-77d3-475f-ba67-715a76b09f28-utilities\") pod \"2236d7ba-77d3-475f-ba67-715a76b09f28\" (UID: \"2236d7ba-77d3-475f-ba67-715a76b09f28\") " Dec 06 06:36:06 crc kubenswrapper[4733]: I1206 06:36:06.742972 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2236d7ba-77d3-475f-ba67-715a76b09f28-utilities" (OuterVolumeSpecName: "utilities") pod "2236d7ba-77d3-475f-ba67-715a76b09f28" (UID: "2236d7ba-77d3-475f-ba67-715a76b09f28"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:36:06 crc kubenswrapper[4733]: I1206 06:36:06.754515 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2236d7ba-77d3-475f-ba67-715a76b09f28-kube-api-access-gsr4m" (OuterVolumeSpecName: "kube-api-access-gsr4m") pod "2236d7ba-77d3-475f-ba67-715a76b09f28" (UID: "2236d7ba-77d3-475f-ba67-715a76b09f28"). InnerVolumeSpecName "kube-api-access-gsr4m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:36:06 crc kubenswrapper[4733]: I1206 06:36:06.831009 4733 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2236d7ba-77d3-475f-ba67-715a76b09f28-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 06:36:06 crc kubenswrapper[4733]: I1206 06:36:06.831075 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gsr4m\" (UniqueName: \"kubernetes.io/projected/2236d7ba-77d3-475f-ba67-715a76b09f28-kube-api-access-gsr4m\") on node \"crc\" DevicePath \"\"" Dec 06 06:36:06 crc kubenswrapper[4733]: I1206 06:36:06.835040 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2236d7ba-77d3-475f-ba67-715a76b09f28-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2236d7ba-77d3-475f-ba67-715a76b09f28" (UID: "2236d7ba-77d3-475f-ba67-715a76b09f28"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:36:06 crc kubenswrapper[4733]: I1206 06:36:06.932587 4733 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2236d7ba-77d3-475f-ba67-715a76b09f28-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 06:36:07 crc kubenswrapper[4733]: I1206 06:36:07.298148 4733 generic.go:334] "Generic (PLEG): container finished" podID="2236d7ba-77d3-475f-ba67-715a76b09f28" containerID="af8921057e3dbd92d5090f2e8919bd7dd2bdbf97557e2aa46fce12f70723fd93" exitCode=0 Dec 06 06:36:07 crc kubenswrapper[4733]: I1206 06:36:07.298249 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qpjrr" Dec 06 06:36:07 crc kubenswrapper[4733]: I1206 06:36:07.298255 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qpjrr" event={"ID":"2236d7ba-77d3-475f-ba67-715a76b09f28","Type":"ContainerDied","Data":"af8921057e3dbd92d5090f2e8919bd7dd2bdbf97557e2aa46fce12f70723fd93"} Dec 06 06:36:07 crc kubenswrapper[4733]: I1206 06:36:07.298569 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qpjrr" event={"ID":"2236d7ba-77d3-475f-ba67-715a76b09f28","Type":"ContainerDied","Data":"cbfd6ea7911c57c7c0d2d6dfab53ef571483365c5222a48444672d618be685f9"} Dec 06 06:36:07 crc kubenswrapper[4733]: I1206 06:36:07.298593 4733 scope.go:117] "RemoveContainer" containerID="af8921057e3dbd92d5090f2e8919bd7dd2bdbf97557e2aa46fce12f70723fd93" Dec 06 06:36:07 crc kubenswrapper[4733]: I1206 06:36:07.321047 4733 scope.go:117] "RemoveContainer" containerID="efe654ffd91729927a72b62d93f2b68459c33ee8dd02d90aa3dec0170de63a7a" Dec 06 06:36:07 crc kubenswrapper[4733]: I1206 06:36:07.329340 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qpjrr"] Dec 06 06:36:07 crc kubenswrapper[4733]: I1206 06:36:07.334719 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qpjrr"] Dec 06 06:36:07 crc kubenswrapper[4733]: I1206 06:36:07.353943 4733 scope.go:117] "RemoveContainer" containerID="5db4adeb3150d07589194b8b6a705922e52a8d9b46d6a92b9a8a216a121e0544" Dec 06 06:36:07 crc kubenswrapper[4733]: I1206 06:36:07.373017 4733 scope.go:117] "RemoveContainer" containerID="af8921057e3dbd92d5090f2e8919bd7dd2bdbf97557e2aa46fce12f70723fd93" Dec 06 06:36:07 crc kubenswrapper[4733]: E1206 06:36:07.373404 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"af8921057e3dbd92d5090f2e8919bd7dd2bdbf97557e2aa46fce12f70723fd93\": container with ID starting with af8921057e3dbd92d5090f2e8919bd7dd2bdbf97557e2aa46fce12f70723fd93 not found: ID does not exist" containerID="af8921057e3dbd92d5090f2e8919bd7dd2bdbf97557e2aa46fce12f70723fd93" Dec 06 06:36:07 crc kubenswrapper[4733]: I1206 06:36:07.373438 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af8921057e3dbd92d5090f2e8919bd7dd2bdbf97557e2aa46fce12f70723fd93"} err="failed to get container status \"af8921057e3dbd92d5090f2e8919bd7dd2bdbf97557e2aa46fce12f70723fd93\": rpc error: code = NotFound desc = could not find container \"af8921057e3dbd92d5090f2e8919bd7dd2bdbf97557e2aa46fce12f70723fd93\": container with ID starting with af8921057e3dbd92d5090f2e8919bd7dd2bdbf97557e2aa46fce12f70723fd93 not found: ID does not exist" Dec 06 06:36:07 crc kubenswrapper[4733]: I1206 06:36:07.373465 4733 scope.go:117] "RemoveContainer" containerID="efe654ffd91729927a72b62d93f2b68459c33ee8dd02d90aa3dec0170de63a7a" Dec 06 06:36:07 crc kubenswrapper[4733]: E1206 06:36:07.373799 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efe654ffd91729927a72b62d93f2b68459c33ee8dd02d90aa3dec0170de63a7a\": container with ID starting with efe654ffd91729927a72b62d93f2b68459c33ee8dd02d90aa3dec0170de63a7a not found: ID does not exist" containerID="efe654ffd91729927a72b62d93f2b68459c33ee8dd02d90aa3dec0170de63a7a" Dec 06 06:36:07 crc kubenswrapper[4733]: I1206 06:36:07.373831 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efe654ffd91729927a72b62d93f2b68459c33ee8dd02d90aa3dec0170de63a7a"} err="failed to get container status \"efe654ffd91729927a72b62d93f2b68459c33ee8dd02d90aa3dec0170de63a7a\": rpc error: code = NotFound desc = could not find container \"efe654ffd91729927a72b62d93f2b68459c33ee8dd02d90aa3dec0170de63a7a\": container with ID 
starting with efe654ffd91729927a72b62d93f2b68459c33ee8dd02d90aa3dec0170de63a7a not found: ID does not exist" Dec 06 06:36:07 crc kubenswrapper[4733]: I1206 06:36:07.373853 4733 scope.go:117] "RemoveContainer" containerID="5db4adeb3150d07589194b8b6a705922e52a8d9b46d6a92b9a8a216a121e0544" Dec 06 06:36:07 crc kubenswrapper[4733]: E1206 06:36:07.374101 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5db4adeb3150d07589194b8b6a705922e52a8d9b46d6a92b9a8a216a121e0544\": container with ID starting with 5db4adeb3150d07589194b8b6a705922e52a8d9b46d6a92b9a8a216a121e0544 not found: ID does not exist" containerID="5db4adeb3150d07589194b8b6a705922e52a8d9b46d6a92b9a8a216a121e0544" Dec 06 06:36:07 crc kubenswrapper[4733]: I1206 06:36:07.374122 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5db4adeb3150d07589194b8b6a705922e52a8d9b46d6a92b9a8a216a121e0544"} err="failed to get container status \"5db4adeb3150d07589194b8b6a705922e52a8d9b46d6a92b9a8a216a121e0544\": rpc error: code = NotFound desc = could not find container \"5db4adeb3150d07589194b8b6a705922e52a8d9b46d6a92b9a8a216a121e0544\": container with ID starting with 5db4adeb3150d07589194b8b6a705922e52a8d9b46d6a92b9a8a216a121e0544 not found: ID does not exist" Dec 06 06:36:08 crc kubenswrapper[4733]: I1206 06:36:08.496736 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2236d7ba-77d3-475f-ba67-715a76b09f28" path="/var/lib/kubelet/pods/2236d7ba-77d3-475f-ba67-715a76b09f28/volumes" Dec 06 06:36:14 crc kubenswrapper[4733]: I1206 06:36:14.485618 4733 scope.go:117] "RemoveContainer" containerID="0e2824d80f32f689599ffbc13b3712ce74df00c0a5b8b8663ecaa737779a27f1" Dec 06 06:36:14 crc kubenswrapper[4733]: E1206 06:36:14.486579 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:36:27 crc kubenswrapper[4733]: I1206 06:36:27.485006 4733 scope.go:117] "RemoveContainer" containerID="0e2824d80f32f689599ffbc13b3712ce74df00c0a5b8b8663ecaa737779a27f1" Dec 06 06:36:27 crc kubenswrapper[4733]: E1206 06:36:27.486085 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:36:39 crc kubenswrapper[4733]: I1206 06:36:39.484512 4733 scope.go:117] "RemoveContainer" containerID="0e2824d80f32f689599ffbc13b3712ce74df00c0a5b8b8663ecaa737779a27f1" Dec 06 06:36:39 crc kubenswrapper[4733]: E1206 06:36:39.485450 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:36:51 crc kubenswrapper[4733]: I1206 06:36:51.486533 4733 scope.go:117] "RemoveContainer" containerID="0e2824d80f32f689599ffbc13b3712ce74df00c0a5b8b8663ecaa737779a27f1" Dec 06 06:36:51 crc kubenswrapper[4733]: E1206 06:36:51.487497 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:37:03 crc kubenswrapper[4733]: I1206 06:37:03.484658 4733 scope.go:117] "RemoveContainer" containerID="0e2824d80f32f689599ffbc13b3712ce74df00c0a5b8b8663ecaa737779a27f1" Dec 06 06:37:03 crc kubenswrapper[4733]: E1206 06:37:03.485462 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:37:17 crc kubenswrapper[4733]: I1206 06:37:17.485190 4733 scope.go:117] "RemoveContainer" containerID="0e2824d80f32f689599ffbc13b3712ce74df00c0a5b8b8663ecaa737779a27f1" Dec 06 06:37:17 crc kubenswrapper[4733]: E1206 06:37:17.486206 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:37:32 crc kubenswrapper[4733]: I1206 06:37:32.485141 4733 scope.go:117] "RemoveContainer" containerID="0e2824d80f32f689599ffbc13b3712ce74df00c0a5b8b8663ecaa737779a27f1" Dec 06 06:37:32 crc kubenswrapper[4733]: E1206 06:37:32.485962 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:37:45 crc kubenswrapper[4733]: I1206 06:37:45.485589 4733 scope.go:117] "RemoveContainer" containerID="0e2824d80f32f689599ffbc13b3712ce74df00c0a5b8b8663ecaa737779a27f1" Dec 06 06:37:45 crc kubenswrapper[4733]: E1206 06:37:45.486582 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:37:52 crc kubenswrapper[4733]: I1206 06:37:52.259353 4733 generic.go:334] "Generic (PLEG): container finished" podID="677c0cf0-716e-467c-ac8b-0cd446fb11ed" containerID="0f5bdb5ba39a3245104201cf3960d47cc685b5696f3fd7b9ea7807a9d03656e8" exitCode=0 Dec 06 06:37:52 crc kubenswrapper[4733]: I1206 06:37:52.259454 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"677c0cf0-716e-467c-ac8b-0cd446fb11ed","Type":"ContainerDied","Data":"0f5bdb5ba39a3245104201cf3960d47cc685b5696f3fd7b9ea7807a9d03656e8"} Dec 06 06:37:53 crc kubenswrapper[4733]: I1206 06:37:53.573333 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 06 06:37:53 crc kubenswrapper[4733]: I1206 06:37:53.697639 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/677c0cf0-716e-467c-ac8b-0cd446fb11ed-test-operator-ephemeral-workdir\") pod \"677c0cf0-716e-467c-ac8b-0cd446fb11ed\" (UID: \"677c0cf0-716e-467c-ac8b-0cd446fb11ed\") " Dec 06 06:37:53 crc kubenswrapper[4733]: I1206 06:37:53.697716 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/677c0cf0-716e-467c-ac8b-0cd446fb11ed-ssh-key\") pod \"677c0cf0-716e-467c-ac8b-0cd446fb11ed\" (UID: \"677c0cf0-716e-467c-ac8b-0cd446fb11ed\") " Dec 06 06:37:53 crc kubenswrapper[4733]: I1206 06:37:53.697797 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"677c0cf0-716e-467c-ac8b-0cd446fb11ed\" (UID: \"677c0cf0-716e-467c-ac8b-0cd446fb11ed\") " Dec 06 06:37:53 crc kubenswrapper[4733]: I1206 06:37:53.697846 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/677c0cf0-716e-467c-ac8b-0cd446fb11ed-test-operator-ephemeral-temporary\") pod \"677c0cf0-716e-467c-ac8b-0cd446fb11ed\" (UID: \"677c0cf0-716e-467c-ac8b-0cd446fb11ed\") " Dec 06 06:37:53 crc kubenswrapper[4733]: I1206 06:37:53.697937 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6svrh\" (UniqueName: \"kubernetes.io/projected/677c0cf0-716e-467c-ac8b-0cd446fb11ed-kube-api-access-6svrh\") pod \"677c0cf0-716e-467c-ac8b-0cd446fb11ed\" (UID: \"677c0cf0-716e-467c-ac8b-0cd446fb11ed\") " Dec 06 06:37:53 crc kubenswrapper[4733]: I1206 06:37:53.698035 4733 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/677c0cf0-716e-467c-ac8b-0cd446fb11ed-openstack-config\") pod \"677c0cf0-716e-467c-ac8b-0cd446fb11ed\" (UID: \"677c0cf0-716e-467c-ac8b-0cd446fb11ed\") " Dec 06 06:37:53 crc kubenswrapper[4733]: I1206 06:37:53.698159 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/677c0cf0-716e-467c-ac8b-0cd446fb11ed-openstack-config-secret\") pod \"677c0cf0-716e-467c-ac8b-0cd446fb11ed\" (UID: \"677c0cf0-716e-467c-ac8b-0cd446fb11ed\") " Dec 06 06:37:53 crc kubenswrapper[4733]: I1206 06:37:53.698211 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/677c0cf0-716e-467c-ac8b-0cd446fb11ed-ca-certs\") pod \"677c0cf0-716e-467c-ac8b-0cd446fb11ed\" (UID: \"677c0cf0-716e-467c-ac8b-0cd446fb11ed\") " Dec 06 06:37:53 crc kubenswrapper[4733]: I1206 06:37:53.698329 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/677c0cf0-716e-467c-ac8b-0cd446fb11ed-config-data\") pod \"677c0cf0-716e-467c-ac8b-0cd446fb11ed\" (UID: \"677c0cf0-716e-467c-ac8b-0cd446fb11ed\") " Dec 06 06:37:53 crc kubenswrapper[4733]: I1206 06:37:53.698492 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/677c0cf0-716e-467c-ac8b-0cd446fb11ed-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "677c0cf0-716e-467c-ac8b-0cd446fb11ed" (UID: "677c0cf0-716e-467c-ac8b-0cd446fb11ed"). InnerVolumeSpecName "test-operator-ephemeral-temporary". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:37:53 crc kubenswrapper[4733]: I1206 06:37:53.699247 4733 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/677c0cf0-716e-467c-ac8b-0cd446fb11ed-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Dec 06 06:37:53 crc kubenswrapper[4733]: I1206 06:37:53.699991 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/677c0cf0-716e-467c-ac8b-0cd446fb11ed-config-data" (OuterVolumeSpecName: "config-data") pod "677c0cf0-716e-467c-ac8b-0cd446fb11ed" (UID: "677c0cf0-716e-467c-ac8b-0cd446fb11ed"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:37:53 crc kubenswrapper[4733]: I1206 06:37:53.704179 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/677c0cf0-716e-467c-ac8b-0cd446fb11ed-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "677c0cf0-716e-467c-ac8b-0cd446fb11ed" (UID: "677c0cf0-716e-467c-ac8b-0cd446fb11ed"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:37:53 crc kubenswrapper[4733]: I1206 06:37:53.704385 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/677c0cf0-716e-467c-ac8b-0cd446fb11ed-kube-api-access-6svrh" (OuterVolumeSpecName: "kube-api-access-6svrh") pod "677c0cf0-716e-467c-ac8b-0cd446fb11ed" (UID: "677c0cf0-716e-467c-ac8b-0cd446fb11ed"). InnerVolumeSpecName "kube-api-access-6svrh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:37:53 crc kubenswrapper[4733]: I1206 06:37:53.704841 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "test-operator-logs") pod "677c0cf0-716e-467c-ac8b-0cd446fb11ed" (UID: "677c0cf0-716e-467c-ac8b-0cd446fb11ed"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 06 06:37:53 crc kubenswrapper[4733]: I1206 06:37:53.725326 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/677c0cf0-716e-467c-ac8b-0cd446fb11ed-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "677c0cf0-716e-467c-ac8b-0cd446fb11ed" (UID: "677c0cf0-716e-467c-ac8b-0cd446fb11ed"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:37:53 crc kubenswrapper[4733]: I1206 06:37:53.728172 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/677c0cf0-716e-467c-ac8b-0cd446fb11ed-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "677c0cf0-716e-467c-ac8b-0cd446fb11ed" (UID: "677c0cf0-716e-467c-ac8b-0cd446fb11ed"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:37:53 crc kubenswrapper[4733]: I1206 06:37:53.728200 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/677c0cf0-716e-467c-ac8b-0cd446fb11ed-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "677c0cf0-716e-467c-ac8b-0cd446fb11ed" (UID: "677c0cf0-716e-467c-ac8b-0cd446fb11ed"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:37:53 crc kubenswrapper[4733]: I1206 06:37:53.741876 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/677c0cf0-716e-467c-ac8b-0cd446fb11ed-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "677c0cf0-716e-467c-ac8b-0cd446fb11ed" (UID: "677c0cf0-716e-467c-ac8b-0cd446fb11ed"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:37:53 crc kubenswrapper[4733]: I1206 06:37:53.802174 4733 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/677c0cf0-716e-467c-ac8b-0cd446fb11ed-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 06:37:53 crc kubenswrapper[4733]: I1206 06:37:53.802211 4733 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/677c0cf0-716e-467c-ac8b-0cd446fb11ed-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Dec 06 06:37:53 crc kubenswrapper[4733]: I1206 06:37:53.802227 4733 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/677c0cf0-716e-467c-ac8b-0cd446fb11ed-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 06:37:53 crc kubenswrapper[4733]: I1206 06:37:53.802271 4733 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Dec 06 06:37:53 crc kubenswrapper[4733]: I1206 06:37:53.802287 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6svrh\" (UniqueName: \"kubernetes.io/projected/677c0cf0-716e-467c-ac8b-0cd446fb11ed-kube-api-access-6svrh\") on node \"crc\" DevicePath \"\"" Dec 06 06:37:53 crc kubenswrapper[4733]: I1206 06:37:53.802329 4733 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" 
(UniqueName: \"kubernetes.io/configmap/677c0cf0-716e-467c-ac8b-0cd446fb11ed-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 06 06:37:53 crc kubenswrapper[4733]: I1206 06:37:53.802346 4733 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/677c0cf0-716e-467c-ac8b-0cd446fb11ed-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 06 06:37:53 crc kubenswrapper[4733]: I1206 06:37:53.802359 4733 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/677c0cf0-716e-467c-ac8b-0cd446fb11ed-ca-certs\") on node \"crc\" DevicePath \"\"" Dec 06 06:37:53 crc kubenswrapper[4733]: I1206 06:37:53.819472 4733 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Dec 06 06:37:53 crc kubenswrapper[4733]: I1206 06:37:53.905200 4733 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Dec 06 06:37:54 crc kubenswrapper[4733]: I1206 06:37:54.282104 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"677c0cf0-716e-467c-ac8b-0cd446fb11ed","Type":"ContainerDied","Data":"c6c4dcc05a635a0f9866008f7ae0a82008896c96885962263ff750ebc6af521e"} Dec 06 06:37:54 crc kubenswrapper[4733]: I1206 06:37:54.282171 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6c4dcc05a635a0f9866008f7ae0a82008896c96885962263ff750ebc6af521e" Dec 06 06:37:54 crc kubenswrapper[4733]: I1206 06:37:54.282202 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 06 06:37:58 crc kubenswrapper[4733]: I1206 06:37:58.485886 4733 scope.go:117] "RemoveContainer" containerID="0e2824d80f32f689599ffbc13b3712ce74df00c0a5b8b8663ecaa737779a27f1" Dec 06 06:37:58 crc kubenswrapper[4733]: E1206 06:37:58.487356 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:37:59 crc kubenswrapper[4733]: I1206 06:37:59.869615 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 06 06:37:59 crc kubenswrapper[4733]: E1206 06:37:59.870274 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2236d7ba-77d3-475f-ba67-715a76b09f28" containerName="extract-content" Dec 06 06:37:59 crc kubenswrapper[4733]: I1206 06:37:59.870288 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="2236d7ba-77d3-475f-ba67-715a76b09f28" containerName="extract-content" Dec 06 06:37:59 crc kubenswrapper[4733]: E1206 06:37:59.870322 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="677c0cf0-716e-467c-ac8b-0cd446fb11ed" containerName="tempest-tests-tempest-tests-runner" Dec 06 06:37:59 crc kubenswrapper[4733]: I1206 06:37:59.870330 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="677c0cf0-716e-467c-ac8b-0cd446fb11ed" containerName="tempest-tests-tempest-tests-runner" Dec 06 06:37:59 crc kubenswrapper[4733]: E1206 06:37:59.870357 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2236d7ba-77d3-475f-ba67-715a76b09f28" containerName="extract-utilities" Dec 06 06:37:59 crc kubenswrapper[4733]: 
I1206 06:37:59.870363 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="2236d7ba-77d3-475f-ba67-715a76b09f28" containerName="extract-utilities" Dec 06 06:37:59 crc kubenswrapper[4733]: E1206 06:37:59.870376 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2236d7ba-77d3-475f-ba67-715a76b09f28" containerName="registry-server" Dec 06 06:37:59 crc kubenswrapper[4733]: I1206 06:37:59.870382 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="2236d7ba-77d3-475f-ba67-715a76b09f28" containerName="registry-server" Dec 06 06:37:59 crc kubenswrapper[4733]: I1206 06:37:59.870558 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="677c0cf0-716e-467c-ac8b-0cd446fb11ed" containerName="tempest-tests-tempest-tests-runner" Dec 06 06:37:59 crc kubenswrapper[4733]: I1206 06:37:59.870582 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="2236d7ba-77d3-475f-ba67-715a76b09f28" containerName="registry-server" Dec 06 06:37:59 crc kubenswrapper[4733]: I1206 06:37:59.871173 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 06 06:37:59 crc kubenswrapper[4733]: I1206 06:37:59.873066 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-gz4r5" Dec 06 06:37:59 crc kubenswrapper[4733]: I1206 06:37:59.879339 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 06 06:38:00 crc kubenswrapper[4733]: I1206 06:38:00.048084 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zbn5\" (UniqueName: \"kubernetes.io/projected/53459053-4118-4161-9dc3-dc7781c1f182-kube-api-access-2zbn5\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"53459053-4118-4161-9dc3-dc7781c1f182\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 06 06:38:00 crc kubenswrapper[4733]: I1206 06:38:00.048525 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"53459053-4118-4161-9dc3-dc7781c1f182\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 06 06:38:00 crc kubenswrapper[4733]: I1206 06:38:00.150902 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"53459053-4118-4161-9dc3-dc7781c1f182\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 06 06:38:00 crc kubenswrapper[4733]: I1206 06:38:00.151194 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zbn5\" (UniqueName: 
\"kubernetes.io/projected/53459053-4118-4161-9dc3-dc7781c1f182-kube-api-access-2zbn5\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"53459053-4118-4161-9dc3-dc7781c1f182\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 06 06:38:00 crc kubenswrapper[4733]: I1206 06:38:00.151367 4733 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"53459053-4118-4161-9dc3-dc7781c1f182\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 06 06:38:00 crc kubenswrapper[4733]: I1206 06:38:00.167672 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zbn5\" (UniqueName: \"kubernetes.io/projected/53459053-4118-4161-9dc3-dc7781c1f182-kube-api-access-2zbn5\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"53459053-4118-4161-9dc3-dc7781c1f182\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 06 06:38:00 crc kubenswrapper[4733]: I1206 06:38:00.173968 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"53459053-4118-4161-9dc3-dc7781c1f182\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 06 06:38:00 crc kubenswrapper[4733]: I1206 06:38:00.188442 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 06 06:38:00 crc kubenswrapper[4733]: I1206 06:38:00.571918 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 06 06:38:00 crc kubenswrapper[4733]: W1206 06:38:00.575921 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod53459053_4118_4161_9dc3_dc7781c1f182.slice/crio-ac55eb7d8c246f73246e173372f80684b6647cac3dc4fe14731050dedab5a128 WatchSource:0}: Error finding container ac55eb7d8c246f73246e173372f80684b6647cac3dc4fe14731050dedab5a128: Status 404 returned error can't find the container with id ac55eb7d8c246f73246e173372f80684b6647cac3dc4fe14731050dedab5a128 Dec 06 06:38:01 crc kubenswrapper[4733]: I1206 06:38:01.355707 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"53459053-4118-4161-9dc3-dc7781c1f182","Type":"ContainerStarted","Data":"ac55eb7d8c246f73246e173372f80684b6647cac3dc4fe14731050dedab5a128"} Dec 06 06:38:02 crc kubenswrapper[4733]: I1206 06:38:02.366050 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"53459053-4118-4161-9dc3-dc7781c1f182","Type":"ContainerStarted","Data":"b9e3801dcd6f3ae3eb992e35e3c927b3a8d57f36719a6c335e9d4b6741a66716"} Dec 06 06:38:02 crc kubenswrapper[4733]: I1206 06:38:02.385117 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.540635274 podStartE2EDuration="3.385095514s" podCreationTimestamp="2025-12-06 06:37:59 +0000 UTC" firstStartedPulling="2025-12-06 06:38:00.579409955 +0000 UTC m=+3264.444621066" lastFinishedPulling="2025-12-06 06:38:01.423870195 +0000 UTC m=+3265.289081306" 
observedRunningTime="2025-12-06 06:38:02.377875561 +0000 UTC m=+3266.243086671" watchObservedRunningTime="2025-12-06 06:38:02.385095514 +0000 UTC m=+3266.250306626" Dec 06 06:38:09 crc kubenswrapper[4733]: I1206 06:38:09.484807 4733 scope.go:117] "RemoveContainer" containerID="0e2824d80f32f689599ffbc13b3712ce74df00c0a5b8b8663ecaa737779a27f1" Dec 06 06:38:09 crc kubenswrapper[4733]: E1206 06:38:09.485615 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:38:20 crc kubenswrapper[4733]: I1206 06:38:20.301912 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-v97q5/must-gather-wwhz9"] Dec 06 06:38:20 crc kubenswrapper[4733]: I1206 06:38:20.304021 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-v97q5/must-gather-wwhz9" Dec 06 06:38:20 crc kubenswrapper[4733]: I1206 06:38:20.306748 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-v97q5"/"openshift-service-ca.crt" Dec 06 06:38:20 crc kubenswrapper[4733]: I1206 06:38:20.313086 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-v97q5"/"kube-root-ca.crt" Dec 06 06:38:20 crc kubenswrapper[4733]: I1206 06:38:20.333618 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-v97q5/must-gather-wwhz9"] Dec 06 06:38:20 crc kubenswrapper[4733]: I1206 06:38:20.395475 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpjkd\" (UniqueName: \"kubernetes.io/projected/a7619f33-d848-42f3-8aa0-98d0339f1f1e-kube-api-access-xpjkd\") pod \"must-gather-wwhz9\" (UID: \"a7619f33-d848-42f3-8aa0-98d0339f1f1e\") " pod="openshift-must-gather-v97q5/must-gather-wwhz9" Dec 06 06:38:20 crc kubenswrapper[4733]: I1206 06:38:20.395531 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a7619f33-d848-42f3-8aa0-98d0339f1f1e-must-gather-output\") pod \"must-gather-wwhz9\" (UID: \"a7619f33-d848-42f3-8aa0-98d0339f1f1e\") " pod="openshift-must-gather-v97q5/must-gather-wwhz9" Dec 06 06:38:20 crc kubenswrapper[4733]: I1206 06:38:20.497533 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpjkd\" (UniqueName: \"kubernetes.io/projected/a7619f33-d848-42f3-8aa0-98d0339f1f1e-kube-api-access-xpjkd\") pod \"must-gather-wwhz9\" (UID: \"a7619f33-d848-42f3-8aa0-98d0339f1f1e\") " pod="openshift-must-gather-v97q5/must-gather-wwhz9" Dec 06 06:38:20 crc kubenswrapper[4733]: I1206 06:38:20.497584 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a7619f33-d848-42f3-8aa0-98d0339f1f1e-must-gather-output\") pod \"must-gather-wwhz9\" (UID: \"a7619f33-d848-42f3-8aa0-98d0339f1f1e\") " pod="openshift-must-gather-v97q5/must-gather-wwhz9" Dec 06 06:38:20 crc kubenswrapper[4733]: I1206 06:38:20.497986 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a7619f33-d848-42f3-8aa0-98d0339f1f1e-must-gather-output\") pod \"must-gather-wwhz9\" (UID: \"a7619f33-d848-42f3-8aa0-98d0339f1f1e\") " pod="openshift-must-gather-v97q5/must-gather-wwhz9" Dec 06 06:38:20 crc kubenswrapper[4733]: I1206 06:38:20.516634 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpjkd\" (UniqueName: \"kubernetes.io/projected/a7619f33-d848-42f3-8aa0-98d0339f1f1e-kube-api-access-xpjkd\") pod \"must-gather-wwhz9\" (UID: \"a7619f33-d848-42f3-8aa0-98d0339f1f1e\") " pod="openshift-must-gather-v97q5/must-gather-wwhz9" Dec 06 06:38:20 crc kubenswrapper[4733]: I1206 06:38:20.619814 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-v97q5/must-gather-wwhz9" Dec 06 06:38:21 crc kubenswrapper[4733]: I1206 06:38:21.031811 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-v97q5/must-gather-wwhz9"] Dec 06 06:38:21 crc kubenswrapper[4733]: I1206 06:38:21.550636 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v97q5/must-gather-wwhz9" event={"ID":"a7619f33-d848-42f3-8aa0-98d0339f1f1e","Type":"ContainerStarted","Data":"c639df017338b4c4661447881a5708c6ec642c7866045e1b28de283fc81c21c4"} Dec 06 06:38:24 crc kubenswrapper[4733]: I1206 06:38:24.485066 4733 scope.go:117] "RemoveContainer" containerID="0e2824d80f32f689599ffbc13b3712ce74df00c0a5b8b8663ecaa737779a27f1" Dec 06 06:38:24 crc kubenswrapper[4733]: E1206 06:38:24.485896 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:38:27 crc kubenswrapper[4733]: I1206 06:38:27.604959 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v97q5/must-gather-wwhz9" event={"ID":"a7619f33-d848-42f3-8aa0-98d0339f1f1e","Type":"ContainerStarted","Data":"5ba8e5e1473b496c58e5a42c14ec97058a1b61f4b2bee60baebb6bd038d68f2d"} Dec 06 06:38:27 crc kubenswrapper[4733]: I1206 06:38:27.605356 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v97q5/must-gather-wwhz9" event={"ID":"a7619f33-d848-42f3-8aa0-98d0339f1f1e","Type":"ContainerStarted","Data":"96ef91edcfabee3332b97853ff978a70c62189b5833d5612c41ec5a6e26f6291"} Dec 06 06:38:27 crc kubenswrapper[4733]: I1206 06:38:27.622912 4733 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-must-gather-v97q5/must-gather-wwhz9" podStartSLOduration=1.888458506 podStartE2EDuration="7.622896395s" podCreationTimestamp="2025-12-06 06:38:20 +0000 UTC" firstStartedPulling="2025-12-06 06:38:21.033929316 +0000 UTC m=+3284.899140427" lastFinishedPulling="2025-12-06 06:38:26.768367204 +0000 UTC m=+3290.633578316" observedRunningTime="2025-12-06 06:38:27.618529726 +0000 UTC m=+3291.483740837" watchObservedRunningTime="2025-12-06 06:38:27.622896395 +0000 UTC m=+3291.488107507" Dec 06 06:38:30 crc kubenswrapper[4733]: I1206 06:38:30.314673 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-v97q5/crc-debug-727m5"] Dec 06 06:38:30 crc kubenswrapper[4733]: I1206 06:38:30.316151 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-v97q5/crc-debug-727m5" Dec 06 06:38:30 crc kubenswrapper[4733]: I1206 06:38:30.318343 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-v97q5"/"default-dockercfg-dz6cm" Dec 06 06:38:30 crc kubenswrapper[4733]: I1206 06:38:30.418863 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dc8b2510-5ca0-4d2a-8d3a-e325306586cb-host\") pod \"crc-debug-727m5\" (UID: \"dc8b2510-5ca0-4d2a-8d3a-e325306586cb\") " pod="openshift-must-gather-v97q5/crc-debug-727m5" Dec 06 06:38:30 crc kubenswrapper[4733]: I1206 06:38:30.418927 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82xb6\" (UniqueName: \"kubernetes.io/projected/dc8b2510-5ca0-4d2a-8d3a-e325306586cb-kube-api-access-82xb6\") pod \"crc-debug-727m5\" (UID: \"dc8b2510-5ca0-4d2a-8d3a-e325306586cb\") " pod="openshift-must-gather-v97q5/crc-debug-727m5" Dec 06 06:38:30 crc kubenswrapper[4733]: I1206 06:38:30.519788 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host\" (UniqueName: \"kubernetes.io/host-path/dc8b2510-5ca0-4d2a-8d3a-e325306586cb-host\") pod \"crc-debug-727m5\" (UID: \"dc8b2510-5ca0-4d2a-8d3a-e325306586cb\") " pod="openshift-must-gather-v97q5/crc-debug-727m5" Dec 06 06:38:30 crc kubenswrapper[4733]: I1206 06:38:30.520239 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82xb6\" (UniqueName: \"kubernetes.io/projected/dc8b2510-5ca0-4d2a-8d3a-e325306586cb-kube-api-access-82xb6\") pod \"crc-debug-727m5\" (UID: \"dc8b2510-5ca0-4d2a-8d3a-e325306586cb\") " pod="openshift-must-gather-v97q5/crc-debug-727m5" Dec 06 06:38:30 crc kubenswrapper[4733]: I1206 06:38:30.520042 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dc8b2510-5ca0-4d2a-8d3a-e325306586cb-host\") pod \"crc-debug-727m5\" (UID: \"dc8b2510-5ca0-4d2a-8d3a-e325306586cb\") " pod="openshift-must-gather-v97q5/crc-debug-727m5" Dec 06 06:38:30 crc kubenswrapper[4733]: I1206 06:38:30.538269 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82xb6\" (UniqueName: \"kubernetes.io/projected/dc8b2510-5ca0-4d2a-8d3a-e325306586cb-kube-api-access-82xb6\") pod \"crc-debug-727m5\" (UID: \"dc8b2510-5ca0-4d2a-8d3a-e325306586cb\") " pod="openshift-must-gather-v97q5/crc-debug-727m5" Dec 06 06:38:30 crc kubenswrapper[4733]: I1206 06:38:30.633145 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-v97q5/crc-debug-727m5" Dec 06 06:38:30 crc kubenswrapper[4733]: W1206 06:38:30.668199 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc8b2510_5ca0_4d2a_8d3a_e325306586cb.slice/crio-9d272bf6a1067454bb607d432a0abefeef5e8a6f465927b17128c7a661cd95ba WatchSource:0}: Error finding container 9d272bf6a1067454bb607d432a0abefeef5e8a6f465927b17128c7a661cd95ba: Status 404 returned error can't find the container with id 9d272bf6a1067454bb607d432a0abefeef5e8a6f465927b17128c7a661cd95ba Dec 06 06:38:31 crc kubenswrapper[4733]: I1206 06:38:31.642832 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v97q5/crc-debug-727m5" event={"ID":"dc8b2510-5ca0-4d2a-8d3a-e325306586cb","Type":"ContainerStarted","Data":"9d272bf6a1067454bb607d432a0abefeef5e8a6f465927b17128c7a661cd95ba"} Dec 06 06:38:36 crc kubenswrapper[4733]: I1206 06:38:36.493703 4733 scope.go:117] "RemoveContainer" containerID="0e2824d80f32f689599ffbc13b3712ce74df00c0a5b8b8663ecaa737779a27f1" Dec 06 06:38:36 crc kubenswrapper[4733]: E1206 06:38:36.494391 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:38:44 crc kubenswrapper[4733]: I1206 06:38:44.750839 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v97q5/crc-debug-727m5" event={"ID":"dc8b2510-5ca0-4d2a-8d3a-e325306586cb","Type":"ContainerStarted","Data":"cb5e70fb08f1c6026ae68bfdd3ef53d622b6639f0ce0feeace075b8ad058d59e"} Dec 06 06:38:44 crc kubenswrapper[4733]: I1206 06:38:44.770498 4733 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-v97q5/crc-debug-727m5" podStartSLOduration=1.625505083 podStartE2EDuration="14.770478233s" podCreationTimestamp="2025-12-06 06:38:30 +0000 UTC" firstStartedPulling="2025-12-06 06:38:30.670420406 +0000 UTC m=+3294.535631518" lastFinishedPulling="2025-12-06 06:38:43.815393557 +0000 UTC m=+3307.680604668" observedRunningTime="2025-12-06 06:38:44.764864458 +0000 UTC m=+3308.630075569" watchObservedRunningTime="2025-12-06 06:38:44.770478233 +0000 UTC m=+3308.635689343" Dec 06 06:38:51 crc kubenswrapper[4733]: I1206 06:38:51.485682 4733 scope.go:117] "RemoveContainer" containerID="0e2824d80f32f689599ffbc13b3712ce74df00c0a5b8b8663ecaa737779a27f1" Dec 06 06:38:51 crc kubenswrapper[4733]: E1206 06:38:51.486495 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:39:00 crc kubenswrapper[4733]: I1206 06:39:00.895849 4733 generic.go:334] "Generic (PLEG): container finished" podID="dc8b2510-5ca0-4d2a-8d3a-e325306586cb" containerID="cb5e70fb08f1c6026ae68bfdd3ef53d622b6639f0ce0feeace075b8ad058d59e" exitCode=0 Dec 06 06:39:00 crc kubenswrapper[4733]: I1206 06:39:00.896398 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v97q5/crc-debug-727m5" event={"ID":"dc8b2510-5ca0-4d2a-8d3a-e325306586cb","Type":"ContainerDied","Data":"cb5e70fb08f1c6026ae68bfdd3ef53d622b6639f0ce0feeace075b8ad058d59e"} Dec 06 06:39:02 crc kubenswrapper[4733]: I1206 06:39:02.003845 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-v97q5/crc-debug-727m5" Dec 06 06:39:02 crc kubenswrapper[4733]: I1206 06:39:02.025710 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-v97q5/crc-debug-727m5"] Dec 06 06:39:02 crc kubenswrapper[4733]: I1206 06:39:02.032956 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-v97q5/crc-debug-727m5"] Dec 06 06:39:02 crc kubenswrapper[4733]: I1206 06:39:02.157951 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82xb6\" (UniqueName: \"kubernetes.io/projected/dc8b2510-5ca0-4d2a-8d3a-e325306586cb-kube-api-access-82xb6\") pod \"dc8b2510-5ca0-4d2a-8d3a-e325306586cb\" (UID: \"dc8b2510-5ca0-4d2a-8d3a-e325306586cb\") " Dec 06 06:39:02 crc kubenswrapper[4733]: I1206 06:39:02.158283 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dc8b2510-5ca0-4d2a-8d3a-e325306586cb-host\") pod \"dc8b2510-5ca0-4d2a-8d3a-e325306586cb\" (UID: \"dc8b2510-5ca0-4d2a-8d3a-e325306586cb\") " Dec 06 06:39:02 crc kubenswrapper[4733]: I1206 06:39:02.158665 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dc8b2510-5ca0-4d2a-8d3a-e325306586cb-host" (OuterVolumeSpecName: "host") pod "dc8b2510-5ca0-4d2a-8d3a-e325306586cb" (UID: "dc8b2510-5ca0-4d2a-8d3a-e325306586cb"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 06:39:02 crc kubenswrapper[4733]: I1206 06:39:02.159039 4733 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dc8b2510-5ca0-4d2a-8d3a-e325306586cb-host\") on node \"crc\" DevicePath \"\"" Dec 06 06:39:02 crc kubenswrapper[4733]: I1206 06:39:02.164033 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc8b2510-5ca0-4d2a-8d3a-e325306586cb-kube-api-access-82xb6" (OuterVolumeSpecName: "kube-api-access-82xb6") pod "dc8b2510-5ca0-4d2a-8d3a-e325306586cb" (UID: "dc8b2510-5ca0-4d2a-8d3a-e325306586cb"). InnerVolumeSpecName "kube-api-access-82xb6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:39:02 crc kubenswrapper[4733]: I1206 06:39:02.262082 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82xb6\" (UniqueName: \"kubernetes.io/projected/dc8b2510-5ca0-4d2a-8d3a-e325306586cb-kube-api-access-82xb6\") on node \"crc\" DevicePath \"\"" Dec 06 06:39:02 crc kubenswrapper[4733]: I1206 06:39:02.494279 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc8b2510-5ca0-4d2a-8d3a-e325306586cb" path="/var/lib/kubelet/pods/dc8b2510-5ca0-4d2a-8d3a-e325306586cb/volumes" Dec 06 06:39:02 crc kubenswrapper[4733]: I1206 06:39:02.917083 4733 scope.go:117] "RemoveContainer" containerID="cb5e70fb08f1c6026ae68bfdd3ef53d622b6639f0ce0feeace075b8ad058d59e" Dec 06 06:39:02 crc kubenswrapper[4733]: I1206 06:39:02.917142 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-v97q5/crc-debug-727m5" Dec 06 06:39:03 crc kubenswrapper[4733]: I1206 06:39:03.225112 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-v97q5/crc-debug-vsrcj"] Dec 06 06:39:03 crc kubenswrapper[4733]: E1206 06:39:03.225736 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc8b2510-5ca0-4d2a-8d3a-e325306586cb" containerName="container-00" Dec 06 06:39:03 crc kubenswrapper[4733]: I1206 06:39:03.225753 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc8b2510-5ca0-4d2a-8d3a-e325306586cb" containerName="container-00" Dec 06 06:39:03 crc kubenswrapper[4733]: I1206 06:39:03.226002 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc8b2510-5ca0-4d2a-8d3a-e325306586cb" containerName="container-00" Dec 06 06:39:03 crc kubenswrapper[4733]: I1206 06:39:03.226822 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-v97q5/crc-debug-vsrcj" Dec 06 06:39:03 crc kubenswrapper[4733]: I1206 06:39:03.228916 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-v97q5"/"default-dockercfg-dz6cm" Dec 06 06:39:03 crc kubenswrapper[4733]: I1206 06:39:03.380271 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3c822e4e-1b4b-4aa0-9a86-11274cfd80ee-host\") pod \"crc-debug-vsrcj\" (UID: \"3c822e4e-1b4b-4aa0-9a86-11274cfd80ee\") " pod="openshift-must-gather-v97q5/crc-debug-vsrcj" Dec 06 06:39:03 crc kubenswrapper[4733]: I1206 06:39:03.380787 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zls7b\" (UniqueName: \"kubernetes.io/projected/3c822e4e-1b4b-4aa0-9a86-11274cfd80ee-kube-api-access-zls7b\") pod \"crc-debug-vsrcj\" (UID: \"3c822e4e-1b4b-4aa0-9a86-11274cfd80ee\") " 
pod="openshift-must-gather-v97q5/crc-debug-vsrcj" Dec 06 06:39:03 crc kubenswrapper[4733]: I1206 06:39:03.482755 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3c822e4e-1b4b-4aa0-9a86-11274cfd80ee-host\") pod \"crc-debug-vsrcj\" (UID: \"3c822e4e-1b4b-4aa0-9a86-11274cfd80ee\") " pod="openshift-must-gather-v97q5/crc-debug-vsrcj" Dec 06 06:39:03 crc kubenswrapper[4733]: I1206 06:39:03.482815 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zls7b\" (UniqueName: \"kubernetes.io/projected/3c822e4e-1b4b-4aa0-9a86-11274cfd80ee-kube-api-access-zls7b\") pod \"crc-debug-vsrcj\" (UID: \"3c822e4e-1b4b-4aa0-9a86-11274cfd80ee\") " pod="openshift-must-gather-v97q5/crc-debug-vsrcj" Dec 06 06:39:03 crc kubenswrapper[4733]: I1206 06:39:03.482920 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3c822e4e-1b4b-4aa0-9a86-11274cfd80ee-host\") pod \"crc-debug-vsrcj\" (UID: \"3c822e4e-1b4b-4aa0-9a86-11274cfd80ee\") " pod="openshift-must-gather-v97q5/crc-debug-vsrcj" Dec 06 06:39:03 crc kubenswrapper[4733]: I1206 06:39:03.485418 4733 scope.go:117] "RemoveContainer" containerID="0e2824d80f32f689599ffbc13b3712ce74df00c0a5b8b8663ecaa737779a27f1" Dec 06 06:39:03 crc kubenswrapper[4733]: E1206 06:39:03.485680 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:39:03 crc kubenswrapper[4733]: I1206 06:39:03.499848 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zls7b\" 
(UniqueName: \"kubernetes.io/projected/3c822e4e-1b4b-4aa0-9a86-11274cfd80ee-kube-api-access-zls7b\") pod \"crc-debug-vsrcj\" (UID: \"3c822e4e-1b4b-4aa0-9a86-11274cfd80ee\") " pod="openshift-must-gather-v97q5/crc-debug-vsrcj" Dec 06 06:39:03 crc kubenswrapper[4733]: I1206 06:39:03.543514 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-v97q5/crc-debug-vsrcj" Dec 06 06:39:03 crc kubenswrapper[4733]: W1206 06:39:03.578173 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c822e4e_1b4b_4aa0_9a86_11274cfd80ee.slice/crio-3e4f13b22291f95bba7cb18dbedd214259643395d6e864a17b3d81c35a573d14 WatchSource:0}: Error finding container 3e4f13b22291f95bba7cb18dbedd214259643395d6e864a17b3d81c35a573d14: Status 404 returned error can't find the container with id 3e4f13b22291f95bba7cb18dbedd214259643395d6e864a17b3d81c35a573d14 Dec 06 06:39:03 crc kubenswrapper[4733]: I1206 06:39:03.929110 4733 generic.go:334] "Generic (PLEG): container finished" podID="3c822e4e-1b4b-4aa0-9a86-11274cfd80ee" containerID="35b4058a8074a41cf3e99cb9842733175abc4ec5f2247e5f40c4f1de3ecc9207" exitCode=1 Dec 06 06:39:03 crc kubenswrapper[4733]: I1206 06:39:03.929199 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v97q5/crc-debug-vsrcj" event={"ID":"3c822e4e-1b4b-4aa0-9a86-11274cfd80ee","Type":"ContainerDied","Data":"35b4058a8074a41cf3e99cb9842733175abc4ec5f2247e5f40c4f1de3ecc9207"} Dec 06 06:39:03 crc kubenswrapper[4733]: I1206 06:39:03.929716 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v97q5/crc-debug-vsrcj" event={"ID":"3c822e4e-1b4b-4aa0-9a86-11274cfd80ee","Type":"ContainerStarted","Data":"3e4f13b22291f95bba7cb18dbedd214259643395d6e864a17b3d81c35a573d14"} Dec 06 06:39:03 crc kubenswrapper[4733]: I1206 06:39:03.983521 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-must-gather-v97q5/crc-debug-vsrcj"] Dec 06 06:39:03 crc kubenswrapper[4733]: I1206 06:39:03.995591 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-v97q5/crc-debug-vsrcj"] Dec 06 06:39:05 crc kubenswrapper[4733]: I1206 06:39:05.028605 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-v97q5/crc-debug-vsrcj" Dec 06 06:39:05 crc kubenswrapper[4733]: I1206 06:39:05.115517 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zls7b\" (UniqueName: \"kubernetes.io/projected/3c822e4e-1b4b-4aa0-9a86-11274cfd80ee-kube-api-access-zls7b\") pod \"3c822e4e-1b4b-4aa0-9a86-11274cfd80ee\" (UID: \"3c822e4e-1b4b-4aa0-9a86-11274cfd80ee\") " Dec 06 06:39:05 crc kubenswrapper[4733]: I1206 06:39:05.115582 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3c822e4e-1b4b-4aa0-9a86-11274cfd80ee-host\") pod \"3c822e4e-1b4b-4aa0-9a86-11274cfd80ee\" (UID: \"3c822e4e-1b4b-4aa0-9a86-11274cfd80ee\") " Dec 06 06:39:05 crc kubenswrapper[4733]: I1206 06:39:05.115873 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3c822e4e-1b4b-4aa0-9a86-11274cfd80ee-host" (OuterVolumeSpecName: "host") pod "3c822e4e-1b4b-4aa0-9a86-11274cfd80ee" (UID: "3c822e4e-1b4b-4aa0-9a86-11274cfd80ee"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 06:39:05 crc kubenswrapper[4733]: I1206 06:39:05.123264 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c822e4e-1b4b-4aa0-9a86-11274cfd80ee-kube-api-access-zls7b" (OuterVolumeSpecName: "kube-api-access-zls7b") pod "3c822e4e-1b4b-4aa0-9a86-11274cfd80ee" (UID: "3c822e4e-1b4b-4aa0-9a86-11274cfd80ee"). InnerVolumeSpecName "kube-api-access-zls7b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:39:05 crc kubenswrapper[4733]: I1206 06:39:05.217418 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zls7b\" (UniqueName: \"kubernetes.io/projected/3c822e4e-1b4b-4aa0-9a86-11274cfd80ee-kube-api-access-zls7b\") on node \"crc\" DevicePath \"\"" Dec 06 06:39:05 crc kubenswrapper[4733]: I1206 06:39:05.217449 4733 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3c822e4e-1b4b-4aa0-9a86-11274cfd80ee-host\") on node \"crc\" DevicePath \"\"" Dec 06 06:39:05 crc kubenswrapper[4733]: I1206 06:39:05.952151 4733 scope.go:117] "RemoveContainer" containerID="35b4058a8074a41cf3e99cb9842733175abc4ec5f2247e5f40c4f1de3ecc9207" Dec 06 06:39:05 crc kubenswrapper[4733]: I1206 06:39:05.952193 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-v97q5/crc-debug-vsrcj" Dec 06 06:39:06 crc kubenswrapper[4733]: I1206 06:39:06.495819 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c822e4e-1b4b-4aa0-9a86-11274cfd80ee" path="/var/lib/kubelet/pods/3c822e4e-1b4b-4aa0-9a86-11274cfd80ee/volumes" Dec 06 06:39:14 crc kubenswrapper[4733]: I1206 06:39:14.485853 4733 scope.go:117] "RemoveContainer" containerID="0e2824d80f32f689599ffbc13b3712ce74df00c0a5b8b8663ecaa737779a27f1" Dec 06 06:39:14 crc kubenswrapper[4733]: E1206 06:39:14.486875 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:39:24 crc kubenswrapper[4733]: I1206 06:39:24.881699 4733 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-api-7bf5d6f884-5w5rz_126b09fd-ddf0-4e25-bfab-28f73ca04e50/barbican-api/0.log" Dec 06 06:39:24 crc kubenswrapper[4733]: I1206 06:39:24.988609 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7bf5d6f884-5w5rz_126b09fd-ddf0-4e25-bfab-28f73ca04e50/barbican-api-log/0.log" Dec 06 06:39:25 crc kubenswrapper[4733]: I1206 06:39:25.064137 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6cb949947d-nv4s5_c874c7fc-ab63-41e8-8e5d-921aa5f09e9e/barbican-keystone-listener/0.log" Dec 06 06:39:25 crc kubenswrapper[4733]: I1206 06:39:25.097345 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6cb949947d-nv4s5_c874c7fc-ab63-41e8-8e5d-921aa5f09e9e/barbican-keystone-listener-log/0.log" Dec 06 06:39:25 crc kubenswrapper[4733]: I1206 06:39:25.179421 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5c84494675-5wvrl_0a1b0724-0e18-475b-9f9f-c96bf13e371a/barbican-worker/0.log" Dec 06 06:39:25 crc kubenswrapper[4733]: I1206 06:39:25.238054 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5c84494675-5wvrl_0a1b0724-0e18-475b-9f9f-c96bf13e371a/barbican-worker-log/0.log" Dec 06 06:39:25 crc kubenswrapper[4733]: I1206 06:39:25.365579 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-j4wkv_35843bd8-0d3b-485a-b88f-95933d4c559e/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 06:39:25 crc kubenswrapper[4733]: I1206 06:39:25.413088 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_1edd6e2c-20a9-4584-aa48-64021a2911d3/ceilometer-central-agent/0.log" Dec 06 06:39:25 crc kubenswrapper[4733]: I1206 06:39:25.541892 4733 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_1edd6e2c-20a9-4584-aa48-64021a2911d3/ceilometer-notification-agent/0.log" Dec 06 06:39:25 crc kubenswrapper[4733]: I1206 06:39:25.581527 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_1edd6e2c-20a9-4584-aa48-64021a2911d3/sg-core/0.log" Dec 06 06:39:25 crc kubenswrapper[4733]: I1206 06:39:25.595940 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_1edd6e2c-20a9-4584-aa48-64021a2911d3/proxy-httpd/0.log" Dec 06 06:39:25 crc kubenswrapper[4733]: I1206 06:39:25.760795 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_a2a73638-cf54-461c-a23a-db691593febc/cinder-api-log/0.log" Dec 06 06:39:25 crc kubenswrapper[4733]: I1206 06:39:25.837828 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_a870eae1-25fa-4c68-824e-e14fcd1e98ec/cinder-scheduler/0.log" Dec 06 06:39:25 crc kubenswrapper[4733]: I1206 06:39:25.865752 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_a2a73638-cf54-461c-a23a-db691593febc/cinder-api/0.log" Dec 06 06:39:25 crc kubenswrapper[4733]: I1206 06:39:25.936436 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_a870eae1-25fa-4c68-824e-e14fcd1e98ec/probe/0.log" Dec 06 06:39:26 crc kubenswrapper[4733]: I1206 06:39:26.022572 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-nndq6_f455cdaa-f9af-41b7-8bb3-379d347251ef/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 06:39:26 crc kubenswrapper[4733]: I1206 06:39:26.125293 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-w268k_2de963da-76cb-41fe-9761-8eb801b393a9/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 06:39:26 crc kubenswrapper[4733]: I1206 
06:39:26.185485 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-c8bb97999-hr2v6_470393bd-fe7a-49f4-90f0-3625e4bdb497/init/0.log" Dec 06 06:39:26 crc kubenswrapper[4733]: I1206 06:39:26.521465 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-c8bb97999-hr2v6_470393bd-fe7a-49f4-90f0-3625e4bdb497/init/0.log" Dec 06 06:39:26 crc kubenswrapper[4733]: I1206 06:39:26.544334 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-c8bb97999-hr2v6_470393bd-fe7a-49f4-90f0-3625e4bdb497/dnsmasq-dns/0.log" Dec 06 06:39:26 crc kubenswrapper[4733]: I1206 06:39:26.561900 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-t7mgk_833ce9dd-3791-4da1-9f16-fb8db6d4c205/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 06:39:26 crc kubenswrapper[4733]: I1206 06:39:26.723167 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_d54b6c3c-a2f1-45a1-97f3-a9e95b37f075/glance-log/0.log" Dec 06 06:39:26 crc kubenswrapper[4733]: I1206 06:39:26.740273 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_d54b6c3c-a2f1-45a1-97f3-a9e95b37f075/glance-httpd/0.log" Dec 06 06:39:26 crc kubenswrapper[4733]: I1206 06:39:26.885797 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_4e808010-50b4-4eb5-8dcb-5fa7f7cd7abe/glance-log/0.log" Dec 06 06:39:26 crc kubenswrapper[4733]: I1206 06:39:26.934087 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_4e808010-50b4-4eb5-8dcb-5fa7f7cd7abe/glance-httpd/0.log" Dec 06 06:39:26 crc kubenswrapper[4733]: I1206 06:39:26.937008 4733 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd_10ed3cae-fa08-4e62-af5f-e45711123cb3/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 06:39:27 crc kubenswrapper[4733]: I1206 06:39:27.096668 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-kk2mg_d0b060af-fc34-4b9f-ad66-0ebcd23e5146/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 06:39:27 crc kubenswrapper[4733]: I1206 06:39:27.153273 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29416681-vtzbw_56c9830c-0996-408f-bb43-8d6e2d0eaa2a/keystone-cron/0.log" Dec 06 06:39:27 crc kubenswrapper[4733]: I1206 06:39:27.344173 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_02729323-8acf-44d3-8eec-3194d7531769/kube-state-metrics/0.log" Dec 06 06:39:27 crc kubenswrapper[4733]: I1206 06:39:27.593157 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-ffs52_b0eeb4fd-32c5-425a-b938-49572817e476/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 06:39:27 crc kubenswrapper[4733]: I1206 06:39:27.634278 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-fff9b86f5-qw8vr_4dfea320-4713-41d2-8d4a-ca371c346e9a/keystone-api/0.log" Dec 06 06:39:27 crc kubenswrapper[4733]: I1206 06:39:27.911823 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7dbbf764c5-qntcx_96238eea-ea50-4c05-a33c-ae44b8c7a055/neutron-httpd/0.log" Dec 06 06:39:27 crc kubenswrapper[4733]: I1206 06:39:27.940537 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_56e3883e-d7a5-4735-aee1-9dbb5423c0fe/memcached/0.log" Dec 06 06:39:27 crc kubenswrapper[4733]: I1206 06:39:27.979437 4733 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-7dbbf764c5-qntcx_96238eea-ea50-4c05-a33c-ae44b8c7a055/neutron-api/0.log" Dec 06 06:39:28 crc kubenswrapper[4733]: I1206 06:39:28.137655 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-mk7gc_868cd7d4-8d73-4344-a16d-c4975b6d9249/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 06:39:28 crc kubenswrapper[4733]: I1206 06:39:28.485340 4733 scope.go:117] "RemoveContainer" containerID="0e2824d80f32f689599ffbc13b3712ce74df00c0a5b8b8663ecaa737779a27f1" Dec 06 06:39:28 crc kubenswrapper[4733]: E1206 06:39:28.485643 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:39:28 crc kubenswrapper[4733]: I1206 06:39:28.535455 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_c0f33a48-03f3-4580-8cfb-e6cc7d720ba4/nova-cell0-conductor-conductor/0.log" Dec 06 06:39:28 crc kubenswrapper[4733]: I1206 06:39:28.535551 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_4f9bd130-962d-4315-b471-987273048485/nova-api-log/0.log" Dec 06 06:39:28 crc kubenswrapper[4733]: I1206 06:39:28.631435 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_eeac3d18-b33c-41ec-b72d-4300358e4a96/nova-cell1-conductor-conductor/0.log" Dec 06 06:39:28 crc kubenswrapper[4733]: I1206 06:39:28.707050 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_4f9bd130-962d-4315-b471-987273048485/nova-api-api/0.log" Dec 06 06:39:28 crc kubenswrapper[4733]: 
I1206 06:39:28.781408 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_76922258-485d-4796-9f72-528ec9ec5b24/nova-cell1-novncproxy-novncproxy/0.log" Dec 06 06:39:28 crc kubenswrapper[4733]: I1206 06:39:28.885444 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-q5gb8_ee5d47d4-6f8e-45b9-ac60-208196cbb5d7/nova-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 06:39:28 crc kubenswrapper[4733]: I1206 06:39:28.961938 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_8731fbb5-bb48-4c17-9ab9-6a5584868dc2/nova-metadata-log/0.log" Dec 06 06:39:29 crc kubenswrapper[4733]: I1206 06:39:29.173481 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_b1ab4bcf-72c0-4aa4-8773-8cedd25ea6d5/nova-scheduler-scheduler/0.log" Dec 06 06:39:29 crc kubenswrapper[4733]: I1206 06:39:29.210026 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_3de44369-4819-44c5-a1e5-3ea10b61cf0c/mysql-bootstrap/0.log" Dec 06 06:39:29 crc kubenswrapper[4733]: I1206 06:39:29.352578 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_3de44369-4819-44c5-a1e5-3ea10b61cf0c/mysql-bootstrap/0.log" Dec 06 06:39:29 crc kubenswrapper[4733]: I1206 06:39:29.425192 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_b2b2baf7-95ad-4ff0-a72d-9232137735b6/mysql-bootstrap/0.log" Dec 06 06:39:29 crc kubenswrapper[4733]: I1206 06:39:29.460331 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_3de44369-4819-44c5-a1e5-3ea10b61cf0c/galera/0.log" Dec 06 06:39:29 crc kubenswrapper[4733]: I1206 06:39:29.555919 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_b2b2baf7-95ad-4ff0-a72d-9232137735b6/mysql-bootstrap/0.log" Dec 06 
06:39:29 crc kubenswrapper[4733]: I1206 06:39:29.627752 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_b2b2baf7-95ad-4ff0-a72d-9232137735b6/galera/0.log" Dec 06 06:39:29 crc kubenswrapper[4733]: I1206 06:39:29.684341 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_8731fbb5-bb48-4c17-9ab9-6a5584868dc2/nova-metadata-metadata/0.log" Dec 06 06:39:29 crc kubenswrapper[4733]: I1206 06:39:29.685799 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_8b5bd873-9187-4b04-9274-fd413c995524/openstackclient/0.log" Dec 06 06:39:29 crc kubenswrapper[4733]: I1206 06:39:29.816051 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-2ztw7_5589595d-741e-424a-955a-6fc8b83c18c1/ovn-controller/0.log" Dec 06 06:39:29 crc kubenswrapper[4733]: I1206 06:39:29.879686 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-4bj7k_6e9f6fed-9267-40ab-a945-b575dd0abc9a/openstack-network-exporter/0.log" Dec 06 06:39:29 crc kubenswrapper[4733]: I1206 06:39:29.965825 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-4wzzg_008ba5cf-a311-414d-9d06-a8ad4c038088/ovsdb-server-init/0.log" Dec 06 06:39:30 crc kubenswrapper[4733]: I1206 06:39:30.103055 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-4wzzg_008ba5cf-a311-414d-9d06-a8ad4c038088/ovsdb-server/0.log" Dec 06 06:39:30 crc kubenswrapper[4733]: I1206 06:39:30.106329 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-4wzzg_008ba5cf-a311-414d-9d06-a8ad4c038088/ovs-vswitchd/0.log" Dec 06 06:39:30 crc kubenswrapper[4733]: I1206 06:39:30.161757 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-4wzzg_008ba5cf-a311-414d-9d06-a8ad4c038088/ovsdb-server-init/0.log" Dec 06 06:39:30 crc 
kubenswrapper[4733]: I1206 06:39:30.198515 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-2z74q_a8617856-6710-492d-9bfd-8acd53e89b30/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 06:39:30 crc kubenswrapper[4733]: I1206 06:39:30.284088 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_5ecd64ea-f2f0-4858-8e2e-de61f1d62d26/openstack-network-exporter/0.log" Dec 06 06:39:30 crc kubenswrapper[4733]: I1206 06:39:30.318276 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_5ecd64ea-f2f0-4858-8e2e-de61f1d62d26/ovn-northd/0.log" Dec 06 06:39:30 crc kubenswrapper[4733]: I1206 06:39:30.368562 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_b87d6517-a2ed-458a-9a0e-0945f837a232/openstack-network-exporter/0.log" Dec 06 06:39:30 crc kubenswrapper[4733]: I1206 06:39:30.454451 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_b87d6517-a2ed-458a-9a0e-0945f837a232/ovsdbserver-nb/0.log" Dec 06 06:39:30 crc kubenswrapper[4733]: I1206 06:39:30.508069 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_d0484be5-bcc0-4b5b-8aef-6c9573545b88/openstack-network-exporter/0.log" Dec 06 06:39:30 crc kubenswrapper[4733]: I1206 06:39:30.558329 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_d0484be5-bcc0-4b5b-8aef-6c9573545b88/ovsdbserver-sb/0.log" Dec 06 06:39:30 crc kubenswrapper[4733]: I1206 06:39:30.670323 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-8696d9b56-5s4w8_2e839961-eb72-4d81-baf8-b49f103a8ca0/placement-api/0.log" Dec 06 06:39:30 crc kubenswrapper[4733]: I1206 06:39:30.735802 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-8696d9b56-5s4w8_2e839961-eb72-4d81-baf8-b49f103a8ca0/placement-log/0.log" Dec 06 
06:39:30 crc kubenswrapper[4733]: I1206 06:39:30.757360 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3/setup-container/0.log" Dec 06 06:39:30 crc kubenswrapper[4733]: I1206 06:39:30.945438 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3/rabbitmq/0.log" Dec 06 06:39:30 crc kubenswrapper[4733]: I1206 06:39:30.948039 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_0fa056f3-b465-4a24-9eb5-9a5f5932749c/setup-container/0.log" Dec 06 06:39:30 crc kubenswrapper[4733]: I1206 06:39:30.975532 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3/setup-container/0.log" Dec 06 06:39:31 crc kubenswrapper[4733]: I1206 06:39:31.085492 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_0fa056f3-b465-4a24-9eb5-9a5f5932749c/setup-container/0.log" Dec 06 06:39:31 crc kubenswrapper[4733]: I1206 06:39:31.112686 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_0fa056f3-b465-4a24-9eb5-9a5f5932749c/rabbitmq/0.log" Dec 06 06:39:31 crc kubenswrapper[4733]: I1206 06:39:31.139035 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-7qspq_e2a8649e-0504-47bc-8cea-c95c34f5e416/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 06:39:31 crc kubenswrapper[4733]: I1206 06:39:31.282410 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-kw7zg_e86f6dc2-6ce2-42b0-b4dd-e583d23f5f7c/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 06:39:31 crc kubenswrapper[4733]: I1206 06:39:31.291608 4733 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-769q2_c6d6b59f-d8e8-4f50-9100-a0c789e93a8a/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 06:39:31 crc kubenswrapper[4733]: I1206 06:39:31.363011 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-bfgxd_a5943cb5-9495-43c5-8171-5c6a2df81c31/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 06:39:31 crc kubenswrapper[4733]: I1206 06:39:31.461481 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-54h4l_a3c3e208-9936-4b7d-b7f4-73683f20fc47/ssh-known-hosts-edpm-deployment/0.log" Dec 06 06:39:31 crc kubenswrapper[4733]: I1206 06:39:31.571360 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-67756896f9-p6bgt_80fbf061-d2a6-4265-b412-cbbcdc78515f/proxy-httpd/0.log" Dec 06 06:39:31 crc kubenswrapper[4733]: I1206 06:39:31.588182 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-67756896f9-p6bgt_80fbf061-d2a6-4265-b412-cbbcdc78515f/proxy-server/0.log" Dec 06 06:39:31 crc kubenswrapper[4733]: I1206 06:39:31.647067 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-s28f8_1d720cd5-bb4e-449f-86a3-c9cff2acfada/swift-ring-rebalance/0.log" Dec 06 06:39:31 crc kubenswrapper[4733]: I1206 06:39:31.757956 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a55915f4-28cf-4343-aefa-e6b145b3ccf1/account-auditor/0.log" Dec 06 06:39:31 crc kubenswrapper[4733]: I1206 06:39:31.792529 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a55915f4-28cf-4343-aefa-e6b145b3ccf1/account-reaper/0.log" Dec 06 06:39:31 crc kubenswrapper[4733]: I1206 06:39:31.863924 4733 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_a55915f4-28cf-4343-aefa-e6b145b3ccf1/account-server/0.log" Dec 06 06:39:31 crc kubenswrapper[4733]: I1206 06:39:31.879175 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a55915f4-28cf-4343-aefa-e6b145b3ccf1/account-replicator/0.log" Dec 06 06:39:31 crc kubenswrapper[4733]: I1206 06:39:31.891369 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a55915f4-28cf-4343-aefa-e6b145b3ccf1/container-auditor/0.log" Dec 06 06:39:31 crc kubenswrapper[4733]: I1206 06:39:31.929511 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a55915f4-28cf-4343-aefa-e6b145b3ccf1/container-replicator/0.log" Dec 06 06:39:31 crc kubenswrapper[4733]: I1206 06:39:31.971425 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a55915f4-28cf-4343-aefa-e6b145b3ccf1/container-server/0.log" Dec 06 06:39:32 crc kubenswrapper[4733]: I1206 06:39:32.026218 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a55915f4-28cf-4343-aefa-e6b145b3ccf1/container-updater/0.log" Dec 06 06:39:32 crc kubenswrapper[4733]: I1206 06:39:32.041768 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a55915f4-28cf-4343-aefa-e6b145b3ccf1/object-auditor/0.log" Dec 06 06:39:32 crc kubenswrapper[4733]: I1206 06:39:32.066609 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a55915f4-28cf-4343-aefa-e6b145b3ccf1/object-expirer/0.log" Dec 06 06:39:32 crc kubenswrapper[4733]: I1206 06:39:32.088771 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a55915f4-28cf-4343-aefa-e6b145b3ccf1/object-replicator/0.log" Dec 06 06:39:32 crc kubenswrapper[4733]: I1206 06:39:32.155675 4733 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_a55915f4-28cf-4343-aefa-e6b145b3ccf1/object-server/0.log" Dec 06 06:39:32 crc kubenswrapper[4733]: I1206 06:39:32.190563 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a55915f4-28cf-4343-aefa-e6b145b3ccf1/rsync/0.log" Dec 06 06:39:32 crc kubenswrapper[4733]: I1206 06:39:32.212778 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a55915f4-28cf-4343-aefa-e6b145b3ccf1/object-updater/0.log" Dec 06 06:39:32 crc kubenswrapper[4733]: I1206 06:39:32.249525 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a55915f4-28cf-4343-aefa-e6b145b3ccf1/swift-recon-cron/0.log" Dec 06 06:39:32 crc kubenswrapper[4733]: I1206 06:39:32.348934 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-7zdf6_a0e43d41-4e58-4467-99d0-b782a2f2d65a/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 06:39:32 crc kubenswrapper[4733]: I1206 06:39:32.381154 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_677c0cf0-716e-467c-ac8b-0cd446fb11ed/tempest-tests-tempest-tests-runner/0.log" Dec 06 06:39:32 crc kubenswrapper[4733]: I1206 06:39:32.480173 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_53459053-4118-4161-9dc3-dc7781c1f182/test-operator-logs-container/0.log" Dec 06 06:39:32 crc kubenswrapper[4733]: I1206 06:39:32.558374 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-pw8sd_27d67df7-7cb0-4c5b-ba49-00d9285e1e11/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 06:39:40 crc kubenswrapper[4733]: I1206 06:39:40.485436 4733 scope.go:117] "RemoveContainer" containerID="0e2824d80f32f689599ffbc13b3712ce74df00c0a5b8b8663ecaa737779a27f1" Dec 06 
06:39:40 crc kubenswrapper[4733]: E1206 06:39:40.487103 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:39:50 crc kubenswrapper[4733]: I1206 06:39:50.563142 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafrjrkt_f7c69fa4-d047-47d2-a147-0316949d45c5/util/0.log" Dec 06 06:39:50 crc kubenswrapper[4733]: I1206 06:39:50.684278 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafrjrkt_f7c69fa4-d047-47d2-a147-0316949d45c5/util/0.log" Dec 06 06:39:50 crc kubenswrapper[4733]: I1206 06:39:50.711128 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafrjrkt_f7c69fa4-d047-47d2-a147-0316949d45c5/pull/0.log" Dec 06 06:39:50 crc kubenswrapper[4733]: I1206 06:39:50.758716 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafrjrkt_f7c69fa4-d047-47d2-a147-0316949d45c5/pull/0.log" Dec 06 06:39:50 crc kubenswrapper[4733]: I1206 06:39:50.849242 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafrjrkt_f7c69fa4-d047-47d2-a147-0316949d45c5/util/0.log" Dec 06 06:39:50 crc kubenswrapper[4733]: I1206 06:39:50.850885 4733 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafrjrkt_f7c69fa4-d047-47d2-a147-0316949d45c5/pull/0.log" Dec 06 06:39:50 crc kubenswrapper[4733]: I1206 06:39:50.871811 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafrjrkt_f7c69fa4-d047-47d2-a147-0316949d45c5/extract/0.log" Dec 06 06:39:51 crc kubenswrapper[4733]: I1206 06:39:51.031048 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-ht2mw_f138e9fa-e1ea-4b04-b938-0c16b8205fbe/kube-rbac-proxy/0.log" Dec 06 06:39:51 crc kubenswrapper[4733]: I1206 06:39:51.070027 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-ht2mw_f138e9fa-e1ea-4b04-b938-0c16b8205fbe/manager/0.log" Dec 06 06:39:51 crc kubenswrapper[4733]: I1206 06:39:51.121193 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-vrztj_8bbee3a7-9d6f-40d8-a5c2-eee560458e41/kube-rbac-proxy/0.log" Dec 06 06:39:51 crc kubenswrapper[4733]: I1206 06:39:51.216990 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-vrztj_8bbee3a7-9d6f-40d8-a5c2-eee560458e41/manager/0.log" Dec 06 06:39:51 crc kubenswrapper[4733]: I1206 06:39:51.240209 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-rgm8m_715c2050-78f9-4609-9575-a1c85c3b4961/kube-rbac-proxy/0.log" Dec 06 06:39:51 crc kubenswrapper[4733]: I1206 06:39:51.291075 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-rgm8m_715c2050-78f9-4609-9575-a1c85c3b4961/manager/0.log" Dec 06 06:39:51 crc kubenswrapper[4733]: I1206 
06:39:51.394115 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-xcx5w_42977681-d2c6-4ddb-848a-751503543ed4/kube-rbac-proxy/0.log" Dec 06 06:39:51 crc kubenswrapper[4733]: I1206 06:39:51.474457 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-xcx5w_42977681-d2c6-4ddb-848a-751503543ed4/manager/0.log" Dec 06 06:39:51 crc kubenswrapper[4733]: I1206 06:39:51.577067 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-b64rz_018f851e-0c42-4bbd-bea7-7ce45a6e6ebb/kube-rbac-proxy/0.log" Dec 06 06:39:51 crc kubenswrapper[4733]: I1206 06:39:51.586339 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-b64rz_018f851e-0c42-4bbd-bea7-7ce45a6e6ebb/manager/0.log" Dec 06 06:39:51 crc kubenswrapper[4733]: I1206 06:39:51.663487 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-2lblw_331a0926-1e6c-4976-b309-b20537eae22a/kube-rbac-proxy/0.log" Dec 06 06:39:51 crc kubenswrapper[4733]: I1206 06:39:51.747726 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-2lblw_331a0926-1e6c-4976-b309-b20537eae22a/manager/0.log" Dec 06 06:39:51 crc kubenswrapper[4733]: I1206 06:39:51.825579 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-hcfrz_0d1ec2a9-eb8b-48b1-a823-129b8cc68129/kube-rbac-proxy/0.log" Dec 06 06:39:51 crc kubenswrapper[4733]: I1206 06:39:51.958491 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-t7chb_efc4f270-9152-42b0-bd6c-074697502758/kube-rbac-proxy/0.log" 
Dec 06 06:39:52 crc kubenswrapper[4733]: I1206 06:39:52.009996 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-hcfrz_0d1ec2a9-eb8b-48b1-a823-129b8cc68129/manager/0.log" Dec 06 06:39:52 crc kubenswrapper[4733]: I1206 06:39:52.029622 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-t7chb_efc4f270-9152-42b0-bd6c-074697502758/manager/0.log" Dec 06 06:39:52 crc kubenswrapper[4733]: I1206 06:39:52.189694 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-c9lc9_45064622-664d-4424-a01c-0cf85f653a67/kube-rbac-proxy/0.log" Dec 06 06:39:52 crc kubenswrapper[4733]: I1206 06:39:52.270177 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-c9lc9_45064622-664d-4424-a01c-0cf85f653a67/manager/0.log" Dec 06 06:39:52 crc kubenswrapper[4733]: I1206 06:39:52.344198 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-26bnr_bccabc0c-9ad2-47f8-b550-8bff11a103e8/kube-rbac-proxy/0.log" Dec 06 06:39:52 crc kubenswrapper[4733]: I1206 06:39:52.368665 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-26bnr_bccabc0c-9ad2-47f8-b550-8bff11a103e8/manager/0.log" Dec 06 06:39:52 crc kubenswrapper[4733]: I1206 06:39:52.426661 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-7kmwq_fc37a812-3bfe-4e10-ba93-4e8fdc45361f/kube-rbac-proxy/0.log" Dec 06 06:39:52 crc kubenswrapper[4733]: I1206 06:39:52.484858 4733 scope.go:117] "RemoveContainer" containerID="0e2824d80f32f689599ffbc13b3712ce74df00c0a5b8b8663ecaa737779a27f1" Dec 06 06:39:52 crc 
kubenswrapper[4733]: E1206 06:39:52.485199 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:39:52 crc kubenswrapper[4733]: I1206 06:39:52.563078 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-7kmwq_fc37a812-3bfe-4e10-ba93-4e8fdc45361f/manager/0.log" Dec 06 06:39:52 crc kubenswrapper[4733]: I1206 06:39:52.583906 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-j9mvd_1ed48735-3f0e-4777-b3ce-54a09caec1ab/kube-rbac-proxy/0.log" Dec 06 06:39:52 crc kubenswrapper[4733]: I1206 06:39:52.680733 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-j9mvd_1ed48735-3f0e-4777-b3ce-54a09caec1ab/manager/0.log" Dec 06 06:39:52 crc kubenswrapper[4733]: I1206 06:39:52.753699 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-kcq5s_a20c3b8a-0e57-4ba7-92f2-bf01e12bfedb/kube-rbac-proxy/0.log" Dec 06 06:39:52 crc kubenswrapper[4733]: I1206 06:39:52.794531 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-kcq5s_a20c3b8a-0e57-4ba7-92f2-bf01e12bfedb/manager/0.log" Dec 06 06:39:52 crc kubenswrapper[4733]: I1206 06:39:52.909055 4733 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-wwdnq_1bd3247c-9536-44e7-8857-c9fe8aa31383/kube-rbac-proxy/0.log" Dec 06 06:39:52 crc kubenswrapper[4733]: I1206 06:39:52.915785 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-wwdnq_1bd3247c-9536-44e7-8857-c9fe8aa31383/manager/0.log" Dec 06 06:39:53 crc kubenswrapper[4733]: I1206 06:39:53.035978 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-55c85496f58g9vl_bfb7e815-5af6-428e-bfca-d47d2a7a3022/kube-rbac-proxy/0.log" Dec 06 06:39:53 crc kubenswrapper[4733]: I1206 06:39:53.091848 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-55c85496f58g9vl_bfb7e815-5af6-428e-bfca-d47d2a7a3022/manager/0.log" Dec 06 06:39:53 crc kubenswrapper[4733]: I1206 06:39:53.505939 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-55b6fb9447-kmhl4_208b1115-af84-42b4-8425-a576457b38d2/operator/0.log" Dec 06 06:39:53 crc kubenswrapper[4733]: I1206 06:39:53.591217 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-r7q28_ed9fa40d-152c-4e12-8ac4-ccf89c50ade2/registry-server/0.log" Dec 06 06:39:53 crc kubenswrapper[4733]: I1206 06:39:53.695222 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-mcr2h_352c73f3-4cd8-4c2b-a5ba-52c5bc1f78ad/kube-rbac-proxy/0.log" Dec 06 06:39:53 crc kubenswrapper[4733]: I1206 06:39:53.751861 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-mcr2h_352c73f3-4cd8-4c2b-a5ba-52c5bc1f78ad/manager/0.log" Dec 06 06:39:53 crc kubenswrapper[4733]: I1206 06:39:53.805890 
4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-h9stb_d46b59ae-938e-49f6-a9aa-2f78495634c3/kube-rbac-proxy/0.log" Dec 06 06:39:53 crc kubenswrapper[4733]: I1206 06:39:53.911185 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-h9stb_d46b59ae-938e-49f6-a9aa-2f78495634c3/manager/0.log" Dec 06 06:39:54 crc kubenswrapper[4733]: I1206 06:39:54.078523 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-cqrpd_4f2d4dbb-c7fb-46b3-8baf-fb1ac61a12ed/operator/0.log" Dec 06 06:39:54 crc kubenswrapper[4733]: I1206 06:39:54.156161 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-pv7dl_5b3eaa67-83e3-4c9a-bfeb-c315e4f5ac7c/kube-rbac-proxy/0.log" Dec 06 06:39:54 crc kubenswrapper[4733]: I1206 06:39:54.198066 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-pv7dl_5b3eaa67-83e3-4c9a-bfeb-c315e4f5ac7c/manager/0.log" Dec 06 06:39:54 crc kubenswrapper[4733]: I1206 06:39:54.232787 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-54bdf956c4-ckqkj_4f778c13-06e7-4b71-98b8-28e3165cdf8b/manager/0.log" Dec 06 06:39:54 crc kubenswrapper[4733]: I1206 06:39:54.308549 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-2m7gg_eef61090-130b-4d9d-99e8-6cc4bff0b467/kube-rbac-proxy/0.log" Dec 06 06:39:54 crc kubenswrapper[4733]: I1206 06:39:54.390925 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-2m7gg_eef61090-130b-4d9d-99e8-6cc4bff0b467/manager/0.log" Dec 06 06:39:54 
crc kubenswrapper[4733]: I1206 06:39:54.426511 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-nbzjs_e744adbb-1e4c-4461-8892-799f8a42976f/kube-rbac-proxy/0.log"
Dec 06 06:39:54 crc kubenswrapper[4733]: I1206 06:39:54.495166 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-nbzjs_e744adbb-1e4c-4461-8892-799f8a42976f/manager/0.log"
Dec 06 06:39:54 crc kubenswrapper[4733]: I1206 06:39:54.537750 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-9p759_62d11d7c-5132-4e22-9780-2ff475c07618/kube-rbac-proxy/0.log"
Dec 06 06:39:54 crc kubenswrapper[4733]: I1206 06:39:54.576569 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-9p759_62d11d7c-5132-4e22-9780-2ff475c07618/manager/0.log"
Dec 06 06:40:04 crc kubenswrapper[4733]: I1206 06:40:04.485657 4733 scope.go:117] "RemoveContainer" containerID="0e2824d80f32f689599ffbc13b3712ce74df00c0a5b8b8663ecaa737779a27f1"
Dec 06 06:40:04 crc kubenswrapper[4733]: E1206 06:40:04.486692 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448"
Dec 06 06:40:10 crc kubenswrapper[4733]: I1206 06:40:10.171572 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-v42cq_71229745-aa94-4aa5-90c8-95d65fcca563/control-plane-machine-set-operator/0.log"
Dec 06 06:40:10 crc kubenswrapper[4733]: I1206 06:40:10.268232 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-6l9dt_d330f5cc-abab-4367-902f-97e41685007f/kube-rbac-proxy/0.log"
Dec 06 06:40:10 crc kubenswrapper[4733]: I1206 06:40:10.335464 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-6l9dt_d330f5cc-abab-4367-902f-97e41685007f/machine-api-operator/0.log"
Dec 06 06:40:19 crc kubenswrapper[4733]: I1206 06:40:19.486018 4733 scope.go:117] "RemoveContainer" containerID="0e2824d80f32f689599ffbc13b3712ce74df00c0a5b8b8663ecaa737779a27f1"
Dec 06 06:40:19 crc kubenswrapper[4733]: E1206 06:40:19.487818 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448"
Dec 06 06:40:20 crc kubenswrapper[4733]: I1206 06:40:20.759265 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-fr5kh_0e8869e6-7869-47c0-a412-4a4cfa676164/cert-manager-controller/0.log"
Dec 06 06:40:20 crc kubenswrapper[4733]: I1206 06:40:20.866830 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-wt468_69538b45-07e5-4c3d-a653-b10e62688290/cert-manager-cainjector/0.log"
Dec 06 06:40:20 crc kubenswrapper[4733]: I1206 06:40:20.943876 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-ptmkp_5d9f07d4-edb9-4fba-8043-dd05fe08afbb/cert-manager-webhook/0.log"
Dec 06 06:40:31 crc kubenswrapper[4733]: I1206 06:40:31.504468 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-8rcsc_9574c53f-a694-4fe7-a2f3-c2292bf727c1/nmstate-console-plugin/0.log"
Dec 06 06:40:31 crc kubenswrapper[4733]: I1206 06:40:31.660089 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-lwn4x_a88cd06a-4aaf-4fcd-984e-9839be379e86/nmstate-handler/0.log"
Dec 06 06:40:31 crc kubenswrapper[4733]: I1206 06:40:31.689802 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-qbfxz_4b3f3780-1188-4d8c-b369-f01efb0060ae/kube-rbac-proxy/0.log"
Dec 06 06:40:31 crc kubenswrapper[4733]: I1206 06:40:31.768190 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-qbfxz_4b3f3780-1188-4d8c-b369-f01efb0060ae/nmstate-metrics/0.log"
Dec 06 06:40:31 crc kubenswrapper[4733]: I1206 06:40:31.990063 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-98ngj_e54c5e3d-c07f-408d-85a8-83eee0ccfc79/nmstate-operator/0.log"
Dec 06 06:40:32 crc kubenswrapper[4733]: I1206 06:40:32.079748 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-gw2n8_20f6b3c1-8c56-4603-8efc-c5aa7e3420cb/nmstate-webhook/0.log"
Dec 06 06:40:33 crc kubenswrapper[4733]: I1206 06:40:33.485183 4733 scope.go:117] "RemoveContainer" containerID="0e2824d80f32f689599ffbc13b3712ce74df00c0a5b8b8663ecaa737779a27f1"
Dec 06 06:40:33 crc kubenswrapper[4733]: E1206 06:40:33.485841 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448"
Dec 06 06:40:44 crc kubenswrapper[4733]: I1206 06:40:44.669237 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-vkbjg_bdd47a64-10b1-4dce-bec3-88d302bf60e7/kube-rbac-proxy/0.log"
Dec 06 06:40:44 crc kubenswrapper[4733]: I1206 06:40:44.790695 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-vkbjg_bdd47a64-10b1-4dce-bec3-88d302bf60e7/controller/0.log"
Dec 06 06:40:44 crc kubenswrapper[4733]: I1206 06:40:44.880012 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8l7xm_b6c5bf84-f86b-4a51-bd80-0a23163dd42b/cp-frr-files/0.log"
Dec 06 06:40:45 crc kubenswrapper[4733]: I1206 06:40:45.013229 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8l7xm_b6c5bf84-f86b-4a51-bd80-0a23163dd42b/cp-frr-files/0.log"
Dec 06 06:40:45 crc kubenswrapper[4733]: I1206 06:40:45.030230 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8l7xm_b6c5bf84-f86b-4a51-bd80-0a23163dd42b/cp-reloader/0.log"
Dec 06 06:40:45 crc kubenswrapper[4733]: I1206 06:40:45.033096 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8l7xm_b6c5bf84-f86b-4a51-bd80-0a23163dd42b/cp-reloader/0.log"
Dec 06 06:40:45 crc kubenswrapper[4733]: I1206 06:40:45.039585 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8l7xm_b6c5bf84-f86b-4a51-bd80-0a23163dd42b/cp-metrics/0.log"
Dec 06 06:40:45 crc kubenswrapper[4733]: I1206 06:40:45.207181 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8l7xm_b6c5bf84-f86b-4a51-bd80-0a23163dd42b/cp-reloader/0.log"
Dec 06 06:40:45 crc kubenswrapper[4733]: I1206 06:40:45.213779 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8l7xm_b6c5bf84-f86b-4a51-bd80-0a23163dd42b/cp-frr-files/0.log"
Dec 06 06:40:45 crc kubenswrapper[4733]: I1206 06:40:45.223778 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8l7xm_b6c5bf84-f86b-4a51-bd80-0a23163dd42b/cp-metrics/0.log"
Dec 06 06:40:45 crc kubenswrapper[4733]: I1206 06:40:45.234392 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8l7xm_b6c5bf84-f86b-4a51-bd80-0a23163dd42b/cp-metrics/0.log"
Dec 06 06:40:45 crc kubenswrapper[4733]: I1206 06:40:45.389560 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8l7xm_b6c5bf84-f86b-4a51-bd80-0a23163dd42b/cp-metrics/0.log"
Dec 06 06:40:45 crc kubenswrapper[4733]: I1206 06:40:45.400294 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8l7xm_b6c5bf84-f86b-4a51-bd80-0a23163dd42b/cp-frr-files/0.log"
Dec 06 06:40:45 crc kubenswrapper[4733]: I1206 06:40:45.419055 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8l7xm_b6c5bf84-f86b-4a51-bd80-0a23163dd42b/controller/0.log"
Dec 06 06:40:45 crc kubenswrapper[4733]: I1206 06:40:45.428975 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8l7xm_b6c5bf84-f86b-4a51-bd80-0a23163dd42b/cp-reloader/0.log"
Dec 06 06:40:45 crc kubenswrapper[4733]: I1206 06:40:45.484515 4733 scope.go:117] "RemoveContainer" containerID="0e2824d80f32f689599ffbc13b3712ce74df00c0a5b8b8663ecaa737779a27f1"
Dec 06 06:40:45 crc kubenswrapper[4733]: I1206 06:40:45.562977 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8l7xm_b6c5bf84-f86b-4a51-bd80-0a23163dd42b/frr-metrics/0.log"
Dec 06 06:40:45 crc kubenswrapper[4733]: I1206 06:40:45.613754 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8l7xm_b6c5bf84-f86b-4a51-bd80-0a23163dd42b/kube-rbac-proxy/0.log"
Dec 06 06:40:45 crc kubenswrapper[4733]: I1206 06:40:45.642860 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8l7xm_b6c5bf84-f86b-4a51-bd80-0a23163dd42b/kube-rbac-proxy-frr/0.log"
Dec 06 06:40:45 crc kubenswrapper[4733]: I1206 06:40:45.794530 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8l7xm_b6c5bf84-f86b-4a51-bd80-0a23163dd42b/reloader/0.log"
Dec 06 06:40:45 crc kubenswrapper[4733]: I1206 06:40:45.835569 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-mqg7r_cd3104a0-fb18-4c26-9049-19967b2d5060/frr-k8s-webhook-server/0.log"
Dec 06 06:40:45 crc kubenswrapper[4733]: I1206 06:40:45.838141 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" event={"ID":"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448","Type":"ContainerStarted","Data":"43f30aa5837a5bee8e7ecac27f1d786b0dee6b7686b928f4dd37a1272c540bbf"}
Dec 06 06:40:46 crc kubenswrapper[4733]: I1206 06:40:46.061153 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-58dbb79b86-frmc6_652ac185-5c87-4354-9f1d-0c103702a926/manager/0.log"
Dec 06 06:40:46 crc kubenswrapper[4733]: I1206 06:40:46.189509 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-69cb7d5cf9-6jpn6_ad526f45-2ff7-4236-8ed5-161860544782/webhook-server/0.log"
Dec 06 06:40:46 crc kubenswrapper[4733]: I1206 06:40:46.284802 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-l9xh2_6361843f-1584-4663-840b-7891442d913f/kube-rbac-proxy/0.log"
Dec 06 06:40:46 crc kubenswrapper[4733]: I1206 06:40:46.783062 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-l9xh2_6361843f-1584-4663-840b-7891442d913f/speaker/0.log"
Dec 06 06:40:46 crc kubenswrapper[4733]: I1206 06:40:46.884095 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8l7xm_b6c5bf84-f86b-4a51-bd80-0a23163dd42b/frr/0.log"
Dec 06 06:40:57 crc kubenswrapper[4733]: I1206 06:40:57.527478 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwp2wx_1537e842-7b52-4886-81ea-989848fc3407/util/0.log"
Dec 06 06:40:57 crc kubenswrapper[4733]: I1206 06:40:57.656155 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwp2wx_1537e842-7b52-4886-81ea-989848fc3407/util/0.log"
Dec 06 06:40:57 crc kubenswrapper[4733]: I1206 06:40:57.690535 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwp2wx_1537e842-7b52-4886-81ea-989848fc3407/pull/0.log"
Dec 06 06:40:57 crc kubenswrapper[4733]: I1206 06:40:57.691465 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwp2wx_1537e842-7b52-4886-81ea-989848fc3407/pull/0.log"
Dec 06 06:40:57 crc kubenswrapper[4733]: I1206 06:40:57.851377 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwp2wx_1537e842-7b52-4886-81ea-989848fc3407/pull/0.log"
Dec 06 06:40:57 crc kubenswrapper[4733]: I1206 06:40:57.855202 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwp2wx_1537e842-7b52-4886-81ea-989848fc3407/util/0.log"
Dec 06 06:40:57 crc kubenswrapper[4733]: I1206 06:40:57.867714 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwp2wx_1537e842-7b52-4886-81ea-989848fc3407/extract/0.log"
Dec 06 06:40:58 crc kubenswrapper[4733]: I1206 06:40:58.010957 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wbs6g_cbcb89dd-b5ba-4b72-9a34-24048c6b7275/util/0.log"
Dec 06 06:40:58 crc kubenswrapper[4733]: I1206 06:40:58.192935 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wbs6g_cbcb89dd-b5ba-4b72-9a34-24048c6b7275/pull/0.log"
Dec 06 06:40:58 crc kubenswrapper[4733]: I1206 06:40:58.193237 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wbs6g_cbcb89dd-b5ba-4b72-9a34-24048c6b7275/pull/0.log"
Dec 06 06:40:58 crc kubenswrapper[4733]: I1206 06:40:58.193628 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wbs6g_cbcb89dd-b5ba-4b72-9a34-24048c6b7275/util/0.log"
Dec 06 06:40:58 crc kubenswrapper[4733]: I1206 06:40:58.369768 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wbs6g_cbcb89dd-b5ba-4b72-9a34-24048c6b7275/util/0.log"
Dec 06 06:40:58 crc kubenswrapper[4733]: I1206 06:40:58.393960 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wbs6g_cbcb89dd-b5ba-4b72-9a34-24048c6b7275/pull/0.log"
Dec 06 06:40:58 crc kubenswrapper[4733]: I1206 06:40:58.407425 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wbs6g_cbcb89dd-b5ba-4b72-9a34-24048c6b7275/extract/0.log"
Dec 06 06:40:58 crc kubenswrapper[4733]: I1206 06:40:58.527328 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-g8pgx_db6bf699-eb17-41f2-a2be-e30f7a341840/extract-utilities/0.log"
Dec 06 06:40:58 crc kubenswrapper[4733]: I1206 06:40:58.846215 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-g8pgx_db6bf699-eb17-41f2-a2be-e30f7a341840/extract-utilities/0.log"
Dec 06 06:40:58 crc kubenswrapper[4733]: I1206 06:40:58.857709 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-g8pgx_db6bf699-eb17-41f2-a2be-e30f7a341840/extract-content/0.log"
Dec 06 06:40:58 crc kubenswrapper[4733]: I1206 06:40:58.863703 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-g8pgx_db6bf699-eb17-41f2-a2be-e30f7a341840/extract-content/0.log"
Dec 06 06:40:58 crc kubenswrapper[4733]: I1206 06:40:58.996142 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-g8pgx_db6bf699-eb17-41f2-a2be-e30f7a341840/extract-utilities/0.log"
Dec 06 06:40:59 crc kubenswrapper[4733]: I1206 06:40:59.023092 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-g8pgx_db6bf699-eb17-41f2-a2be-e30f7a341840/extract-content/0.log"
Dec 06 06:40:59 crc kubenswrapper[4733]: I1206 06:40:59.179583 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nrhdv_a08a3e05-bf85-4e28-bbe1-9a9675b9efd9/extract-utilities/0.log"
Dec 06 06:40:59 crc kubenswrapper[4733]: I1206 06:40:59.389448 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-g8pgx_db6bf699-eb17-41f2-a2be-e30f7a341840/registry-server/0.log"
Dec 06 06:40:59 crc kubenswrapper[4733]: I1206 06:40:59.428950 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nrhdv_a08a3e05-bf85-4e28-bbe1-9a9675b9efd9/extract-content/0.log"
Dec 06 06:40:59 crc kubenswrapper[4733]: I1206 06:40:59.447458 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nrhdv_a08a3e05-bf85-4e28-bbe1-9a9675b9efd9/extract-content/0.log"
Dec 06 06:40:59 crc kubenswrapper[4733]: I1206 06:40:59.478660 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nrhdv_a08a3e05-bf85-4e28-bbe1-9a9675b9efd9/extract-utilities/0.log"
Dec 06 06:40:59 crc kubenswrapper[4733]: I1206 06:40:59.622607 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nrhdv_a08a3e05-bf85-4e28-bbe1-9a9675b9efd9/extract-content/0.log"
Dec 06 06:40:59 crc kubenswrapper[4733]: I1206 06:40:59.644989 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nrhdv_a08a3e05-bf85-4e28-bbe1-9a9675b9efd9/extract-utilities/0.log"
Dec 06 06:40:59 crc kubenswrapper[4733]: I1206 06:40:59.842483 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-xq44c_477154e1-6166-41c9-beb3-1248e1583324/marketplace-operator/3.log"
Dec 06 06:40:59 crc kubenswrapper[4733]: I1206 06:40:59.944888 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nrhdv_a08a3e05-bf85-4e28-bbe1-9a9675b9efd9/registry-server/0.log"
Dec 06 06:40:59 crc kubenswrapper[4733]: I1206 06:40:59.958170 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-xq44c_477154e1-6166-41c9-beb3-1248e1583324/marketplace-operator/2.log"
Dec 06 06:41:00 crc kubenswrapper[4733]: I1206 06:41:00.085502 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-76k6r_87626d39-c79f-487c-819c-95eec3d5d5a3/extract-utilities/0.log"
Dec 06 06:41:00 crc kubenswrapper[4733]: I1206 06:41:00.254777 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-76k6r_87626d39-c79f-487c-819c-95eec3d5d5a3/extract-content/0.log"
Dec 06 06:41:00 crc kubenswrapper[4733]: I1206 06:41:00.255158 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-76k6r_87626d39-c79f-487c-819c-95eec3d5d5a3/extract-utilities/0.log"
Dec 06 06:41:00 crc kubenswrapper[4733]: I1206 06:41:00.284258 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-76k6r_87626d39-c79f-487c-819c-95eec3d5d5a3/extract-content/0.log"
Dec 06 06:41:00 crc kubenswrapper[4733]: I1206 06:41:00.366964 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-76k6r_87626d39-c79f-487c-819c-95eec3d5d5a3/extract-utilities/0.log"
Dec 06 06:41:00 crc kubenswrapper[4733]: I1206 06:41:00.399532 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-76k6r_87626d39-c79f-487c-819c-95eec3d5d5a3/extract-content/0.log"
Dec 06 06:41:00 crc kubenswrapper[4733]: I1206 06:41:00.520654 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-76k6r_87626d39-c79f-487c-819c-95eec3d5d5a3/registry-server/0.log"
Dec 06 06:41:00 crc kubenswrapper[4733]: I1206 06:41:00.568716 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tzhq7_f4be7c3a-dabf-4f6d-8488-17f680198610/extract-utilities/0.log"
Dec 06 06:41:00 crc kubenswrapper[4733]: I1206 06:41:00.752174 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tzhq7_f4be7c3a-dabf-4f6d-8488-17f680198610/extract-content/0.log"
Dec 06 06:41:00 crc kubenswrapper[4733]: I1206 06:41:00.753261 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tzhq7_f4be7c3a-dabf-4f6d-8488-17f680198610/extract-content/0.log"
Dec 06 06:41:00 crc kubenswrapper[4733]: I1206 06:41:00.758058 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tzhq7_f4be7c3a-dabf-4f6d-8488-17f680198610/extract-utilities/0.log"
Dec 06 06:41:00 crc kubenswrapper[4733]: I1206 06:41:00.882819 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tzhq7_f4be7c3a-dabf-4f6d-8488-17f680198610/extract-utilities/0.log"
Dec 06 06:41:00 crc kubenswrapper[4733]: I1206 06:41:00.905717 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tzhq7_f4be7c3a-dabf-4f6d-8488-17f680198610/extract-content/0.log"
Dec 06 06:41:01 crc kubenswrapper[4733]: I1206 06:41:01.266026 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tzhq7_f4be7c3a-dabf-4f6d-8488-17f680198610/registry-server/0.log"
Dec 06 06:41:46 crc kubenswrapper[4733]: I1206 06:41:46.997582 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-95c22"]
Dec 06 06:41:46 crc kubenswrapper[4733]: E1206 06:41:46.998580 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c822e4e-1b4b-4aa0-9a86-11274cfd80ee" containerName="container-00"
Dec 06 06:41:46 crc kubenswrapper[4733]: I1206 06:41:46.998595 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c822e4e-1b4b-4aa0-9a86-11274cfd80ee" containerName="container-00"
Dec 06 06:41:46 crc kubenswrapper[4733]: I1206 06:41:46.998775 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c822e4e-1b4b-4aa0-9a86-11274cfd80ee" containerName="container-00"
Dec 06 06:41:47 crc kubenswrapper[4733]: I1206 06:41:47.000064 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-95c22"
Dec 06 06:41:47 crc kubenswrapper[4733]: I1206 06:41:47.013267 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-95c22"]
Dec 06 06:41:47 crc kubenswrapper[4733]: I1206 06:41:47.105267 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/294564b7-90cb-4ef0-8bdf-6b125cd76b6c-catalog-content\") pod \"redhat-marketplace-95c22\" (UID: \"294564b7-90cb-4ef0-8bdf-6b125cd76b6c\") " pod="openshift-marketplace/redhat-marketplace-95c22"
Dec 06 06:41:47 crc kubenswrapper[4733]: I1206 06:41:47.105374 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvtjh\" (UniqueName: \"kubernetes.io/projected/294564b7-90cb-4ef0-8bdf-6b125cd76b6c-kube-api-access-zvtjh\") pod \"redhat-marketplace-95c22\" (UID: \"294564b7-90cb-4ef0-8bdf-6b125cd76b6c\") " pod="openshift-marketplace/redhat-marketplace-95c22"
Dec 06 06:41:47 crc kubenswrapper[4733]: I1206 06:41:47.105399 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/294564b7-90cb-4ef0-8bdf-6b125cd76b6c-utilities\") pod \"redhat-marketplace-95c22\" (UID: \"294564b7-90cb-4ef0-8bdf-6b125cd76b6c\") " pod="openshift-marketplace/redhat-marketplace-95c22"
Dec 06 06:41:47 crc kubenswrapper[4733]: I1206 06:41:47.196512 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9svwg"]
Dec 06 06:41:47 crc kubenswrapper[4733]: I1206 06:41:47.198323 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9svwg"
Dec 06 06:41:47 crc kubenswrapper[4733]: I1206 06:41:47.207432 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/294564b7-90cb-4ef0-8bdf-6b125cd76b6c-catalog-content\") pod \"redhat-marketplace-95c22\" (UID: \"294564b7-90cb-4ef0-8bdf-6b125cd76b6c\") " pod="openshift-marketplace/redhat-marketplace-95c22"
Dec 06 06:41:47 crc kubenswrapper[4733]: I1206 06:41:47.207500 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvtjh\" (UniqueName: \"kubernetes.io/projected/294564b7-90cb-4ef0-8bdf-6b125cd76b6c-kube-api-access-zvtjh\") pod \"redhat-marketplace-95c22\" (UID: \"294564b7-90cb-4ef0-8bdf-6b125cd76b6c\") " pod="openshift-marketplace/redhat-marketplace-95c22"
Dec 06 06:41:47 crc kubenswrapper[4733]: I1206 06:41:47.207522 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/294564b7-90cb-4ef0-8bdf-6b125cd76b6c-utilities\") pod \"redhat-marketplace-95c22\" (UID: \"294564b7-90cb-4ef0-8bdf-6b125cd76b6c\") " pod="openshift-marketplace/redhat-marketplace-95c22"
Dec 06 06:41:47 crc kubenswrapper[4733]: I1206 06:41:47.207977 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/294564b7-90cb-4ef0-8bdf-6b125cd76b6c-utilities\") pod \"redhat-marketplace-95c22\" (UID: \"294564b7-90cb-4ef0-8bdf-6b125cd76b6c\") " pod="openshift-marketplace/redhat-marketplace-95c22"
Dec 06 06:41:47 crc kubenswrapper[4733]: I1206 06:41:47.208193 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/294564b7-90cb-4ef0-8bdf-6b125cd76b6c-catalog-content\") pod \"redhat-marketplace-95c22\" (UID: \"294564b7-90cb-4ef0-8bdf-6b125cd76b6c\") " pod="openshift-marketplace/redhat-marketplace-95c22"
Dec 06 06:41:47 crc kubenswrapper[4733]: I1206 06:41:47.213630 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9svwg"]
Dec 06 06:41:47 crc kubenswrapper[4733]: I1206 06:41:47.232827 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvtjh\" (UniqueName: \"kubernetes.io/projected/294564b7-90cb-4ef0-8bdf-6b125cd76b6c-kube-api-access-zvtjh\") pod \"redhat-marketplace-95c22\" (UID: \"294564b7-90cb-4ef0-8bdf-6b125cd76b6c\") " pod="openshift-marketplace/redhat-marketplace-95c22"
Dec 06 06:41:47 crc kubenswrapper[4733]: I1206 06:41:47.309036 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be369ab4-a311-452c-a8be-b5bc6aaac03a-catalog-content\") pod \"community-operators-9svwg\" (UID: \"be369ab4-a311-452c-a8be-b5bc6aaac03a\") " pod="openshift-marketplace/community-operators-9svwg"
Dec 06 06:41:47 crc kubenswrapper[4733]: I1206 06:41:47.309254 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrv2f\" (UniqueName: \"kubernetes.io/projected/be369ab4-a311-452c-a8be-b5bc6aaac03a-kube-api-access-hrv2f\") pod \"community-operators-9svwg\" (UID: \"be369ab4-a311-452c-a8be-b5bc6aaac03a\") " pod="openshift-marketplace/community-operators-9svwg"
Dec 06 06:41:47 crc kubenswrapper[4733]: I1206 06:41:47.309826 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be369ab4-a311-452c-a8be-b5bc6aaac03a-utilities\") pod \"community-operators-9svwg\" (UID: \"be369ab4-a311-452c-a8be-b5bc6aaac03a\") " pod="openshift-marketplace/community-operators-9svwg"
Dec 06 06:41:47 crc kubenswrapper[4733]: I1206 06:41:47.316491 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-95c22"
Dec 06 06:41:47 crc kubenswrapper[4733]: I1206 06:41:47.414369 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrv2f\" (UniqueName: \"kubernetes.io/projected/be369ab4-a311-452c-a8be-b5bc6aaac03a-kube-api-access-hrv2f\") pod \"community-operators-9svwg\" (UID: \"be369ab4-a311-452c-a8be-b5bc6aaac03a\") " pod="openshift-marketplace/community-operators-9svwg"
Dec 06 06:41:47 crc kubenswrapper[4733]: I1206 06:41:47.414874 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be369ab4-a311-452c-a8be-b5bc6aaac03a-utilities\") pod \"community-operators-9svwg\" (UID: \"be369ab4-a311-452c-a8be-b5bc6aaac03a\") " pod="openshift-marketplace/community-operators-9svwg"
Dec 06 06:41:47 crc kubenswrapper[4733]: I1206 06:41:47.414991 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be369ab4-a311-452c-a8be-b5bc6aaac03a-catalog-content\") pod \"community-operators-9svwg\" (UID: \"be369ab4-a311-452c-a8be-b5bc6aaac03a\") " pod="openshift-marketplace/community-operators-9svwg"
Dec 06 06:41:47 crc kubenswrapper[4733]: I1206 06:41:47.415391 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be369ab4-a311-452c-a8be-b5bc6aaac03a-utilities\") pod \"community-operators-9svwg\" (UID: \"be369ab4-a311-452c-a8be-b5bc6aaac03a\") " pod="openshift-marketplace/community-operators-9svwg"
Dec 06 06:41:47 crc kubenswrapper[4733]: I1206 06:41:47.415489 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be369ab4-a311-452c-a8be-b5bc6aaac03a-catalog-content\") pod \"community-operators-9svwg\" (UID: \"be369ab4-a311-452c-a8be-b5bc6aaac03a\") " pod="openshift-marketplace/community-operators-9svwg"
Dec 06 06:41:47 crc kubenswrapper[4733]: I1206 06:41:47.432318 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrv2f\" (UniqueName: \"kubernetes.io/projected/be369ab4-a311-452c-a8be-b5bc6aaac03a-kube-api-access-hrv2f\") pod \"community-operators-9svwg\" (UID: \"be369ab4-a311-452c-a8be-b5bc6aaac03a\") " pod="openshift-marketplace/community-operators-9svwg"
Dec 06 06:41:47 crc kubenswrapper[4733]: I1206 06:41:47.526162 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9svwg"
Dec 06 06:41:47 crc kubenswrapper[4733]: I1206 06:41:47.743613 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-95c22"]
Dec 06 06:41:47 crc kubenswrapper[4733]: I1206 06:41:47.982791 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9svwg"]
Dec 06 06:41:48 crc kubenswrapper[4733]: I1206 06:41:48.412936 4733 generic.go:334] "Generic (PLEG): container finished" podID="294564b7-90cb-4ef0-8bdf-6b125cd76b6c" containerID="168f5f31f6da5a5d9956b2df046d81cc09f14b700f13214e342d5cfbb397133f" exitCode=0
Dec 06 06:41:48 crc kubenswrapper[4733]: I1206 06:41:48.412995 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-95c22" event={"ID":"294564b7-90cb-4ef0-8bdf-6b125cd76b6c","Type":"ContainerDied","Data":"168f5f31f6da5a5d9956b2df046d81cc09f14b700f13214e342d5cfbb397133f"}
Dec 06 06:41:48 crc kubenswrapper[4733]: I1206 06:41:48.413075 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-95c22" event={"ID":"294564b7-90cb-4ef0-8bdf-6b125cd76b6c","Type":"ContainerStarted","Data":"5fdd6dab7b6491bd583854ac7b939e30b8c92e877d551dd0f2823c06ae82f65f"}
Dec 06 06:41:48 crc kubenswrapper[4733]: I1206 06:41:48.415119 4733 generic.go:334] "Generic (PLEG): container finished" podID="be369ab4-a311-452c-a8be-b5bc6aaac03a" containerID="ae400ac17733b522e891e4ed4f6dce5cf809a7a79fda7dee196c23db4efa801f" exitCode=0
Dec 06 06:41:48 crc kubenswrapper[4733]: I1206 06:41:48.415169 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9svwg" event={"ID":"be369ab4-a311-452c-a8be-b5bc6aaac03a","Type":"ContainerDied","Data":"ae400ac17733b522e891e4ed4f6dce5cf809a7a79fda7dee196c23db4efa801f"}
Dec 06 06:41:48 crc kubenswrapper[4733]: I1206 06:41:48.415202 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9svwg" event={"ID":"be369ab4-a311-452c-a8be-b5bc6aaac03a","Type":"ContainerStarted","Data":"c202aec2fce584557816b169182fdfb359395e3c6558e0ad5779da84d2c5f476"}
Dec 06 06:41:48 crc kubenswrapper[4733]: I1206 06:41:48.415236 4733 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 06 06:41:49 crc kubenswrapper[4733]: I1206 06:41:49.429822 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9svwg" event={"ID":"be369ab4-a311-452c-a8be-b5bc6aaac03a","Type":"ContainerStarted","Data":"532b3895e0d6928c34fa732cee4a17a354d12abb09a97174e67a3a61bb6c30f2"}
Dec 06 06:41:49 crc kubenswrapper[4733]: I1206 06:41:49.432565 4733 generic.go:334] "Generic (PLEG): container finished" podID="294564b7-90cb-4ef0-8bdf-6b125cd76b6c" containerID="e6f71074fc3f4f61c10810c3c4c1d8bcacbd18086dca1a1e12cf29ce7b768574" exitCode=0
Dec 06 06:41:49 crc kubenswrapper[4733]: I1206 06:41:49.432695 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-95c22" event={"ID":"294564b7-90cb-4ef0-8bdf-6b125cd76b6c","Type":"ContainerDied","Data":"e6f71074fc3f4f61c10810c3c4c1d8bcacbd18086dca1a1e12cf29ce7b768574"}
Dec 06 06:41:50 crc kubenswrapper[4733]: I1206 06:41:50.444541 4733 generic.go:334] "Generic (PLEG): container finished" podID="be369ab4-a311-452c-a8be-b5bc6aaac03a" containerID="532b3895e0d6928c34fa732cee4a17a354d12abb09a97174e67a3a61bb6c30f2" exitCode=0
Dec 06 06:41:50 crc kubenswrapper[4733]: I1206 06:41:50.444627 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9svwg" event={"ID":"be369ab4-a311-452c-a8be-b5bc6aaac03a","Type":"ContainerDied","Data":"532b3895e0d6928c34fa732cee4a17a354d12abb09a97174e67a3a61bb6c30f2"}
Dec 06 06:41:50 crc kubenswrapper[4733]: I1206 06:41:50.448153 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-95c22" event={"ID":"294564b7-90cb-4ef0-8bdf-6b125cd76b6c","Type":"ContainerStarted","Data":"965e782160691f6e533c2d9c6d8137c71f97a32b98ddba0445795f569675c628"}
Dec 06 06:41:50 crc kubenswrapper[4733]: I1206 06:41:50.482112 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-95c22" podStartSLOduration=2.793227161 podStartE2EDuration="4.48209523s" podCreationTimestamp="2025-12-06 06:41:46 +0000 UTC" firstStartedPulling="2025-12-06 06:41:48.414948938 +0000 UTC m=+3492.280160050" lastFinishedPulling="2025-12-06 06:41:50.103817008 +0000 UTC m=+3493.969028119" observedRunningTime="2025-12-06 06:41:50.479563259 +0000 UTC m=+3494.344774361" watchObservedRunningTime="2025-12-06 06:41:50.48209523 +0000 UTC m=+3494.347306341"
Dec 06 06:41:51 crc kubenswrapper[4733]: I1206 06:41:51.465500 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9svwg" event={"ID":"be369ab4-a311-452c-a8be-b5bc6aaac03a","Type":"ContainerStarted","Data":"42e2faf7c4950fc7b7d95126e757066c864e5e9cd21b9bc8e02e4406b86c3ce9"}
Dec 06 06:41:51 crc kubenswrapper[4733]: I1206 06:41:51.486901 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9svwg" podStartSLOduration=2.034206086 podStartE2EDuration="4.486879939s" podCreationTimestamp="2025-12-06 06:41:47 +0000 UTC" firstStartedPulling="2025-12-06 06:41:48.416454199 +0000 UTC m=+3492.281665310" lastFinishedPulling="2025-12-06 06:41:50.869128052 +0000 UTC m=+3494.734339163" observedRunningTime="2025-12-06 06:41:51.480713294 +0000 UTC m=+3495.345924405" watchObservedRunningTime="2025-12-06 06:41:51.486879939 +0000 UTC m=+3495.352091049"
Dec 06 06:41:57 crc kubenswrapper[4733]: I1206 06:41:57.325741 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-95c22"
Dec 06 06:41:57 crc kubenswrapper[4733]: I1206 06:41:57.326441 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-95c22"
Dec 06 06:41:57 crc kubenswrapper[4733]: I1206 06:41:57.371835 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-95c22"
Dec 06 06:41:57 crc kubenswrapper[4733]: I1206 06:41:57.527686 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9svwg"
Dec 06 06:41:57 crc kubenswrapper[4733]: I1206 06:41:57.527730 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9svwg"
Dec 06 06:41:57 crc kubenswrapper[4733]: I1206 06:41:57.552936 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-95c22"
Dec 06 06:41:57 crc kubenswrapper[4733]: I1206 06:41:57.575566 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9svwg"
Dec 06 06:41:58 crc kubenswrapper[4733]: I1206 06:41:58.567799 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9svwg"
Dec 06 06:42:02 crc kubenswrapper[4733]: I1206 06:42:02.198536 4733 kubelet.go:2437] "SyncLoop DELETE"
source="api" pods=["openshift-marketplace/redhat-marketplace-95c22"] Dec 06 06:42:02 crc kubenswrapper[4733]: I1206 06:42:02.199456 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-95c22" podUID="294564b7-90cb-4ef0-8bdf-6b125cd76b6c" containerName="registry-server" containerID="cri-o://965e782160691f6e533c2d9c6d8137c71f97a32b98ddba0445795f569675c628" gracePeriod=2 Dec 06 06:42:02 crc kubenswrapper[4733]: I1206 06:42:02.570093 4733 generic.go:334] "Generic (PLEG): container finished" podID="294564b7-90cb-4ef0-8bdf-6b125cd76b6c" containerID="965e782160691f6e533c2d9c6d8137c71f97a32b98ddba0445795f569675c628" exitCode=0 Dec 06 06:42:02 crc kubenswrapper[4733]: I1206 06:42:02.570219 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-95c22" event={"ID":"294564b7-90cb-4ef0-8bdf-6b125cd76b6c","Type":"ContainerDied","Data":"965e782160691f6e533c2d9c6d8137c71f97a32b98ddba0445795f569675c628"} Dec 06 06:42:02 crc kubenswrapper[4733]: I1206 06:42:02.570463 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-95c22" event={"ID":"294564b7-90cb-4ef0-8bdf-6b125cd76b6c","Type":"ContainerDied","Data":"5fdd6dab7b6491bd583854ac7b939e30b8c92e877d551dd0f2823c06ae82f65f"} Dec 06 06:42:02 crc kubenswrapper[4733]: I1206 06:42:02.570485 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5fdd6dab7b6491bd583854ac7b939e30b8c92e877d551dd0f2823c06ae82f65f" Dec 06 06:42:02 crc kubenswrapper[4733]: I1206 06:42:02.599003 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-95c22" Dec 06 06:42:02 crc kubenswrapper[4733]: I1206 06:42:02.704537 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/294564b7-90cb-4ef0-8bdf-6b125cd76b6c-catalog-content\") pod \"294564b7-90cb-4ef0-8bdf-6b125cd76b6c\" (UID: \"294564b7-90cb-4ef0-8bdf-6b125cd76b6c\") " Dec 06 06:42:02 crc kubenswrapper[4733]: I1206 06:42:02.704572 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvtjh\" (UniqueName: \"kubernetes.io/projected/294564b7-90cb-4ef0-8bdf-6b125cd76b6c-kube-api-access-zvtjh\") pod \"294564b7-90cb-4ef0-8bdf-6b125cd76b6c\" (UID: \"294564b7-90cb-4ef0-8bdf-6b125cd76b6c\") " Dec 06 06:42:02 crc kubenswrapper[4733]: I1206 06:42:02.704657 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/294564b7-90cb-4ef0-8bdf-6b125cd76b6c-utilities\") pod \"294564b7-90cb-4ef0-8bdf-6b125cd76b6c\" (UID: \"294564b7-90cb-4ef0-8bdf-6b125cd76b6c\") " Dec 06 06:42:02 crc kubenswrapper[4733]: I1206 06:42:02.706508 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/294564b7-90cb-4ef0-8bdf-6b125cd76b6c-utilities" (OuterVolumeSpecName: "utilities") pod "294564b7-90cb-4ef0-8bdf-6b125cd76b6c" (UID: "294564b7-90cb-4ef0-8bdf-6b125cd76b6c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:42:02 crc kubenswrapper[4733]: I1206 06:42:02.712217 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/294564b7-90cb-4ef0-8bdf-6b125cd76b6c-kube-api-access-zvtjh" (OuterVolumeSpecName: "kube-api-access-zvtjh") pod "294564b7-90cb-4ef0-8bdf-6b125cd76b6c" (UID: "294564b7-90cb-4ef0-8bdf-6b125cd76b6c"). InnerVolumeSpecName "kube-api-access-zvtjh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:42:02 crc kubenswrapper[4733]: I1206 06:42:02.722669 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/294564b7-90cb-4ef0-8bdf-6b125cd76b6c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "294564b7-90cb-4ef0-8bdf-6b125cd76b6c" (UID: "294564b7-90cb-4ef0-8bdf-6b125cd76b6c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:42:02 crc kubenswrapper[4733]: I1206 06:42:02.795555 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9svwg"] Dec 06 06:42:02 crc kubenswrapper[4733]: I1206 06:42:02.795772 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9svwg" podUID="be369ab4-a311-452c-a8be-b5bc6aaac03a" containerName="registry-server" containerID="cri-o://42e2faf7c4950fc7b7d95126e757066c864e5e9cd21b9bc8e02e4406b86c3ce9" gracePeriod=2 Dec 06 06:42:02 crc kubenswrapper[4733]: I1206 06:42:02.808367 4733 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/294564b7-90cb-4ef0-8bdf-6b125cd76b6c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 06:42:02 crc kubenswrapper[4733]: I1206 06:42:02.808396 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvtjh\" (UniqueName: \"kubernetes.io/projected/294564b7-90cb-4ef0-8bdf-6b125cd76b6c-kube-api-access-zvtjh\") on node \"crc\" DevicePath \"\"" Dec 06 06:42:02 crc kubenswrapper[4733]: I1206 06:42:02.808413 4733 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/294564b7-90cb-4ef0-8bdf-6b125cd76b6c-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 06:42:03 crc kubenswrapper[4733]: I1206 06:42:03.127893 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9svwg" Dec 06 06:42:03 crc kubenswrapper[4733]: I1206 06:42:03.317348 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrv2f\" (UniqueName: \"kubernetes.io/projected/be369ab4-a311-452c-a8be-b5bc6aaac03a-kube-api-access-hrv2f\") pod \"be369ab4-a311-452c-a8be-b5bc6aaac03a\" (UID: \"be369ab4-a311-452c-a8be-b5bc6aaac03a\") " Dec 06 06:42:03 crc kubenswrapper[4733]: I1206 06:42:03.317433 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be369ab4-a311-452c-a8be-b5bc6aaac03a-catalog-content\") pod \"be369ab4-a311-452c-a8be-b5bc6aaac03a\" (UID: \"be369ab4-a311-452c-a8be-b5bc6aaac03a\") " Dec 06 06:42:03 crc kubenswrapper[4733]: I1206 06:42:03.317470 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be369ab4-a311-452c-a8be-b5bc6aaac03a-utilities\") pod \"be369ab4-a311-452c-a8be-b5bc6aaac03a\" (UID: \"be369ab4-a311-452c-a8be-b5bc6aaac03a\") " Dec 06 06:42:03 crc kubenswrapper[4733]: I1206 06:42:03.318221 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be369ab4-a311-452c-a8be-b5bc6aaac03a-utilities" (OuterVolumeSpecName: "utilities") pod "be369ab4-a311-452c-a8be-b5bc6aaac03a" (UID: "be369ab4-a311-452c-a8be-b5bc6aaac03a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:42:03 crc kubenswrapper[4733]: I1206 06:42:03.323501 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be369ab4-a311-452c-a8be-b5bc6aaac03a-kube-api-access-hrv2f" (OuterVolumeSpecName: "kube-api-access-hrv2f") pod "be369ab4-a311-452c-a8be-b5bc6aaac03a" (UID: "be369ab4-a311-452c-a8be-b5bc6aaac03a"). InnerVolumeSpecName "kube-api-access-hrv2f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:42:03 crc kubenswrapper[4733]: I1206 06:42:03.367794 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be369ab4-a311-452c-a8be-b5bc6aaac03a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "be369ab4-a311-452c-a8be-b5bc6aaac03a" (UID: "be369ab4-a311-452c-a8be-b5bc6aaac03a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:42:03 crc kubenswrapper[4733]: I1206 06:42:03.420496 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hrv2f\" (UniqueName: \"kubernetes.io/projected/be369ab4-a311-452c-a8be-b5bc6aaac03a-kube-api-access-hrv2f\") on node \"crc\" DevicePath \"\"" Dec 06 06:42:03 crc kubenswrapper[4733]: I1206 06:42:03.420523 4733 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be369ab4-a311-452c-a8be-b5bc6aaac03a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 06:42:03 crc kubenswrapper[4733]: I1206 06:42:03.420535 4733 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be369ab4-a311-452c-a8be-b5bc6aaac03a-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 06:42:03 crc kubenswrapper[4733]: I1206 06:42:03.581875 4733 generic.go:334] "Generic (PLEG): container finished" podID="be369ab4-a311-452c-a8be-b5bc6aaac03a" containerID="42e2faf7c4950fc7b7d95126e757066c864e5e9cd21b9bc8e02e4406b86c3ce9" exitCode=0 Dec 06 06:42:03 crc kubenswrapper[4733]: I1206 06:42:03.581993 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9svwg" Dec 06 06:42:03 crc kubenswrapper[4733]: I1206 06:42:03.582470 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-95c22" Dec 06 06:42:03 crc kubenswrapper[4733]: I1206 06:42:03.582018 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9svwg" event={"ID":"be369ab4-a311-452c-a8be-b5bc6aaac03a","Type":"ContainerDied","Data":"42e2faf7c4950fc7b7d95126e757066c864e5e9cd21b9bc8e02e4406b86c3ce9"} Dec 06 06:42:03 crc kubenswrapper[4733]: I1206 06:42:03.583158 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9svwg" event={"ID":"be369ab4-a311-452c-a8be-b5bc6aaac03a","Type":"ContainerDied","Data":"c202aec2fce584557816b169182fdfb359395e3c6558e0ad5779da84d2c5f476"} Dec 06 06:42:03 crc kubenswrapper[4733]: I1206 06:42:03.583186 4733 scope.go:117] "RemoveContainer" containerID="42e2faf7c4950fc7b7d95126e757066c864e5e9cd21b9bc8e02e4406b86c3ce9" Dec 06 06:42:03 crc kubenswrapper[4733]: I1206 06:42:03.615087 4733 scope.go:117] "RemoveContainer" containerID="532b3895e0d6928c34fa732cee4a17a354d12abb09a97174e67a3a61bb6c30f2" Dec 06 06:42:03 crc kubenswrapper[4733]: I1206 06:42:03.627375 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-95c22"] Dec 06 06:42:03 crc kubenswrapper[4733]: I1206 06:42:03.641462 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-95c22"] Dec 06 06:42:03 crc kubenswrapper[4733]: I1206 06:42:03.647168 4733 scope.go:117] "RemoveContainer" containerID="ae400ac17733b522e891e4ed4f6dce5cf809a7a79fda7dee196c23db4efa801f" Dec 06 06:42:03 crc kubenswrapper[4733]: I1206 06:42:03.653497 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9svwg"] Dec 06 06:42:03 crc kubenswrapper[4733]: I1206 06:42:03.659575 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9svwg"] Dec 06 06:42:03 crc kubenswrapper[4733]: I1206 
06:42:03.672513 4733 scope.go:117] "RemoveContainer" containerID="42e2faf7c4950fc7b7d95126e757066c864e5e9cd21b9bc8e02e4406b86c3ce9" Dec 06 06:42:03 crc kubenswrapper[4733]: E1206 06:42:03.672900 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42e2faf7c4950fc7b7d95126e757066c864e5e9cd21b9bc8e02e4406b86c3ce9\": container with ID starting with 42e2faf7c4950fc7b7d95126e757066c864e5e9cd21b9bc8e02e4406b86c3ce9 not found: ID does not exist" containerID="42e2faf7c4950fc7b7d95126e757066c864e5e9cd21b9bc8e02e4406b86c3ce9" Dec 06 06:42:03 crc kubenswrapper[4733]: I1206 06:42:03.672945 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42e2faf7c4950fc7b7d95126e757066c864e5e9cd21b9bc8e02e4406b86c3ce9"} err="failed to get container status \"42e2faf7c4950fc7b7d95126e757066c864e5e9cd21b9bc8e02e4406b86c3ce9\": rpc error: code = NotFound desc = could not find container \"42e2faf7c4950fc7b7d95126e757066c864e5e9cd21b9bc8e02e4406b86c3ce9\": container with ID starting with 42e2faf7c4950fc7b7d95126e757066c864e5e9cd21b9bc8e02e4406b86c3ce9 not found: ID does not exist" Dec 06 06:42:03 crc kubenswrapper[4733]: I1206 06:42:03.672968 4733 scope.go:117] "RemoveContainer" containerID="532b3895e0d6928c34fa732cee4a17a354d12abb09a97174e67a3a61bb6c30f2" Dec 06 06:42:03 crc kubenswrapper[4733]: E1206 06:42:03.673417 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"532b3895e0d6928c34fa732cee4a17a354d12abb09a97174e67a3a61bb6c30f2\": container with ID starting with 532b3895e0d6928c34fa732cee4a17a354d12abb09a97174e67a3a61bb6c30f2 not found: ID does not exist" containerID="532b3895e0d6928c34fa732cee4a17a354d12abb09a97174e67a3a61bb6c30f2" Dec 06 06:42:03 crc kubenswrapper[4733]: I1206 06:42:03.673439 4733 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"532b3895e0d6928c34fa732cee4a17a354d12abb09a97174e67a3a61bb6c30f2"} err="failed to get container status \"532b3895e0d6928c34fa732cee4a17a354d12abb09a97174e67a3a61bb6c30f2\": rpc error: code = NotFound desc = could not find container \"532b3895e0d6928c34fa732cee4a17a354d12abb09a97174e67a3a61bb6c30f2\": container with ID starting with 532b3895e0d6928c34fa732cee4a17a354d12abb09a97174e67a3a61bb6c30f2 not found: ID does not exist" Dec 06 06:42:03 crc kubenswrapper[4733]: I1206 06:42:03.673453 4733 scope.go:117] "RemoveContainer" containerID="ae400ac17733b522e891e4ed4f6dce5cf809a7a79fda7dee196c23db4efa801f" Dec 06 06:42:03 crc kubenswrapper[4733]: E1206 06:42:03.673718 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae400ac17733b522e891e4ed4f6dce5cf809a7a79fda7dee196c23db4efa801f\": container with ID starting with ae400ac17733b522e891e4ed4f6dce5cf809a7a79fda7dee196c23db4efa801f not found: ID does not exist" containerID="ae400ac17733b522e891e4ed4f6dce5cf809a7a79fda7dee196c23db4efa801f" Dec 06 06:42:03 crc kubenswrapper[4733]: I1206 06:42:03.673741 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae400ac17733b522e891e4ed4f6dce5cf809a7a79fda7dee196c23db4efa801f"} err="failed to get container status \"ae400ac17733b522e891e4ed4f6dce5cf809a7a79fda7dee196c23db4efa801f\": rpc error: code = NotFound desc = could not find container \"ae400ac17733b522e891e4ed4f6dce5cf809a7a79fda7dee196c23db4efa801f\": container with ID starting with ae400ac17733b522e891e4ed4f6dce5cf809a7a79fda7dee196c23db4efa801f not found: ID does not exist" Dec 06 06:42:04 crc kubenswrapper[4733]: I1206 06:42:04.494601 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="294564b7-90cb-4ef0-8bdf-6b125cd76b6c" path="/var/lib/kubelet/pods/294564b7-90cb-4ef0-8bdf-6b125cd76b6c/volumes" Dec 06 06:42:04 crc kubenswrapper[4733]: I1206 
06:42:04.495280 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be369ab4-a311-452c-a8be-b5bc6aaac03a" path="/var/lib/kubelet/pods/be369ab4-a311-452c-a8be-b5bc6aaac03a/volumes" Dec 06 06:42:13 crc kubenswrapper[4733]: I1206 06:42:13.005256 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dhljr"] Dec 06 06:42:13 crc kubenswrapper[4733]: E1206 06:42:13.007131 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be369ab4-a311-452c-a8be-b5bc6aaac03a" containerName="extract-content" Dec 06 06:42:13 crc kubenswrapper[4733]: I1206 06:42:13.007205 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="be369ab4-a311-452c-a8be-b5bc6aaac03a" containerName="extract-content" Dec 06 06:42:13 crc kubenswrapper[4733]: E1206 06:42:13.007271 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="294564b7-90cb-4ef0-8bdf-6b125cd76b6c" containerName="extract-content" Dec 06 06:42:13 crc kubenswrapper[4733]: I1206 06:42:13.007351 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="294564b7-90cb-4ef0-8bdf-6b125cd76b6c" containerName="extract-content" Dec 06 06:42:13 crc kubenswrapper[4733]: E1206 06:42:13.007426 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be369ab4-a311-452c-a8be-b5bc6aaac03a" containerName="extract-utilities" Dec 06 06:42:13 crc kubenswrapper[4733]: I1206 06:42:13.007480 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="be369ab4-a311-452c-a8be-b5bc6aaac03a" containerName="extract-utilities" Dec 06 06:42:13 crc kubenswrapper[4733]: E1206 06:42:13.007535 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="294564b7-90cb-4ef0-8bdf-6b125cd76b6c" containerName="registry-server" Dec 06 06:42:13 crc kubenswrapper[4733]: I1206 06:42:13.007586 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="294564b7-90cb-4ef0-8bdf-6b125cd76b6c" containerName="registry-server" Dec 06 06:42:13 crc 
kubenswrapper[4733]: E1206 06:42:13.007658 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be369ab4-a311-452c-a8be-b5bc6aaac03a" containerName="registry-server" Dec 06 06:42:13 crc kubenswrapper[4733]: I1206 06:42:13.007708 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="be369ab4-a311-452c-a8be-b5bc6aaac03a" containerName="registry-server" Dec 06 06:42:13 crc kubenswrapper[4733]: E1206 06:42:13.007777 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="294564b7-90cb-4ef0-8bdf-6b125cd76b6c" containerName="extract-utilities" Dec 06 06:42:13 crc kubenswrapper[4733]: I1206 06:42:13.007829 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="294564b7-90cb-4ef0-8bdf-6b125cd76b6c" containerName="extract-utilities" Dec 06 06:42:13 crc kubenswrapper[4733]: I1206 06:42:13.008129 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="be369ab4-a311-452c-a8be-b5bc6aaac03a" containerName="registry-server" Dec 06 06:42:13 crc kubenswrapper[4733]: I1206 06:42:13.008206 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="294564b7-90cb-4ef0-8bdf-6b125cd76b6c" containerName="registry-server" Dec 06 06:42:13 crc kubenswrapper[4733]: I1206 06:42:13.009730 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dhljr" Dec 06 06:42:13 crc kubenswrapper[4733]: I1206 06:42:13.013648 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dhljr"] Dec 06 06:42:13 crc kubenswrapper[4733]: I1206 06:42:13.095490 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thcmp\" (UniqueName: \"kubernetes.io/projected/f2bb5172-9651-4373-b0af-6fca9c8678f7-kube-api-access-thcmp\") pod \"redhat-operators-dhljr\" (UID: \"f2bb5172-9651-4373-b0af-6fca9c8678f7\") " pod="openshift-marketplace/redhat-operators-dhljr" Dec 06 06:42:13 crc kubenswrapper[4733]: I1206 06:42:13.095822 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2bb5172-9651-4373-b0af-6fca9c8678f7-utilities\") pod \"redhat-operators-dhljr\" (UID: \"f2bb5172-9651-4373-b0af-6fca9c8678f7\") " pod="openshift-marketplace/redhat-operators-dhljr" Dec 06 06:42:13 crc kubenswrapper[4733]: I1206 06:42:13.095951 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2bb5172-9651-4373-b0af-6fca9c8678f7-catalog-content\") pod \"redhat-operators-dhljr\" (UID: \"f2bb5172-9651-4373-b0af-6fca9c8678f7\") " pod="openshift-marketplace/redhat-operators-dhljr" Dec 06 06:42:13 crc kubenswrapper[4733]: I1206 06:42:13.198123 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2bb5172-9651-4373-b0af-6fca9c8678f7-catalog-content\") pod \"redhat-operators-dhljr\" (UID: \"f2bb5172-9651-4373-b0af-6fca9c8678f7\") " pod="openshift-marketplace/redhat-operators-dhljr" Dec 06 06:42:13 crc kubenswrapper[4733]: I1206 06:42:13.198290 4733 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-thcmp\" (UniqueName: \"kubernetes.io/projected/f2bb5172-9651-4373-b0af-6fca9c8678f7-kube-api-access-thcmp\") pod \"redhat-operators-dhljr\" (UID: \"f2bb5172-9651-4373-b0af-6fca9c8678f7\") " pod="openshift-marketplace/redhat-operators-dhljr" Dec 06 06:42:13 crc kubenswrapper[4733]: I1206 06:42:13.198393 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2bb5172-9651-4373-b0af-6fca9c8678f7-utilities\") pod \"redhat-operators-dhljr\" (UID: \"f2bb5172-9651-4373-b0af-6fca9c8678f7\") " pod="openshift-marketplace/redhat-operators-dhljr" Dec 06 06:42:13 crc kubenswrapper[4733]: I1206 06:42:13.198648 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2bb5172-9651-4373-b0af-6fca9c8678f7-catalog-content\") pod \"redhat-operators-dhljr\" (UID: \"f2bb5172-9651-4373-b0af-6fca9c8678f7\") " pod="openshift-marketplace/redhat-operators-dhljr" Dec 06 06:42:13 crc kubenswrapper[4733]: I1206 06:42:13.198867 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2bb5172-9651-4373-b0af-6fca9c8678f7-utilities\") pod \"redhat-operators-dhljr\" (UID: \"f2bb5172-9651-4373-b0af-6fca9c8678f7\") " pod="openshift-marketplace/redhat-operators-dhljr" Dec 06 06:42:13 crc kubenswrapper[4733]: I1206 06:42:13.231605 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thcmp\" (UniqueName: \"kubernetes.io/projected/f2bb5172-9651-4373-b0af-6fca9c8678f7-kube-api-access-thcmp\") pod \"redhat-operators-dhljr\" (UID: \"f2bb5172-9651-4373-b0af-6fca9c8678f7\") " pod="openshift-marketplace/redhat-operators-dhljr" Dec 06 06:42:13 crc kubenswrapper[4733]: I1206 06:42:13.331337 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dhljr" Dec 06 06:42:13 crc kubenswrapper[4733]: I1206 06:42:13.760267 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dhljr"] Dec 06 06:42:14 crc kubenswrapper[4733]: I1206 06:42:14.684967 4733 generic.go:334] "Generic (PLEG): container finished" podID="f2bb5172-9651-4373-b0af-6fca9c8678f7" containerID="ff7a087b67bb17aed980cb6adf8f85cfa286f6a0598462f169051ca53c782945" exitCode=0 Dec 06 06:42:14 crc kubenswrapper[4733]: I1206 06:42:14.685056 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dhljr" event={"ID":"f2bb5172-9651-4373-b0af-6fca9c8678f7","Type":"ContainerDied","Data":"ff7a087b67bb17aed980cb6adf8f85cfa286f6a0598462f169051ca53c782945"} Dec 06 06:42:14 crc kubenswrapper[4733]: I1206 06:42:14.686488 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dhljr" event={"ID":"f2bb5172-9651-4373-b0af-6fca9c8678f7","Type":"ContainerStarted","Data":"76f382dde3be2e9821294030a2ed7449227e2f215d5ab73b1c3f058b5d19dbee"} Dec 06 06:42:15 crc kubenswrapper[4733]: I1206 06:42:15.699912 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dhljr" event={"ID":"f2bb5172-9651-4373-b0af-6fca9c8678f7","Type":"ContainerStarted","Data":"71beff596e08af431eeed9fd3213cd893959cf560094e208134353b799271478"} Dec 06 06:42:16 crc kubenswrapper[4733]: I1206 06:42:16.711092 4733 generic.go:334] "Generic (PLEG): container finished" podID="f2bb5172-9651-4373-b0af-6fca9c8678f7" containerID="71beff596e08af431eeed9fd3213cd893959cf560094e208134353b799271478" exitCode=0 Dec 06 06:42:16 crc kubenswrapper[4733]: I1206 06:42:16.711177 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dhljr" 
event={"ID":"f2bb5172-9651-4373-b0af-6fca9c8678f7","Type":"ContainerDied","Data":"71beff596e08af431eeed9fd3213cd893959cf560094e208134353b799271478"} Dec 06 06:42:17 crc kubenswrapper[4733]: I1206 06:42:17.722993 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dhljr" event={"ID":"f2bb5172-9651-4373-b0af-6fca9c8678f7","Type":"ContainerStarted","Data":"217f3ce9fcbba6c4086d6215411747a16b7aaad9598e6c142a9005b7328a112c"} Dec 06 06:42:17 crc kubenswrapper[4733]: I1206 06:42:17.742270 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dhljr" podStartSLOduration=3.222654372 podStartE2EDuration="5.742256692s" podCreationTimestamp="2025-12-06 06:42:12 +0000 UTC" firstStartedPulling="2025-12-06 06:42:14.686647051 +0000 UTC m=+3518.551858163" lastFinishedPulling="2025-12-06 06:42:17.206249372 +0000 UTC m=+3521.071460483" observedRunningTime="2025-12-06 06:42:17.739857081 +0000 UTC m=+3521.605068192" watchObservedRunningTime="2025-12-06 06:42:17.742256692 +0000 UTC m=+3521.607467794" Dec 06 06:42:23 crc kubenswrapper[4733]: I1206 06:42:23.331573 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dhljr" Dec 06 06:42:23 crc kubenswrapper[4733]: I1206 06:42:23.332231 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dhljr" Dec 06 06:42:23 crc kubenswrapper[4733]: I1206 06:42:23.375384 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dhljr" Dec 06 06:42:23 crc kubenswrapper[4733]: I1206 06:42:23.817374 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dhljr" Dec 06 06:42:23 crc kubenswrapper[4733]: I1206 06:42:23.870001 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-dhljr"] Dec 06 06:42:24 crc kubenswrapper[4733]: I1206 06:42:24.789051 4733 generic.go:334] "Generic (PLEG): container finished" podID="a7619f33-d848-42f3-8aa0-98d0339f1f1e" containerID="96ef91edcfabee3332b97853ff978a70c62189b5833d5612c41ec5a6e26f6291" exitCode=0 Dec 06 06:42:24 crc kubenswrapper[4733]: I1206 06:42:24.789159 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v97q5/must-gather-wwhz9" event={"ID":"a7619f33-d848-42f3-8aa0-98d0339f1f1e","Type":"ContainerDied","Data":"96ef91edcfabee3332b97853ff978a70c62189b5833d5612c41ec5a6e26f6291"} Dec 06 06:42:24 crc kubenswrapper[4733]: I1206 06:42:24.790078 4733 scope.go:117] "RemoveContainer" containerID="96ef91edcfabee3332b97853ff978a70c62189b5833d5612c41ec5a6e26f6291" Dec 06 06:42:25 crc kubenswrapper[4733]: I1206 06:42:25.531056 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-v97q5_must-gather-wwhz9_a7619f33-d848-42f3-8aa0-98d0339f1f1e/gather/0.log" Dec 06 06:42:25 crc kubenswrapper[4733]: I1206 06:42:25.801546 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dhljr" podUID="f2bb5172-9651-4373-b0af-6fca9c8678f7" containerName="registry-server" containerID="cri-o://217f3ce9fcbba6c4086d6215411747a16b7aaad9598e6c142a9005b7328a112c" gracePeriod=2 Dec 06 06:42:26 crc kubenswrapper[4733]: I1206 06:42:26.228287 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dhljr" Dec 06 06:42:26 crc kubenswrapper[4733]: I1206 06:42:26.367479 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thcmp\" (UniqueName: \"kubernetes.io/projected/f2bb5172-9651-4373-b0af-6fca9c8678f7-kube-api-access-thcmp\") pod \"f2bb5172-9651-4373-b0af-6fca9c8678f7\" (UID: \"f2bb5172-9651-4373-b0af-6fca9c8678f7\") " Dec 06 06:42:26 crc kubenswrapper[4733]: I1206 06:42:26.367689 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2bb5172-9651-4373-b0af-6fca9c8678f7-catalog-content\") pod \"f2bb5172-9651-4373-b0af-6fca9c8678f7\" (UID: \"f2bb5172-9651-4373-b0af-6fca9c8678f7\") " Dec 06 06:42:26 crc kubenswrapper[4733]: I1206 06:42:26.367806 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2bb5172-9651-4373-b0af-6fca9c8678f7-utilities\") pod \"f2bb5172-9651-4373-b0af-6fca9c8678f7\" (UID: \"f2bb5172-9651-4373-b0af-6fca9c8678f7\") " Dec 06 06:42:26 crc kubenswrapper[4733]: I1206 06:42:26.369233 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2bb5172-9651-4373-b0af-6fca9c8678f7-utilities" (OuterVolumeSpecName: "utilities") pod "f2bb5172-9651-4373-b0af-6fca9c8678f7" (UID: "f2bb5172-9651-4373-b0af-6fca9c8678f7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:42:26 crc kubenswrapper[4733]: I1206 06:42:26.380004 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2bb5172-9651-4373-b0af-6fca9c8678f7-kube-api-access-thcmp" (OuterVolumeSpecName: "kube-api-access-thcmp") pod "f2bb5172-9651-4373-b0af-6fca9c8678f7" (UID: "f2bb5172-9651-4373-b0af-6fca9c8678f7"). InnerVolumeSpecName "kube-api-access-thcmp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:42:26 crc kubenswrapper[4733]: I1206 06:42:26.467128 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2bb5172-9651-4373-b0af-6fca9c8678f7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f2bb5172-9651-4373-b0af-6fca9c8678f7" (UID: "f2bb5172-9651-4373-b0af-6fca9c8678f7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:42:26 crc kubenswrapper[4733]: I1206 06:42:26.471074 4733 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2bb5172-9651-4373-b0af-6fca9c8678f7-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 06:42:26 crc kubenswrapper[4733]: I1206 06:42:26.471129 4733 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2bb5172-9651-4373-b0af-6fca9c8678f7-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 06:42:26 crc kubenswrapper[4733]: I1206 06:42:26.471142 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thcmp\" (UniqueName: \"kubernetes.io/projected/f2bb5172-9651-4373-b0af-6fca9c8678f7-kube-api-access-thcmp\") on node \"crc\" DevicePath \"\"" Dec 06 06:42:26 crc kubenswrapper[4733]: I1206 06:42:26.814730 4733 generic.go:334] "Generic (PLEG): container finished" podID="f2bb5172-9651-4373-b0af-6fca9c8678f7" containerID="217f3ce9fcbba6c4086d6215411747a16b7aaad9598e6c142a9005b7328a112c" exitCode=0 Dec 06 06:42:26 crc kubenswrapper[4733]: I1206 06:42:26.814784 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dhljr" event={"ID":"f2bb5172-9651-4373-b0af-6fca9c8678f7","Type":"ContainerDied","Data":"217f3ce9fcbba6c4086d6215411747a16b7aaad9598e6c142a9005b7328a112c"} Dec 06 06:42:26 crc kubenswrapper[4733]: I1206 06:42:26.814844 4733 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-dhljr" event={"ID":"f2bb5172-9651-4373-b0af-6fca9c8678f7","Type":"ContainerDied","Data":"76f382dde3be2e9821294030a2ed7449227e2f215d5ab73b1c3f058b5d19dbee"} Dec 06 06:42:26 crc kubenswrapper[4733]: I1206 06:42:26.814865 4733 scope.go:117] "RemoveContainer" containerID="217f3ce9fcbba6c4086d6215411747a16b7aaad9598e6c142a9005b7328a112c" Dec 06 06:42:26 crc kubenswrapper[4733]: I1206 06:42:26.814919 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dhljr" Dec 06 06:42:26 crc kubenswrapper[4733]: I1206 06:42:26.844549 4733 scope.go:117] "RemoveContainer" containerID="71beff596e08af431eeed9fd3213cd893959cf560094e208134353b799271478" Dec 06 06:42:26 crc kubenswrapper[4733]: I1206 06:42:26.849163 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dhljr"] Dec 06 06:42:26 crc kubenswrapper[4733]: I1206 06:42:26.856240 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dhljr"] Dec 06 06:42:26 crc kubenswrapper[4733]: I1206 06:42:26.864959 4733 scope.go:117] "RemoveContainer" containerID="ff7a087b67bb17aed980cb6adf8f85cfa286f6a0598462f169051ca53c782945" Dec 06 06:42:26 crc kubenswrapper[4733]: I1206 06:42:26.893343 4733 scope.go:117] "RemoveContainer" containerID="217f3ce9fcbba6c4086d6215411747a16b7aaad9598e6c142a9005b7328a112c" Dec 06 06:42:26 crc kubenswrapper[4733]: E1206 06:42:26.893651 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"217f3ce9fcbba6c4086d6215411747a16b7aaad9598e6c142a9005b7328a112c\": container with ID starting with 217f3ce9fcbba6c4086d6215411747a16b7aaad9598e6c142a9005b7328a112c not found: ID does not exist" containerID="217f3ce9fcbba6c4086d6215411747a16b7aaad9598e6c142a9005b7328a112c" Dec 06 06:42:26 crc kubenswrapper[4733]: I1206 06:42:26.893697 4733 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"217f3ce9fcbba6c4086d6215411747a16b7aaad9598e6c142a9005b7328a112c"} err="failed to get container status \"217f3ce9fcbba6c4086d6215411747a16b7aaad9598e6c142a9005b7328a112c\": rpc error: code = NotFound desc = could not find container \"217f3ce9fcbba6c4086d6215411747a16b7aaad9598e6c142a9005b7328a112c\": container with ID starting with 217f3ce9fcbba6c4086d6215411747a16b7aaad9598e6c142a9005b7328a112c not found: ID does not exist" Dec 06 06:42:26 crc kubenswrapper[4733]: I1206 06:42:26.893727 4733 scope.go:117] "RemoveContainer" containerID="71beff596e08af431eeed9fd3213cd893959cf560094e208134353b799271478" Dec 06 06:42:26 crc kubenswrapper[4733]: E1206 06:42:26.894097 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71beff596e08af431eeed9fd3213cd893959cf560094e208134353b799271478\": container with ID starting with 71beff596e08af431eeed9fd3213cd893959cf560094e208134353b799271478 not found: ID does not exist" containerID="71beff596e08af431eeed9fd3213cd893959cf560094e208134353b799271478" Dec 06 06:42:26 crc kubenswrapper[4733]: I1206 06:42:26.894130 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71beff596e08af431eeed9fd3213cd893959cf560094e208134353b799271478"} err="failed to get container status \"71beff596e08af431eeed9fd3213cd893959cf560094e208134353b799271478\": rpc error: code = NotFound desc = could not find container \"71beff596e08af431eeed9fd3213cd893959cf560094e208134353b799271478\": container with ID starting with 71beff596e08af431eeed9fd3213cd893959cf560094e208134353b799271478 not found: ID does not exist" Dec 06 06:42:26 crc kubenswrapper[4733]: I1206 06:42:26.894150 4733 scope.go:117] "RemoveContainer" containerID="ff7a087b67bb17aed980cb6adf8f85cfa286f6a0598462f169051ca53c782945" Dec 06 06:42:26 crc kubenswrapper[4733]: E1206 
06:42:26.894488 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff7a087b67bb17aed980cb6adf8f85cfa286f6a0598462f169051ca53c782945\": container with ID starting with ff7a087b67bb17aed980cb6adf8f85cfa286f6a0598462f169051ca53c782945 not found: ID does not exist" containerID="ff7a087b67bb17aed980cb6adf8f85cfa286f6a0598462f169051ca53c782945" Dec 06 06:42:26 crc kubenswrapper[4733]: I1206 06:42:26.894629 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff7a087b67bb17aed980cb6adf8f85cfa286f6a0598462f169051ca53c782945"} err="failed to get container status \"ff7a087b67bb17aed980cb6adf8f85cfa286f6a0598462f169051ca53c782945\": rpc error: code = NotFound desc = could not find container \"ff7a087b67bb17aed980cb6adf8f85cfa286f6a0598462f169051ca53c782945\": container with ID starting with ff7a087b67bb17aed980cb6adf8f85cfa286f6a0598462f169051ca53c782945 not found: ID does not exist" Dec 06 06:42:28 crc kubenswrapper[4733]: I1206 06:42:28.500142 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2bb5172-9651-4373-b0af-6fca9c8678f7" path="/var/lib/kubelet/pods/f2bb5172-9651-4373-b0af-6fca9c8678f7/volumes" Dec 06 06:42:32 crc kubenswrapper[4733]: I1206 06:42:32.645032 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-v97q5/must-gather-wwhz9"] Dec 06 06:42:32 crc kubenswrapper[4733]: I1206 06:42:32.645875 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-v97q5/must-gather-wwhz9" podUID="a7619f33-d848-42f3-8aa0-98d0339f1f1e" containerName="copy" containerID="cri-o://5ba8e5e1473b496c58e5a42c14ec97058a1b61f4b2bee60baebb6bd038d68f2d" gracePeriod=2 Dec 06 06:42:32 crc kubenswrapper[4733]: I1206 06:42:32.651809 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-v97q5/must-gather-wwhz9"] Dec 06 06:42:32 crc 
kubenswrapper[4733]: I1206 06:42:32.891905 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-v97q5_must-gather-wwhz9_a7619f33-d848-42f3-8aa0-98d0339f1f1e/copy/0.log" Dec 06 06:42:32 crc kubenswrapper[4733]: I1206 06:42:32.892725 4733 generic.go:334] "Generic (PLEG): container finished" podID="a7619f33-d848-42f3-8aa0-98d0339f1f1e" containerID="5ba8e5e1473b496c58e5a42c14ec97058a1b61f4b2bee60baebb6bd038d68f2d" exitCode=143 Dec 06 06:42:33 crc kubenswrapper[4733]: I1206 06:42:33.053779 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-v97q5_must-gather-wwhz9_a7619f33-d848-42f3-8aa0-98d0339f1f1e/copy/0.log" Dec 06 06:42:33 crc kubenswrapper[4733]: I1206 06:42:33.054277 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-v97q5/must-gather-wwhz9" Dec 06 06:42:33 crc kubenswrapper[4733]: I1206 06:42:33.204612 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a7619f33-d848-42f3-8aa0-98d0339f1f1e-must-gather-output\") pod \"a7619f33-d848-42f3-8aa0-98d0339f1f1e\" (UID: \"a7619f33-d848-42f3-8aa0-98d0339f1f1e\") " Dec 06 06:42:33 crc kubenswrapper[4733]: I1206 06:42:33.204859 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpjkd\" (UniqueName: \"kubernetes.io/projected/a7619f33-d848-42f3-8aa0-98d0339f1f1e-kube-api-access-xpjkd\") pod \"a7619f33-d848-42f3-8aa0-98d0339f1f1e\" (UID: \"a7619f33-d848-42f3-8aa0-98d0339f1f1e\") " Dec 06 06:42:33 crc kubenswrapper[4733]: I1206 06:42:33.211182 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7619f33-d848-42f3-8aa0-98d0339f1f1e-kube-api-access-xpjkd" (OuterVolumeSpecName: "kube-api-access-xpjkd") pod "a7619f33-d848-42f3-8aa0-98d0339f1f1e" (UID: "a7619f33-d848-42f3-8aa0-98d0339f1f1e"). 
InnerVolumeSpecName "kube-api-access-xpjkd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:42:33 crc kubenswrapper[4733]: I1206 06:42:33.308205 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xpjkd\" (UniqueName: \"kubernetes.io/projected/a7619f33-d848-42f3-8aa0-98d0339f1f1e-kube-api-access-xpjkd\") on node \"crc\" DevicePath \"\"" Dec 06 06:42:33 crc kubenswrapper[4733]: I1206 06:42:33.334897 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7619f33-d848-42f3-8aa0-98d0339f1f1e-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "a7619f33-d848-42f3-8aa0-98d0339f1f1e" (UID: "a7619f33-d848-42f3-8aa0-98d0339f1f1e"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:42:33 crc kubenswrapper[4733]: I1206 06:42:33.410101 4733 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a7619f33-d848-42f3-8aa0-98d0339f1f1e-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 06 06:42:33 crc kubenswrapper[4733]: I1206 06:42:33.905569 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-v97q5_must-gather-wwhz9_a7619f33-d848-42f3-8aa0-98d0339f1f1e/copy/0.log" Dec 06 06:42:33 crc kubenswrapper[4733]: I1206 06:42:33.907230 4733 scope.go:117] "RemoveContainer" containerID="5ba8e5e1473b496c58e5a42c14ec97058a1b61f4b2bee60baebb6bd038d68f2d" Dec 06 06:42:33 crc kubenswrapper[4733]: I1206 06:42:33.907326 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-v97q5/must-gather-wwhz9" Dec 06 06:42:33 crc kubenswrapper[4733]: I1206 06:42:33.929404 4733 scope.go:117] "RemoveContainer" containerID="96ef91edcfabee3332b97853ff978a70c62189b5833d5612c41ec5a6e26f6291" Dec 06 06:42:34 crc kubenswrapper[4733]: I1206 06:42:34.495699 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7619f33-d848-42f3-8aa0-98d0339f1f1e" path="/var/lib/kubelet/pods/a7619f33-d848-42f3-8aa0-98d0339f1f1e/volumes" Dec 06 06:43:12 crc kubenswrapper[4733]: I1206 06:43:12.989169 4733 patch_prober.go:28] interesting pod/machine-config-daemon-g7qjx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 06:43:12 crc kubenswrapper[4733]: I1206 06:43:12.989831 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 06:43:42 crc kubenswrapper[4733]: I1206 06:43:42.989294 4733 patch_prober.go:28] interesting pod/machine-config-daemon-g7qjx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 06:43:42 crc kubenswrapper[4733]: I1206 06:43:42.990254 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
Dec 06 06:44:12 crc kubenswrapper[4733]: I1206 06:44:12.989088 4733 patch_prober.go:28] interesting pod/machine-config-daemon-g7qjx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 06:44:12 crc kubenswrapper[4733]: I1206 06:44:12.989741 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 06:44:12 crc kubenswrapper[4733]: I1206 06:44:12.989789 4733 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" Dec 06 06:44:12 crc kubenswrapper[4733]: I1206 06:44:12.990361 4733 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"43f30aa5837a5bee8e7ecac27f1d786b0dee6b7686b928f4dd37a1272c540bbf"} pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 06:44:12 crc kubenswrapper[4733]: I1206 06:44:12.990411 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerName="machine-config-daemon" containerID="cri-o://43f30aa5837a5bee8e7ecac27f1d786b0dee6b7686b928f4dd37a1272c540bbf" gracePeriod=600 Dec 06 06:44:13 crc kubenswrapper[4733]: I1206 06:44:13.809256 4733 generic.go:334] "Generic (PLEG): container finished" podID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" 
containerID="43f30aa5837a5bee8e7ecac27f1d786b0dee6b7686b928f4dd37a1272c540bbf" exitCode=0 Dec 06 06:44:13 crc kubenswrapper[4733]: I1206 06:44:13.809359 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" event={"ID":"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448","Type":"ContainerDied","Data":"43f30aa5837a5bee8e7ecac27f1d786b0dee6b7686b928f4dd37a1272c540bbf"} Dec 06 06:44:13 crc kubenswrapper[4733]: I1206 06:44:13.809984 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" event={"ID":"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448","Type":"ContainerStarted","Data":"f47cb59a321dfcb0efc8c6909ad3380200d5a7a5ab1d87ad8d40fc651e322820"} Dec 06 06:44:13 crc kubenswrapper[4733]: I1206 06:44:13.810029 4733 scope.go:117] "RemoveContainer" containerID="0e2824d80f32f689599ffbc13b3712ce74df00c0a5b8b8663ecaa737779a27f1" Dec 06 06:44:35 crc kubenswrapper[4733]: I1206 06:44:35.065429 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7vm9c/must-gather-77jfp"] Dec 06 06:44:35 crc kubenswrapper[4733]: E1206 06:44:35.066341 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7619f33-d848-42f3-8aa0-98d0339f1f1e" containerName="gather" Dec 06 06:44:35 crc kubenswrapper[4733]: I1206 06:44:35.066355 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7619f33-d848-42f3-8aa0-98d0339f1f1e" containerName="gather" Dec 06 06:44:35 crc kubenswrapper[4733]: E1206 06:44:35.066375 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2bb5172-9651-4373-b0af-6fca9c8678f7" containerName="extract-utilities" Dec 06 06:44:35 crc kubenswrapper[4733]: I1206 06:44:35.066381 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2bb5172-9651-4373-b0af-6fca9c8678f7" containerName="extract-utilities" Dec 06 06:44:35 crc kubenswrapper[4733]: E1206 06:44:35.066392 4733 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="f2bb5172-9651-4373-b0af-6fca9c8678f7" containerName="extract-content" Dec 06 06:44:35 crc kubenswrapper[4733]: I1206 06:44:35.066399 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2bb5172-9651-4373-b0af-6fca9c8678f7" containerName="extract-content" Dec 06 06:44:35 crc kubenswrapper[4733]: E1206 06:44:35.066419 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2bb5172-9651-4373-b0af-6fca9c8678f7" containerName="registry-server" Dec 06 06:44:35 crc kubenswrapper[4733]: I1206 06:44:35.066426 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2bb5172-9651-4373-b0af-6fca9c8678f7" containerName="registry-server" Dec 06 06:44:35 crc kubenswrapper[4733]: E1206 06:44:35.066437 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7619f33-d848-42f3-8aa0-98d0339f1f1e" containerName="copy" Dec 06 06:44:35 crc kubenswrapper[4733]: I1206 06:44:35.066442 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7619f33-d848-42f3-8aa0-98d0339f1f1e" containerName="copy" Dec 06 06:44:35 crc kubenswrapper[4733]: I1206 06:44:35.066606 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7619f33-d848-42f3-8aa0-98d0339f1f1e" containerName="copy" Dec 06 06:44:35 crc kubenswrapper[4733]: I1206 06:44:35.066617 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7619f33-d848-42f3-8aa0-98d0339f1f1e" containerName="gather" Dec 06 06:44:35 crc kubenswrapper[4733]: I1206 06:44:35.066623 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2bb5172-9651-4373-b0af-6fca9c8678f7" containerName="registry-server" Dec 06 06:44:35 crc kubenswrapper[4733]: I1206 06:44:35.067559 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7vm9c/must-gather-77jfp" Dec 06 06:44:35 crc kubenswrapper[4733]: I1206 06:44:35.070700 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-7vm9c"/"openshift-service-ca.crt" Dec 06 06:44:35 crc kubenswrapper[4733]: I1206 06:44:35.070701 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-7vm9c"/"kube-root-ca.crt" Dec 06 06:44:35 crc kubenswrapper[4733]: I1206 06:44:35.091577 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7vm9c/must-gather-77jfp"] Dec 06 06:44:35 crc kubenswrapper[4733]: I1206 06:44:35.116577 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2kgv\" (UniqueName: \"kubernetes.io/projected/c91fb140-5b4b-4e6a-bc61-a5d5bdbe0317-kube-api-access-x2kgv\") pod \"must-gather-77jfp\" (UID: \"c91fb140-5b4b-4e6a-bc61-a5d5bdbe0317\") " pod="openshift-must-gather-7vm9c/must-gather-77jfp" Dec 06 06:44:35 crc kubenswrapper[4733]: I1206 06:44:35.116885 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c91fb140-5b4b-4e6a-bc61-a5d5bdbe0317-must-gather-output\") pod \"must-gather-77jfp\" (UID: \"c91fb140-5b4b-4e6a-bc61-a5d5bdbe0317\") " pod="openshift-must-gather-7vm9c/must-gather-77jfp" Dec 06 06:44:35 crc kubenswrapper[4733]: I1206 06:44:35.218771 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2kgv\" (UniqueName: \"kubernetes.io/projected/c91fb140-5b4b-4e6a-bc61-a5d5bdbe0317-kube-api-access-x2kgv\") pod \"must-gather-77jfp\" (UID: \"c91fb140-5b4b-4e6a-bc61-a5d5bdbe0317\") " pod="openshift-must-gather-7vm9c/must-gather-77jfp" Dec 06 06:44:35 crc kubenswrapper[4733]: I1206 06:44:35.219167 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c91fb140-5b4b-4e6a-bc61-a5d5bdbe0317-must-gather-output\") pod \"must-gather-77jfp\" (UID: \"c91fb140-5b4b-4e6a-bc61-a5d5bdbe0317\") " pod="openshift-must-gather-7vm9c/must-gather-77jfp" Dec 06 06:44:35 crc kubenswrapper[4733]: I1206 06:44:35.219593 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c91fb140-5b4b-4e6a-bc61-a5d5bdbe0317-must-gather-output\") pod \"must-gather-77jfp\" (UID: \"c91fb140-5b4b-4e6a-bc61-a5d5bdbe0317\") " pod="openshift-must-gather-7vm9c/must-gather-77jfp" Dec 06 06:44:35 crc kubenswrapper[4733]: I1206 06:44:35.234587 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2kgv\" (UniqueName: \"kubernetes.io/projected/c91fb140-5b4b-4e6a-bc61-a5d5bdbe0317-kube-api-access-x2kgv\") pod \"must-gather-77jfp\" (UID: \"c91fb140-5b4b-4e6a-bc61-a5d5bdbe0317\") " pod="openshift-must-gather-7vm9c/must-gather-77jfp" Dec 06 06:44:35 crc kubenswrapper[4733]: I1206 06:44:35.384185 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7vm9c/must-gather-77jfp" Dec 06 06:44:35 crc kubenswrapper[4733]: I1206 06:44:35.810853 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7vm9c/must-gather-77jfp"] Dec 06 06:44:36 crc kubenswrapper[4733]: I1206 06:44:36.008708 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7vm9c/must-gather-77jfp" event={"ID":"c91fb140-5b4b-4e6a-bc61-a5d5bdbe0317","Type":"ContainerStarted","Data":"8d8ba465137d1b5e949a867a3a2766822d3cfed989a73c185850ad3a3e36d279"} Dec 06 06:44:37 crc kubenswrapper[4733]: I1206 06:44:37.019358 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7vm9c/must-gather-77jfp" event={"ID":"c91fb140-5b4b-4e6a-bc61-a5d5bdbe0317","Type":"ContainerStarted","Data":"1d18abb4f8f8894105554956d8c5d00344fb7a4f424c255f53d76e43184bd33b"} Dec 06 06:44:37 crc kubenswrapper[4733]: I1206 06:44:37.019735 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7vm9c/must-gather-77jfp" event={"ID":"c91fb140-5b4b-4e6a-bc61-a5d5bdbe0317","Type":"ContainerStarted","Data":"e4f8730860ebf8435f05aa060d12bd314597650a36a34ec48eb17c6769d4345a"} Dec 06 06:44:37 crc kubenswrapper[4733]: I1206 06:44:37.038221 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7vm9c/must-gather-77jfp" podStartSLOduration=2.038203388 podStartE2EDuration="2.038203388s" podCreationTimestamp="2025-12-06 06:44:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:44:37.031337679 +0000 UTC m=+3660.896548791" watchObservedRunningTime="2025-12-06 06:44:37.038203388 +0000 UTC m=+3660.903414499" Dec 06 06:44:38 crc kubenswrapper[4733]: E1206 06:44:38.364911 4733 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 192.168.25.211:42832->192.168.25.211:33957: read tcp 
192.168.25.211:42832->192.168.25.211:33957: read: connection reset by peer Dec 06 06:44:39 crc kubenswrapper[4733]: I1206 06:44:39.189401 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7vm9c/crc-debug-hx668"] Dec 06 06:44:39 crc kubenswrapper[4733]: I1206 06:44:39.190715 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7vm9c/crc-debug-hx668" Dec 06 06:44:39 crc kubenswrapper[4733]: I1206 06:44:39.192536 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-7vm9c"/"default-dockercfg-nrbkr" Dec 06 06:44:39 crc kubenswrapper[4733]: I1206 06:44:39.195049 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rn24d\" (UniqueName: \"kubernetes.io/projected/fc35e0b4-67d1-4ee5-a8a5-9f5115bda407-kube-api-access-rn24d\") pod \"crc-debug-hx668\" (UID: \"fc35e0b4-67d1-4ee5-a8a5-9f5115bda407\") " pod="openshift-must-gather-7vm9c/crc-debug-hx668" Dec 06 06:44:39 crc kubenswrapper[4733]: I1206 06:44:39.195235 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fc35e0b4-67d1-4ee5-a8a5-9f5115bda407-host\") pod \"crc-debug-hx668\" (UID: \"fc35e0b4-67d1-4ee5-a8a5-9f5115bda407\") " pod="openshift-must-gather-7vm9c/crc-debug-hx668" Dec 06 06:44:39 crc kubenswrapper[4733]: I1206 06:44:39.297508 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rn24d\" (UniqueName: \"kubernetes.io/projected/fc35e0b4-67d1-4ee5-a8a5-9f5115bda407-kube-api-access-rn24d\") pod \"crc-debug-hx668\" (UID: \"fc35e0b4-67d1-4ee5-a8a5-9f5115bda407\") " pod="openshift-must-gather-7vm9c/crc-debug-hx668" Dec 06 06:44:39 crc kubenswrapper[4733]: I1206 06:44:39.297607 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/fc35e0b4-67d1-4ee5-a8a5-9f5115bda407-host\") pod \"crc-debug-hx668\" (UID: \"fc35e0b4-67d1-4ee5-a8a5-9f5115bda407\") " pod="openshift-must-gather-7vm9c/crc-debug-hx668" Dec 06 06:44:39 crc kubenswrapper[4733]: I1206 06:44:39.297757 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fc35e0b4-67d1-4ee5-a8a5-9f5115bda407-host\") pod \"crc-debug-hx668\" (UID: \"fc35e0b4-67d1-4ee5-a8a5-9f5115bda407\") " pod="openshift-must-gather-7vm9c/crc-debug-hx668" Dec 06 06:44:39 crc kubenswrapper[4733]: I1206 06:44:39.314413 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rn24d\" (UniqueName: \"kubernetes.io/projected/fc35e0b4-67d1-4ee5-a8a5-9f5115bda407-kube-api-access-rn24d\") pod \"crc-debug-hx668\" (UID: \"fc35e0b4-67d1-4ee5-a8a5-9f5115bda407\") " pod="openshift-must-gather-7vm9c/crc-debug-hx668" Dec 06 06:44:39 crc kubenswrapper[4733]: I1206 06:44:39.506471 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7vm9c/crc-debug-hx668" Dec 06 06:44:39 crc kubenswrapper[4733]: W1206 06:44:39.547018 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc35e0b4_67d1_4ee5_a8a5_9f5115bda407.slice/crio-479b1bf17094868e451f4598d51d62ff5a5ffe38a44ced30feb1782e4f625872 WatchSource:0}: Error finding container 479b1bf17094868e451f4598d51d62ff5a5ffe38a44ced30feb1782e4f625872: Status 404 returned error can't find the container with id 479b1bf17094868e451f4598d51d62ff5a5ffe38a44ced30feb1782e4f625872 Dec 06 06:44:40 crc kubenswrapper[4733]: I1206 06:44:40.046975 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7vm9c/crc-debug-hx668" event={"ID":"fc35e0b4-67d1-4ee5-a8a5-9f5115bda407","Type":"ContainerStarted","Data":"e8f0db44f9d2c5f9280df390dbbb8099ceea55198c5017db45a54e8ccedc0d11"} Dec 06 06:44:40 crc kubenswrapper[4733]: I1206 06:44:40.047648 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7vm9c/crc-debug-hx668" event={"ID":"fc35e0b4-67d1-4ee5-a8a5-9f5115bda407","Type":"ContainerStarted","Data":"479b1bf17094868e451f4598d51d62ff5a5ffe38a44ced30feb1782e4f625872"} Dec 06 06:44:40 crc kubenswrapper[4733]: I1206 06:44:40.068694 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7vm9c/crc-debug-hx668" podStartSLOduration=1.068676068 podStartE2EDuration="1.068676068s" podCreationTimestamp="2025-12-06 06:44:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:44:40.060410557 +0000 UTC m=+3663.925621668" watchObservedRunningTime="2025-12-06 06:44:40.068676068 +0000 UTC m=+3663.933887179" Dec 06 06:44:49 crc kubenswrapper[4733]: I1206 06:44:49.125421 4733 generic.go:334] "Generic (PLEG): container finished" podID="fc35e0b4-67d1-4ee5-a8a5-9f5115bda407" 
containerID="e8f0db44f9d2c5f9280df390dbbb8099ceea55198c5017db45a54e8ccedc0d11" exitCode=0 Dec 06 06:44:49 crc kubenswrapper[4733]: I1206 06:44:49.125532 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7vm9c/crc-debug-hx668" event={"ID":"fc35e0b4-67d1-4ee5-a8a5-9f5115bda407","Type":"ContainerDied","Data":"e8f0db44f9d2c5f9280df390dbbb8099ceea55198c5017db45a54e8ccedc0d11"} Dec 06 06:44:50 crc kubenswrapper[4733]: I1206 06:44:50.225830 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7vm9c/crc-debug-hx668" Dec 06 06:44:50 crc kubenswrapper[4733]: I1206 06:44:50.254548 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7vm9c/crc-debug-hx668"] Dec 06 06:44:50 crc kubenswrapper[4733]: I1206 06:44:50.261047 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7vm9c/crc-debug-hx668"] Dec 06 06:44:50 crc kubenswrapper[4733]: I1206 06:44:50.422319 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fc35e0b4-67d1-4ee5-a8a5-9f5115bda407-host\") pod \"fc35e0b4-67d1-4ee5-a8a5-9f5115bda407\" (UID: \"fc35e0b4-67d1-4ee5-a8a5-9f5115bda407\") " Dec 06 06:44:50 crc kubenswrapper[4733]: I1206 06:44:50.422459 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc35e0b4-67d1-4ee5-a8a5-9f5115bda407-host" (OuterVolumeSpecName: "host") pod "fc35e0b4-67d1-4ee5-a8a5-9f5115bda407" (UID: "fc35e0b4-67d1-4ee5-a8a5-9f5115bda407"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 06:44:50 crc kubenswrapper[4733]: I1206 06:44:50.422613 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rn24d\" (UniqueName: \"kubernetes.io/projected/fc35e0b4-67d1-4ee5-a8a5-9f5115bda407-kube-api-access-rn24d\") pod \"fc35e0b4-67d1-4ee5-a8a5-9f5115bda407\" (UID: \"fc35e0b4-67d1-4ee5-a8a5-9f5115bda407\") " Dec 06 06:44:50 crc kubenswrapper[4733]: I1206 06:44:50.423527 4733 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fc35e0b4-67d1-4ee5-a8a5-9f5115bda407-host\") on node \"crc\" DevicePath \"\"" Dec 06 06:44:50 crc kubenswrapper[4733]: I1206 06:44:50.429520 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc35e0b4-67d1-4ee5-a8a5-9f5115bda407-kube-api-access-rn24d" (OuterVolumeSpecName: "kube-api-access-rn24d") pod "fc35e0b4-67d1-4ee5-a8a5-9f5115bda407" (UID: "fc35e0b4-67d1-4ee5-a8a5-9f5115bda407"). InnerVolumeSpecName "kube-api-access-rn24d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:44:50 crc kubenswrapper[4733]: I1206 06:44:50.492999 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc35e0b4-67d1-4ee5-a8a5-9f5115bda407" path="/var/lib/kubelet/pods/fc35e0b4-67d1-4ee5-a8a5-9f5115bda407/volumes" Dec 06 06:44:50 crc kubenswrapper[4733]: I1206 06:44:50.525200 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rn24d\" (UniqueName: \"kubernetes.io/projected/fc35e0b4-67d1-4ee5-a8a5-9f5115bda407-kube-api-access-rn24d\") on node \"crc\" DevicePath \"\"" Dec 06 06:44:51 crc kubenswrapper[4733]: I1206 06:44:51.144352 4733 scope.go:117] "RemoveContainer" containerID="e8f0db44f9d2c5f9280df390dbbb8099ceea55198c5017db45a54e8ccedc0d11" Dec 06 06:44:51 crc kubenswrapper[4733]: I1206 06:44:51.144451 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7vm9c/crc-debug-hx668" Dec 06 06:44:51 crc kubenswrapper[4733]: I1206 06:44:51.427608 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7vm9c/crc-debug-vlhbb"] Dec 06 06:44:51 crc kubenswrapper[4733]: E1206 06:44:51.428048 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc35e0b4-67d1-4ee5-a8a5-9f5115bda407" containerName="container-00" Dec 06 06:44:51 crc kubenswrapper[4733]: I1206 06:44:51.428062 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc35e0b4-67d1-4ee5-a8a5-9f5115bda407" containerName="container-00" Dec 06 06:44:51 crc kubenswrapper[4733]: I1206 06:44:51.428264 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc35e0b4-67d1-4ee5-a8a5-9f5115bda407" containerName="container-00" Dec 06 06:44:51 crc kubenswrapper[4733]: I1206 06:44:51.428865 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7vm9c/crc-debug-vlhbb" Dec 06 06:44:51 crc kubenswrapper[4733]: I1206 06:44:51.437776 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-7vm9c"/"default-dockercfg-nrbkr" Dec 06 06:44:51 crc kubenswrapper[4733]: I1206 06:44:51.444179 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6eb98a2b-b14d-4d67-9ee2-e85e504431d2-host\") pod \"crc-debug-vlhbb\" (UID: \"6eb98a2b-b14d-4d67-9ee2-e85e504431d2\") " pod="openshift-must-gather-7vm9c/crc-debug-vlhbb" Dec 06 06:44:51 crc kubenswrapper[4733]: I1206 06:44:51.444316 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkv24\" (UniqueName: \"kubernetes.io/projected/6eb98a2b-b14d-4d67-9ee2-e85e504431d2-kube-api-access-nkv24\") pod \"crc-debug-vlhbb\" (UID: \"6eb98a2b-b14d-4d67-9ee2-e85e504431d2\") " 
pod="openshift-must-gather-7vm9c/crc-debug-vlhbb" Dec 06 06:44:51 crc kubenswrapper[4733]: I1206 06:44:51.546766 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6eb98a2b-b14d-4d67-9ee2-e85e504431d2-host\") pod \"crc-debug-vlhbb\" (UID: \"6eb98a2b-b14d-4d67-9ee2-e85e504431d2\") " pod="openshift-must-gather-7vm9c/crc-debug-vlhbb" Dec 06 06:44:51 crc kubenswrapper[4733]: I1206 06:44:51.546627 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6eb98a2b-b14d-4d67-9ee2-e85e504431d2-host\") pod \"crc-debug-vlhbb\" (UID: \"6eb98a2b-b14d-4d67-9ee2-e85e504431d2\") " pod="openshift-must-gather-7vm9c/crc-debug-vlhbb" Dec 06 06:44:51 crc kubenswrapper[4733]: I1206 06:44:51.546974 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkv24\" (UniqueName: \"kubernetes.io/projected/6eb98a2b-b14d-4d67-9ee2-e85e504431d2-kube-api-access-nkv24\") pod \"crc-debug-vlhbb\" (UID: \"6eb98a2b-b14d-4d67-9ee2-e85e504431d2\") " pod="openshift-must-gather-7vm9c/crc-debug-vlhbb" Dec 06 06:44:51 crc kubenswrapper[4733]: I1206 06:44:51.562066 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkv24\" (UniqueName: \"kubernetes.io/projected/6eb98a2b-b14d-4d67-9ee2-e85e504431d2-kube-api-access-nkv24\") pod \"crc-debug-vlhbb\" (UID: \"6eb98a2b-b14d-4d67-9ee2-e85e504431d2\") " pod="openshift-must-gather-7vm9c/crc-debug-vlhbb" Dec 06 06:44:51 crc kubenswrapper[4733]: I1206 06:44:51.743990 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7vm9c/crc-debug-vlhbb" Dec 06 06:44:51 crc kubenswrapper[4733]: W1206 06:44:51.773972 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6eb98a2b_b14d_4d67_9ee2_e85e504431d2.slice/crio-f1949493ab5cf3fecabe0a10b4d3a7e9dac059381a93cd630a351efeed6cf872 WatchSource:0}: Error finding container f1949493ab5cf3fecabe0a10b4d3a7e9dac059381a93cd630a351efeed6cf872: Status 404 returned error can't find the container with id f1949493ab5cf3fecabe0a10b4d3a7e9dac059381a93cd630a351efeed6cf872 Dec 06 06:44:52 crc kubenswrapper[4733]: I1206 06:44:52.155246 4733 generic.go:334] "Generic (PLEG): container finished" podID="6eb98a2b-b14d-4d67-9ee2-e85e504431d2" containerID="65e8039aaffb69862d67392d00281d4626e8e3063aa4457fa97bfa6cee3671c0" exitCode=1 Dec 06 06:44:52 crc kubenswrapper[4733]: I1206 06:44:52.155418 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7vm9c/crc-debug-vlhbb" event={"ID":"6eb98a2b-b14d-4d67-9ee2-e85e504431d2","Type":"ContainerDied","Data":"65e8039aaffb69862d67392d00281d4626e8e3063aa4457fa97bfa6cee3671c0"} Dec 06 06:44:52 crc kubenswrapper[4733]: I1206 06:44:52.155612 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7vm9c/crc-debug-vlhbb" event={"ID":"6eb98a2b-b14d-4d67-9ee2-e85e504431d2","Type":"ContainerStarted","Data":"f1949493ab5cf3fecabe0a10b4d3a7e9dac059381a93cd630a351efeed6cf872"} Dec 06 06:44:52 crc kubenswrapper[4733]: I1206 06:44:52.203348 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7vm9c/crc-debug-vlhbb"] Dec 06 06:44:52 crc kubenswrapper[4733]: I1206 06:44:52.208956 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7vm9c/crc-debug-vlhbb"] Dec 06 06:44:53 crc kubenswrapper[4733]: I1206 06:44:53.245479 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7vm9c/crc-debug-vlhbb" Dec 06 06:44:53 crc kubenswrapper[4733]: I1206 06:44:53.286473 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkv24\" (UniqueName: \"kubernetes.io/projected/6eb98a2b-b14d-4d67-9ee2-e85e504431d2-kube-api-access-nkv24\") pod \"6eb98a2b-b14d-4d67-9ee2-e85e504431d2\" (UID: \"6eb98a2b-b14d-4d67-9ee2-e85e504431d2\") " Dec 06 06:44:53 crc kubenswrapper[4733]: I1206 06:44:53.286543 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6eb98a2b-b14d-4d67-9ee2-e85e504431d2-host\") pod \"6eb98a2b-b14d-4d67-9ee2-e85e504431d2\" (UID: \"6eb98a2b-b14d-4d67-9ee2-e85e504431d2\") " Dec 06 06:44:53 crc kubenswrapper[4733]: I1206 06:44:53.286680 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6eb98a2b-b14d-4d67-9ee2-e85e504431d2-host" (OuterVolumeSpecName: "host") pod "6eb98a2b-b14d-4d67-9ee2-e85e504431d2" (UID: "6eb98a2b-b14d-4d67-9ee2-e85e504431d2"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 06:44:53 crc kubenswrapper[4733]: I1206 06:44:53.287152 4733 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6eb98a2b-b14d-4d67-9ee2-e85e504431d2-host\") on node \"crc\" DevicePath \"\"" Dec 06 06:44:53 crc kubenswrapper[4733]: I1206 06:44:53.294570 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6eb98a2b-b14d-4d67-9ee2-e85e504431d2-kube-api-access-nkv24" (OuterVolumeSpecName: "kube-api-access-nkv24") pod "6eb98a2b-b14d-4d67-9ee2-e85e504431d2" (UID: "6eb98a2b-b14d-4d67-9ee2-e85e504431d2"). InnerVolumeSpecName "kube-api-access-nkv24". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:44:53 crc kubenswrapper[4733]: I1206 06:44:53.388404 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nkv24\" (UniqueName: \"kubernetes.io/projected/6eb98a2b-b14d-4d67-9ee2-e85e504431d2-kube-api-access-nkv24\") on node \"crc\" DevicePath \"\"" Dec 06 06:44:54 crc kubenswrapper[4733]: I1206 06:44:54.180117 4733 scope.go:117] "RemoveContainer" containerID="65e8039aaffb69862d67392d00281d4626e8e3063aa4457fa97bfa6cee3671c0" Dec 06 06:44:54 crc kubenswrapper[4733]: I1206 06:44:54.180437 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7vm9c/crc-debug-vlhbb" Dec 06 06:44:54 crc kubenswrapper[4733]: I1206 06:44:54.493045 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6eb98a2b-b14d-4d67-9ee2-e85e504431d2" path="/var/lib/kubelet/pods/6eb98a2b-b14d-4d67-9ee2-e85e504431d2/volumes" Dec 06 06:45:00 crc kubenswrapper[4733]: I1206 06:45:00.158122 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416725-qq6pj"] Dec 06 06:45:00 crc kubenswrapper[4733]: E1206 06:45:00.159230 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6eb98a2b-b14d-4d67-9ee2-e85e504431d2" containerName="container-00" Dec 06 06:45:00 crc kubenswrapper[4733]: I1206 06:45:00.159256 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="6eb98a2b-b14d-4d67-9ee2-e85e504431d2" containerName="container-00" Dec 06 06:45:00 crc kubenswrapper[4733]: I1206 06:45:00.159462 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="6eb98a2b-b14d-4d67-9ee2-e85e504431d2" containerName="container-00" Dec 06 06:45:00 crc kubenswrapper[4733]: I1206 06:45:00.162138 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416725-qq6pj" Dec 06 06:45:00 crc kubenswrapper[4733]: I1206 06:45:00.172378 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 06 06:45:00 crc kubenswrapper[4733]: I1206 06:45:00.172616 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 06 06:45:00 crc kubenswrapper[4733]: I1206 06:45:00.177217 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416725-qq6pj"] Dec 06 06:45:00 crc kubenswrapper[4733]: I1206 06:45:00.349649 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/36725673-7334-432e-8258-88368dc2e84a-config-volume\") pod \"collect-profiles-29416725-qq6pj\" (UID: \"36725673-7334-432e-8258-88368dc2e84a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416725-qq6pj" Dec 06 06:45:00 crc kubenswrapper[4733]: I1206 06:45:00.349739 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvr5r\" (UniqueName: \"kubernetes.io/projected/36725673-7334-432e-8258-88368dc2e84a-kube-api-access-jvr5r\") pod \"collect-profiles-29416725-qq6pj\" (UID: \"36725673-7334-432e-8258-88368dc2e84a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416725-qq6pj" Dec 06 06:45:00 crc kubenswrapper[4733]: I1206 06:45:00.349791 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/36725673-7334-432e-8258-88368dc2e84a-secret-volume\") pod \"collect-profiles-29416725-qq6pj\" (UID: \"36725673-7334-432e-8258-88368dc2e84a\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29416725-qq6pj" Dec 06 06:45:00 crc kubenswrapper[4733]: I1206 06:45:00.452540 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/36725673-7334-432e-8258-88368dc2e84a-config-volume\") pod \"collect-profiles-29416725-qq6pj\" (UID: \"36725673-7334-432e-8258-88368dc2e84a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416725-qq6pj" Dec 06 06:45:00 crc kubenswrapper[4733]: I1206 06:45:00.452666 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvr5r\" (UniqueName: \"kubernetes.io/projected/36725673-7334-432e-8258-88368dc2e84a-kube-api-access-jvr5r\") pod \"collect-profiles-29416725-qq6pj\" (UID: \"36725673-7334-432e-8258-88368dc2e84a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416725-qq6pj" Dec 06 06:45:00 crc kubenswrapper[4733]: I1206 06:45:00.452726 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/36725673-7334-432e-8258-88368dc2e84a-secret-volume\") pod \"collect-profiles-29416725-qq6pj\" (UID: \"36725673-7334-432e-8258-88368dc2e84a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416725-qq6pj" Dec 06 06:45:00 crc kubenswrapper[4733]: I1206 06:45:00.453544 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/36725673-7334-432e-8258-88368dc2e84a-config-volume\") pod \"collect-profiles-29416725-qq6pj\" (UID: \"36725673-7334-432e-8258-88368dc2e84a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416725-qq6pj" Dec 06 06:45:00 crc kubenswrapper[4733]: I1206 06:45:00.464036 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/36725673-7334-432e-8258-88368dc2e84a-secret-volume\") pod \"collect-profiles-29416725-qq6pj\" (UID: \"36725673-7334-432e-8258-88368dc2e84a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416725-qq6pj" Dec 06 06:45:00 crc kubenswrapper[4733]: I1206 06:45:00.475030 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvr5r\" (UniqueName: \"kubernetes.io/projected/36725673-7334-432e-8258-88368dc2e84a-kube-api-access-jvr5r\") pod \"collect-profiles-29416725-qq6pj\" (UID: \"36725673-7334-432e-8258-88368dc2e84a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416725-qq6pj" Dec 06 06:45:00 crc kubenswrapper[4733]: I1206 06:45:00.492795 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416725-qq6pj" Dec 06 06:45:00 crc kubenswrapper[4733]: I1206 06:45:00.907243 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416725-qq6pj"] Dec 06 06:45:01 crc kubenswrapper[4733]: I1206 06:45:01.244594 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416725-qq6pj" event={"ID":"36725673-7334-432e-8258-88368dc2e84a","Type":"ContainerStarted","Data":"1bc14bf296a7fd4656fef78639cdecc38f307cb4b42e184b982e8271ab1dd4ab"} Dec 06 06:45:01 crc kubenswrapper[4733]: I1206 06:45:01.244665 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416725-qq6pj" event={"ID":"36725673-7334-432e-8258-88368dc2e84a","Type":"ContainerStarted","Data":"fe201e478a0c55d91a819bbef0dfee7d1cf15d7f797211550f4c75937ecd8abb"} Dec 06 06:45:01 crc kubenswrapper[4733]: I1206 06:45:01.267487 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29416725-qq6pj" 
podStartSLOduration=1.267463545 podStartE2EDuration="1.267463545s" podCreationTimestamp="2025-12-06 06:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:45:01.260005032 +0000 UTC m=+3685.125216142" watchObservedRunningTime="2025-12-06 06:45:01.267463545 +0000 UTC m=+3685.132674656" Dec 06 06:45:02 crc kubenswrapper[4733]: I1206 06:45:02.255131 4733 generic.go:334] "Generic (PLEG): container finished" podID="36725673-7334-432e-8258-88368dc2e84a" containerID="1bc14bf296a7fd4656fef78639cdecc38f307cb4b42e184b982e8271ab1dd4ab" exitCode=0 Dec 06 06:45:02 crc kubenswrapper[4733]: I1206 06:45:02.255339 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416725-qq6pj" event={"ID":"36725673-7334-432e-8258-88368dc2e84a","Type":"ContainerDied","Data":"1bc14bf296a7fd4656fef78639cdecc38f307cb4b42e184b982e8271ab1dd4ab"} Dec 06 06:45:03 crc kubenswrapper[4733]: I1206 06:45:03.574704 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416725-qq6pj" Dec 06 06:45:03 crc kubenswrapper[4733]: I1206 06:45:03.724649 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/36725673-7334-432e-8258-88368dc2e84a-secret-volume\") pod \"36725673-7334-432e-8258-88368dc2e84a\" (UID: \"36725673-7334-432e-8258-88368dc2e84a\") " Dec 06 06:45:03 crc kubenswrapper[4733]: I1206 06:45:03.724896 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/36725673-7334-432e-8258-88368dc2e84a-config-volume\") pod \"36725673-7334-432e-8258-88368dc2e84a\" (UID: \"36725673-7334-432e-8258-88368dc2e84a\") " Dec 06 06:45:03 crc kubenswrapper[4733]: I1206 06:45:03.725113 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvr5r\" (UniqueName: \"kubernetes.io/projected/36725673-7334-432e-8258-88368dc2e84a-kube-api-access-jvr5r\") pod \"36725673-7334-432e-8258-88368dc2e84a\" (UID: \"36725673-7334-432e-8258-88368dc2e84a\") " Dec 06 06:45:03 crc kubenswrapper[4733]: I1206 06:45:03.726605 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36725673-7334-432e-8258-88368dc2e84a-config-volume" (OuterVolumeSpecName: "config-volume") pod "36725673-7334-432e-8258-88368dc2e84a" (UID: "36725673-7334-432e-8258-88368dc2e84a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:45:03 crc kubenswrapper[4733]: I1206 06:45:03.740860 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36725673-7334-432e-8258-88368dc2e84a-kube-api-access-jvr5r" (OuterVolumeSpecName: "kube-api-access-jvr5r") pod "36725673-7334-432e-8258-88368dc2e84a" (UID: "36725673-7334-432e-8258-88368dc2e84a"). 
InnerVolumeSpecName "kube-api-access-jvr5r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:45:03 crc kubenswrapper[4733]: I1206 06:45:03.747380 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36725673-7334-432e-8258-88368dc2e84a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "36725673-7334-432e-8258-88368dc2e84a" (UID: "36725673-7334-432e-8258-88368dc2e84a"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:45:03 crc kubenswrapper[4733]: I1206 06:45:03.827016 4733 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/36725673-7334-432e-8258-88368dc2e84a-config-volume\") on node \"crc\" DevicePath \"\"" Dec 06 06:45:03 crc kubenswrapper[4733]: I1206 06:45:03.827055 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvr5r\" (UniqueName: \"kubernetes.io/projected/36725673-7334-432e-8258-88368dc2e84a-kube-api-access-jvr5r\") on node \"crc\" DevicePath \"\"" Dec 06 06:45:03 crc kubenswrapper[4733]: I1206 06:45:03.827069 4733 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/36725673-7334-432e-8258-88368dc2e84a-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 06 06:45:04 crc kubenswrapper[4733]: I1206 06:45:04.278225 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416725-qq6pj" event={"ID":"36725673-7334-432e-8258-88368dc2e84a","Type":"ContainerDied","Data":"fe201e478a0c55d91a819bbef0dfee7d1cf15d7f797211550f4c75937ecd8abb"} Dec 06 06:45:04 crc kubenswrapper[4733]: I1206 06:45:04.278271 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe201e478a0c55d91a819bbef0dfee7d1cf15d7f797211550f4c75937ecd8abb" Dec 06 06:45:04 crc kubenswrapper[4733]: I1206 06:45:04.278338 4733 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416725-qq6pj" Dec 06 06:45:04 crc kubenswrapper[4733]: I1206 06:45:04.318032 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416680-cqp6q"] Dec 06 06:45:04 crc kubenswrapper[4733]: I1206 06:45:04.323180 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416680-cqp6q"] Dec 06 06:45:04 crc kubenswrapper[4733]: I1206 06:45:04.496257 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e9b14ad-76ee-43dc-b948-28abf700d584" path="/var/lib/kubelet/pods/3e9b14ad-76ee-43dc-b948-28abf700d584/volumes" Dec 06 06:45:30 crc kubenswrapper[4733]: I1206 06:45:30.640830 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7bf5d6f884-5w5rz_126b09fd-ddf0-4e25-bfab-28f73ca04e50/barbican-api/0.log" Dec 06 06:45:30 crc kubenswrapper[4733]: I1206 06:45:30.665341 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7bf5d6f884-5w5rz_126b09fd-ddf0-4e25-bfab-28f73ca04e50/barbican-api-log/0.log" Dec 06 06:45:30 crc kubenswrapper[4733]: I1206 06:45:30.762695 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6cb949947d-nv4s5_c874c7fc-ab63-41e8-8e5d-921aa5f09e9e/barbican-keystone-listener/0.log" Dec 06 06:45:30 crc kubenswrapper[4733]: I1206 06:45:30.811159 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6cb949947d-nv4s5_c874c7fc-ab63-41e8-8e5d-921aa5f09e9e/barbican-keystone-listener-log/0.log" Dec 06 06:45:30 crc kubenswrapper[4733]: I1206 06:45:30.898430 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5c84494675-5wvrl_0a1b0724-0e18-475b-9f9f-c96bf13e371a/barbican-worker/0.log" Dec 06 06:45:30 crc kubenswrapper[4733]: 
I1206 06:45:30.912316 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5c84494675-5wvrl_0a1b0724-0e18-475b-9f9f-c96bf13e371a/barbican-worker-log/0.log" Dec 06 06:45:31 crc kubenswrapper[4733]: I1206 06:45:31.047409 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-j4wkv_35843bd8-0d3b-485a-b88f-95933d4c559e/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 06:45:31 crc kubenswrapper[4733]: I1206 06:45:31.122141 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_1edd6e2c-20a9-4584-aa48-64021a2911d3/ceilometer-central-agent/0.log" Dec 06 06:45:31 crc kubenswrapper[4733]: I1206 06:45:31.201897 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_1edd6e2c-20a9-4584-aa48-64021a2911d3/ceilometer-notification-agent/0.log" Dec 06 06:45:31 crc kubenswrapper[4733]: I1206 06:45:31.224881 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_1edd6e2c-20a9-4584-aa48-64021a2911d3/proxy-httpd/0.log" Dec 06 06:45:31 crc kubenswrapper[4733]: I1206 06:45:31.285279 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_1edd6e2c-20a9-4584-aa48-64021a2911d3/sg-core/0.log" Dec 06 06:45:31 crc kubenswrapper[4733]: I1206 06:45:31.391103 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_a2a73638-cf54-461c-a23a-db691593febc/cinder-api-log/0.log" Dec 06 06:45:31 crc kubenswrapper[4733]: I1206 06:45:31.568870 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_a870eae1-25fa-4c68-824e-e14fcd1e98ec/cinder-scheduler/0.log" Dec 06 06:45:31 crc kubenswrapper[4733]: I1206 06:45:31.678998 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_a870eae1-25fa-4c68-824e-e14fcd1e98ec/probe/0.log" Dec 06 06:45:31 crc kubenswrapper[4733]: 
I1206 06:45:31.689899 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_a2a73638-cf54-461c-a23a-db691593febc/cinder-api/0.log" Dec 06 06:45:31 crc kubenswrapper[4733]: I1206 06:45:31.769155 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-nndq6_f455cdaa-f9af-41b7-8bb3-379d347251ef/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 06:45:31 crc kubenswrapper[4733]: I1206 06:45:31.857095 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-w268k_2de963da-76cb-41fe-9761-8eb801b393a9/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 06:45:31 crc kubenswrapper[4733]: I1206 06:45:31.946253 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-c8bb97999-hr2v6_470393bd-fe7a-49f4-90f0-3625e4bdb497/init/0.log" Dec 06 06:45:32 crc kubenswrapper[4733]: I1206 06:45:32.076175 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-c8bb97999-hr2v6_470393bd-fe7a-49f4-90f0-3625e4bdb497/init/0.log" Dec 06 06:45:32 crc kubenswrapper[4733]: I1206 06:45:32.132827 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-c8bb97999-hr2v6_470393bd-fe7a-49f4-90f0-3625e4bdb497/dnsmasq-dns/0.log" Dec 06 06:45:32 crc kubenswrapper[4733]: I1206 06:45:32.133116 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-t7mgk_833ce9dd-3791-4da1-9f16-fb8db6d4c205/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 06:45:32 crc kubenswrapper[4733]: I1206 06:45:32.345182 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_d54b6c3c-a2f1-45a1-97f3-a9e95b37f075/glance-log/0.log" Dec 06 06:45:32 crc kubenswrapper[4733]: I1206 06:45:32.349586 4733 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_d54b6c3c-a2f1-45a1-97f3-a9e95b37f075/glance-httpd/0.log" Dec 06 06:45:32 crc kubenswrapper[4733]: I1206 06:45:32.516484 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_4e808010-50b4-4eb5-8dcb-5fa7f7cd7abe/glance-log/0.log" Dec 06 06:45:32 crc kubenswrapper[4733]: I1206 06:45:32.520425 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_4e808010-50b4-4eb5-8dcb-5fa7f7cd7abe/glance-httpd/0.log" Dec 06 06:45:32 crc kubenswrapper[4733]: I1206 06:45:32.604578 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-sg4xd_10ed3cae-fa08-4e62-af5f-e45711123cb3/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 06:45:32 crc kubenswrapper[4733]: I1206 06:45:32.702505 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-kk2mg_d0b060af-fc34-4b9f-ad66-0ebcd23e5146/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 06:45:32 crc kubenswrapper[4733]: I1206 06:45:32.769510 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29416681-vtzbw_56c9830c-0996-408f-bb43-8d6e2d0eaa2a/keystone-cron/0.log" Dec 06 06:45:32 crc kubenswrapper[4733]: I1206 06:45:32.954035 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_02729323-8acf-44d3-8eec-3194d7531769/kube-state-metrics/0.log" Dec 06 06:45:33 crc kubenswrapper[4733]: I1206 06:45:33.214845 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-ffs52_b0eeb4fd-32c5-425a-b938-49572817e476/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 06:45:33 crc kubenswrapper[4733]: I1206 06:45:33.234920 4733 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_keystone-fff9b86f5-qw8vr_4dfea320-4713-41d2-8d4a-ca371c346e9a/keystone-api/0.log"
Dec 06 06:45:33 crc kubenswrapper[4733]: I1206 06:45:33.461381 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7dbbf764c5-qntcx_96238eea-ea50-4c05-a33c-ae44b8c7a055/neutron-httpd/0.log"
Dec 06 06:45:33 crc kubenswrapper[4733]: I1206 06:45:33.536495 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7dbbf764c5-qntcx_96238eea-ea50-4c05-a33c-ae44b8c7a055/neutron-api/0.log"
Dec 06 06:45:33 crc kubenswrapper[4733]: I1206 06:45:33.579159 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_56e3883e-d7a5-4735-aee1-9dbb5423c0fe/memcached/0.log"
Dec 06 06:45:33 crc kubenswrapper[4733]: I1206 06:45:33.613890 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-mk7gc_868cd7d4-8d73-4344-a16d-c4975b6d9249/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 06 06:45:33 crc kubenswrapper[4733]: I1206 06:45:33.985316 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_c0f33a48-03f3-4580-8cfb-e6cc7d720ba4/nova-cell0-conductor-conductor/0.log"
Dec 06 06:45:34 crc kubenswrapper[4733]: I1206 06:45:34.103231 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_4f9bd130-962d-4315-b471-987273048485/nova-api-log/0.log"
Dec 06 06:45:34 crc kubenswrapper[4733]: I1206 06:45:34.190407 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_eeac3d18-b33c-41ec-b72d-4300358e4a96/nova-cell1-conductor-conductor/0.log"
Dec 06 06:45:34 crc kubenswrapper[4733]: I1206 06:45:34.288460 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_76922258-485d-4796-9f72-528ec9ec5b24/nova-cell1-novncproxy-novncproxy/0.log"
Dec 06 06:45:34 crc kubenswrapper[4733]: I1206 06:45:34.389725 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-q5gb8_ee5d47d4-6f8e-45b9-ac60-208196cbb5d7/nova-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 06 06:45:34 crc kubenswrapper[4733]: I1206 06:45:34.419394 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_4f9bd130-962d-4315-b471-987273048485/nova-api-api/0.log"
Dec 06 06:45:34 crc kubenswrapper[4733]: I1206 06:45:34.502784 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_8731fbb5-bb48-4c17-9ab9-6a5584868dc2/nova-metadata-log/0.log"
Dec 06 06:45:34 crc kubenswrapper[4733]: I1206 06:45:34.720567 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_3de44369-4819-44c5-a1e5-3ea10b61cf0c/mysql-bootstrap/0.log"
Dec 06 06:45:34 crc kubenswrapper[4733]: I1206 06:45:34.763514 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_b1ab4bcf-72c0-4aa4-8773-8cedd25ea6d5/nova-scheduler-scheduler/0.log"
Dec 06 06:45:34 crc kubenswrapper[4733]: I1206 06:45:34.971972 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_3de44369-4819-44c5-a1e5-3ea10b61cf0c/mysql-bootstrap/0.log"
Dec 06 06:45:34 crc kubenswrapper[4733]: I1206 06:45:34.994115 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_3de44369-4819-44c5-a1e5-3ea10b61cf0c/galera/0.log"
Dec 06 06:45:35 crc kubenswrapper[4733]: I1206 06:45:35.034961 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_b2b2baf7-95ad-4ff0-a72d-9232137735b6/mysql-bootstrap/0.log"
Dec 06 06:45:35 crc kubenswrapper[4733]: I1206 06:45:35.540425 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_8731fbb5-bb48-4c17-9ab9-6a5584868dc2/nova-metadata-metadata/0.log"
Dec 06 06:45:35 crc kubenswrapper[4733]: I1206 06:45:35.571224 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_b2b2baf7-95ad-4ff0-a72d-9232137735b6/mysql-bootstrap/0.log"
Dec 06 06:45:35 crc kubenswrapper[4733]: I1206 06:45:35.606539 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_b2b2baf7-95ad-4ff0-a72d-9232137735b6/galera/0.log"
Dec 06 06:45:35 crc kubenswrapper[4733]: I1206 06:45:35.630825 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_8b5bd873-9187-4b04-9274-fd413c995524/openstackclient/0.log"
Dec 06 06:45:35 crc kubenswrapper[4733]: I1206 06:45:35.751273 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-2ztw7_5589595d-741e-424a-955a-6fc8b83c18c1/ovn-controller/0.log"
Dec 06 06:45:35 crc kubenswrapper[4733]: I1206 06:45:35.766769 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-4bj7k_6e9f6fed-9267-40ab-a945-b575dd0abc9a/openstack-network-exporter/0.log"
Dec 06 06:45:35 crc kubenswrapper[4733]: I1206 06:45:35.912236 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-4wzzg_008ba5cf-a311-414d-9d06-a8ad4c038088/ovsdb-server-init/0.log"
Dec 06 06:45:36 crc kubenswrapper[4733]: I1206 06:45:36.026926 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-4wzzg_008ba5cf-a311-414d-9d06-a8ad4c038088/ovsdb-server-init/0.log"
Dec 06 06:45:36 crc kubenswrapper[4733]: I1206 06:45:36.050424 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-4wzzg_008ba5cf-a311-414d-9d06-a8ad4c038088/ovs-vswitchd/0.log"
Dec 06 06:45:36 crc kubenswrapper[4733]: I1206 06:45:36.081232 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-4wzzg_008ba5cf-a311-414d-9d06-a8ad4c038088/ovsdb-server/0.log"
Dec 06 06:45:36 crc kubenswrapper[4733]: I1206 06:45:36.084142 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-2z74q_a8617856-6710-492d-9bfd-8acd53e89b30/ovn-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 06 06:45:36 crc kubenswrapper[4733]: I1206 06:45:36.194915 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_5ecd64ea-f2f0-4858-8e2e-de61f1d62d26/openstack-network-exporter/0.log"
Dec 06 06:45:36 crc kubenswrapper[4733]: I1206 06:45:36.237628 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_5ecd64ea-f2f0-4858-8e2e-de61f1d62d26/ovn-northd/0.log"
Dec 06 06:45:36 crc kubenswrapper[4733]: I1206 06:45:36.261691 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_b87d6517-a2ed-458a-9a0e-0945f837a232/openstack-network-exporter/0.log"
Dec 06 06:45:36 crc kubenswrapper[4733]: I1206 06:45:36.353034 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_b87d6517-a2ed-458a-9a0e-0945f837a232/ovsdbserver-nb/0.log"
Dec 06 06:45:36 crc kubenswrapper[4733]: I1206 06:45:36.434335 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_d0484be5-bcc0-4b5b-8aef-6c9573545b88/ovsdbserver-sb/0.log"
Dec 06 06:45:36 crc kubenswrapper[4733]: I1206 06:45:36.437224 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_d0484be5-bcc0-4b5b-8aef-6c9573545b88/openstack-network-exporter/0.log"
Dec 06 06:45:36 crc kubenswrapper[4733]: I1206 06:45:36.702110 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-8696d9b56-5s4w8_2e839961-eb72-4d81-baf8-b49f103a8ca0/placement-api/0.log"
Dec 06 06:45:37 crc kubenswrapper[4733]: I1206 06:45:37.147769 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3/setup-container/0.log"
Dec 06 06:45:37 crc kubenswrapper[4733]: I1206 06:45:37.194690 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-8696d9b56-5s4w8_2e839961-eb72-4d81-baf8-b49f103a8ca0/placement-log/0.log"
Dec 06 06:45:37 crc kubenswrapper[4733]: I1206 06:45:37.290517 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3/setup-container/0.log"
Dec 06 06:45:37 crc kubenswrapper[4733]: I1206 06:45:37.306981 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_45ab4fd4-dfe1-4ef9-9470-9dfba19fd5f3/rabbitmq/0.log"
Dec 06 06:45:37 crc kubenswrapper[4733]: I1206 06:45:37.374706 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_0fa056f3-b465-4a24-9eb5-9a5f5932749c/setup-container/0.log"
Dec 06 06:45:37 crc kubenswrapper[4733]: I1206 06:45:37.486674 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_0fa056f3-b465-4a24-9eb5-9a5f5932749c/setup-container/0.log"
Dec 06 06:45:37 crc kubenswrapper[4733]: I1206 06:45:37.497960 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_0fa056f3-b465-4a24-9eb5-9a5f5932749c/rabbitmq/0.log"
Dec 06 06:45:37 crc kubenswrapper[4733]: I1206 06:45:37.525323 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-7qspq_e2a8649e-0504-47bc-8cea-c95c34f5e416/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 06 06:45:37 crc kubenswrapper[4733]: I1206 06:45:37.627373 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-769q2_c6d6b59f-d8e8-4f50-9100-a0c789e93a8a/redhat-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 06 06:45:37 crc kubenswrapper[4733]: I1206 06:45:37.689519 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-kw7zg_e86f6dc2-6ce2-42b0-b4dd-e583d23f5f7c/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 06 06:45:37 crc kubenswrapper[4733]: I1206 06:45:37.753758 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-bfgxd_a5943cb5-9495-43c5-8171-5c6a2df81c31/run-os-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 06 06:45:37 crc kubenswrapper[4733]: I1206 06:45:37.844185 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-54h4l_a3c3e208-9936-4b7d-b7f4-73683f20fc47/ssh-known-hosts-edpm-deployment/0.log"
Dec 06 06:45:37 crc kubenswrapper[4733]: I1206 06:45:37.943336 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-67756896f9-p6bgt_80fbf061-d2a6-4265-b412-cbbcdc78515f/proxy-server/0.log"
Dec 06 06:45:38 crc kubenswrapper[4733]: I1206 06:45:38.015462 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-s28f8_1d720cd5-bb4e-449f-86a3-c9cff2acfada/swift-ring-rebalance/0.log"
Dec 06 06:45:38 crc kubenswrapper[4733]: I1206 06:45:38.020285 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-67756896f9-p6bgt_80fbf061-d2a6-4265-b412-cbbcdc78515f/proxy-httpd/0.log"
Dec 06 06:45:38 crc kubenswrapper[4733]: I1206 06:45:38.105288 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a55915f4-28cf-4343-aefa-e6b145b3ccf1/account-auditor/0.log"
Dec 06 06:45:38 crc kubenswrapper[4733]: I1206 06:45:38.157019 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a55915f4-28cf-4343-aefa-e6b145b3ccf1/account-reaper/0.log"
Dec 06 06:45:38 crc kubenswrapper[4733]: I1206 06:45:38.199940 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a55915f4-28cf-4343-aefa-e6b145b3ccf1/account-replicator/0.log"
Dec 06 06:45:38 crc kubenswrapper[4733]: I1206 06:45:38.234973 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a55915f4-28cf-4343-aefa-e6b145b3ccf1/account-server/0.log"
Dec 06 06:45:38 crc kubenswrapper[4733]: I1206 06:45:38.277895 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a55915f4-28cf-4343-aefa-e6b145b3ccf1/container-auditor/0.log"
Dec 06 06:45:38 crc kubenswrapper[4733]: I1206 06:45:38.347937 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a55915f4-28cf-4343-aefa-e6b145b3ccf1/container-replicator/0.log"
Dec 06 06:45:38 crc kubenswrapper[4733]: I1206 06:45:38.372795 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a55915f4-28cf-4343-aefa-e6b145b3ccf1/container-updater/0.log"
Dec 06 06:45:38 crc kubenswrapper[4733]: I1206 06:45:38.384867 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a55915f4-28cf-4343-aefa-e6b145b3ccf1/container-server/0.log"
Dec 06 06:45:38 crc kubenswrapper[4733]: I1206 06:45:38.421009 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a55915f4-28cf-4343-aefa-e6b145b3ccf1/object-auditor/0.log"
Dec 06 06:45:38 crc kubenswrapper[4733]: I1206 06:45:38.447753 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a55915f4-28cf-4343-aefa-e6b145b3ccf1/object-expirer/0.log"
Dec 06 06:45:38 crc kubenswrapper[4733]: I1206 06:45:38.520296 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a55915f4-28cf-4343-aefa-e6b145b3ccf1/object-server/0.log"
Dec 06 06:45:38 crc kubenswrapper[4733]: I1206 06:45:38.542805 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a55915f4-28cf-4343-aefa-e6b145b3ccf1/object-replicator/0.log"
Dec 06 06:45:38 crc kubenswrapper[4733]: I1206 06:45:38.586471 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a55915f4-28cf-4343-aefa-e6b145b3ccf1/object-updater/0.log"
Dec 06 06:45:38 crc kubenswrapper[4733]: I1206 06:45:38.613294 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a55915f4-28cf-4343-aefa-e6b145b3ccf1/rsync/0.log"
Dec 06 06:45:38 crc kubenswrapper[4733]: I1206 06:45:38.644896 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a55915f4-28cf-4343-aefa-e6b145b3ccf1/swift-recon-cron/0.log"
Dec 06 06:45:38 crc kubenswrapper[4733]: I1206 06:45:38.771169 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-7zdf6_a0e43d41-4e58-4467-99d0-b782a2f2d65a/telemetry-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 06 06:45:38 crc kubenswrapper[4733]: I1206 06:45:38.817815 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_677c0cf0-716e-467c-ac8b-0cd446fb11ed/tempest-tests-tempest-tests-runner/0.log"
Dec 06 06:45:38 crc kubenswrapper[4733]: I1206 06:45:38.873578 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_53459053-4118-4161-9dc3-dc7781c1f182/test-operator-logs-container/0.log"
Dec 06 06:45:38 crc kubenswrapper[4733]: I1206 06:45:38.936491 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-pw8sd_27d67df7-7cb0-4c5b-ba49-00d9285e1e11/validate-network-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 06 06:45:53 crc kubenswrapper[4733]: I1206 06:45:53.268491 4733 scope.go:117] "RemoveContainer" containerID="50537d06f9cce557c4894c5d5829e780c4e32fde7833bde64bfb594171246825"
Dec 06 06:45:58 crc kubenswrapper[4733]: I1206 06:45:58.226782 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafrjrkt_f7c69fa4-d047-47d2-a147-0316949d45c5/util/0.log"
Dec 06 06:45:58 crc kubenswrapper[4733]: I1206 06:45:58.383435 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafrjrkt_f7c69fa4-d047-47d2-a147-0316949d45c5/util/0.log"
Dec 06 06:45:58 crc kubenswrapper[4733]: I1206 06:45:58.407098 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafrjrkt_f7c69fa4-d047-47d2-a147-0316949d45c5/pull/0.log"
Dec 06 06:45:58 crc kubenswrapper[4733]: I1206 06:45:58.439537 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafrjrkt_f7c69fa4-d047-47d2-a147-0316949d45c5/pull/0.log"
Dec 06 06:45:58 crc kubenswrapper[4733]: I1206 06:45:58.606059 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafrjrkt_f7c69fa4-d047-47d2-a147-0316949d45c5/util/0.log"
Dec 06 06:45:58 crc kubenswrapper[4733]: I1206 06:45:58.622421 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafrjrkt_f7c69fa4-d047-47d2-a147-0316949d45c5/extract/0.log"
Dec 06 06:45:58 crc kubenswrapper[4733]: I1206 06:45:58.629632 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafrjrkt_f7c69fa4-d047-47d2-a147-0316949d45c5/pull/0.log"
Dec 06 06:45:58 crc kubenswrapper[4733]: I1206 06:45:58.765580 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-ht2mw_f138e9fa-e1ea-4b04-b938-0c16b8205fbe/kube-rbac-proxy/0.log"
Dec 06 06:45:58 crc kubenswrapper[4733]: I1206 06:45:58.852145 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-vrztj_8bbee3a7-9d6f-40d8-a5c2-eee560458e41/kube-rbac-proxy/0.log"
Dec 06 06:45:58 crc kubenswrapper[4733]: I1206 06:45:58.855750 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-ht2mw_f138e9fa-e1ea-4b04-b938-0c16b8205fbe/manager/0.log"
Dec 06 06:45:58 crc kubenswrapper[4733]: I1206 06:45:58.962312 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-vrztj_8bbee3a7-9d6f-40d8-a5c2-eee560458e41/manager/0.log"
Dec 06 06:45:59 crc kubenswrapper[4733]: I1206 06:45:59.008782 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-rgm8m_715c2050-78f9-4609-9575-a1c85c3b4961/kube-rbac-proxy/0.log"
Dec 06 06:45:59 crc kubenswrapper[4733]: I1206 06:45:59.035510 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-rgm8m_715c2050-78f9-4609-9575-a1c85c3b4961/manager/0.log"
Dec 06 06:45:59 crc kubenswrapper[4733]: I1206 06:45:59.175925 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-xcx5w_42977681-d2c6-4ddb-848a-751503543ed4/kube-rbac-proxy/0.log"
Dec 06 06:45:59 crc kubenswrapper[4733]: I1206 06:45:59.239765 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-xcx5w_42977681-d2c6-4ddb-848a-751503543ed4/manager/0.log"
Dec 06 06:45:59 crc kubenswrapper[4733]: I1206 06:45:59.329710 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-b64rz_018f851e-0c42-4bbd-bea7-7ce45a6e6ebb/manager/0.log"
Dec 06 06:45:59 crc kubenswrapper[4733]: I1206 06:45:59.383221 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-b64rz_018f851e-0c42-4bbd-bea7-7ce45a6e6ebb/kube-rbac-proxy/0.log"
Dec 06 06:45:59 crc kubenswrapper[4733]: I1206 06:45:59.384056 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-2lblw_331a0926-1e6c-4976-b309-b20537eae22a/kube-rbac-proxy/0.log"
Dec 06 06:45:59 crc kubenswrapper[4733]: I1206 06:45:59.499213 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-2lblw_331a0926-1e6c-4976-b309-b20537eae22a/manager/0.log"
Dec 06 06:45:59 crc kubenswrapper[4733]: I1206 06:45:59.548793 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-hcfrz_0d1ec2a9-eb8b-48b1-a823-129b8cc68129/kube-rbac-proxy/0.log"
Dec 06 06:45:59 crc kubenswrapper[4733]: I1206 06:45:59.702899 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-hcfrz_0d1ec2a9-eb8b-48b1-a823-129b8cc68129/manager/0.log"
Dec 06 06:45:59 crc kubenswrapper[4733]: I1206 06:45:59.748869 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-t7chb_efc4f270-9152-42b0-bd6c-074697502758/kube-rbac-proxy/0.log"
Dec 06 06:45:59 crc kubenswrapper[4733]: I1206 06:45:59.768772 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-t7chb_efc4f270-9152-42b0-bd6c-074697502758/manager/0.log"
Dec 06 06:45:59 crc kubenswrapper[4733]: I1206 06:45:59.905132 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-c9lc9_45064622-664d-4424-a01c-0cf85f653a67/kube-rbac-proxy/0.log"
Dec 06 06:45:59 crc kubenswrapper[4733]: I1206 06:45:59.983682 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-c9lc9_45064622-664d-4424-a01c-0cf85f653a67/manager/0.log"
Dec 06 06:46:00 crc kubenswrapper[4733]: I1206 06:46:00.054539 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-26bnr_bccabc0c-9ad2-47f8-b550-8bff11a103e8/kube-rbac-proxy/0.log"
Dec 06 06:46:00 crc kubenswrapper[4733]: I1206 06:46:00.095244 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-26bnr_bccabc0c-9ad2-47f8-b550-8bff11a103e8/manager/0.log"
Dec 06 06:46:00 crc kubenswrapper[4733]: I1206 06:46:00.170870 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-7kmwq_fc37a812-3bfe-4e10-ba93-4e8fdc45361f/kube-rbac-proxy/0.log"
Dec 06 06:46:00 crc kubenswrapper[4733]: I1206 06:46:00.246054 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-7kmwq_fc37a812-3bfe-4e10-ba93-4e8fdc45361f/manager/0.log"
Dec 06 06:46:00 crc kubenswrapper[4733]: I1206 06:46:00.348507 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-j9mvd_1ed48735-3f0e-4777-b3ce-54a09caec1ab/kube-rbac-proxy/0.log"
Dec 06 06:46:00 crc kubenswrapper[4733]: I1206 06:46:00.383887 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-j9mvd_1ed48735-3f0e-4777-b3ce-54a09caec1ab/manager/0.log"
Dec 06 06:46:00 crc kubenswrapper[4733]: I1206 06:46:00.466030 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-kcq5s_a20c3b8a-0e57-4ba7-92f2-bf01e12bfedb/kube-rbac-proxy/0.log"
Dec 06 06:46:00 crc kubenswrapper[4733]: I1206 06:46:00.589025 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-kcq5s_a20c3b8a-0e57-4ba7-92f2-bf01e12bfedb/manager/0.log"
Dec 06 06:46:00 crc kubenswrapper[4733]: I1206 06:46:00.633429 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-wwdnq_1bd3247c-9536-44e7-8857-c9fe8aa31383/kube-rbac-proxy/0.log"
Dec 06 06:46:00 crc kubenswrapper[4733]: I1206 06:46:00.667007 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-wwdnq_1bd3247c-9536-44e7-8857-c9fe8aa31383/manager/0.log"
Dec 06 06:46:00 crc kubenswrapper[4733]: I1206 06:46:00.781230 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-55c85496f58g9vl_bfb7e815-5af6-428e-bfca-d47d2a7a3022/kube-rbac-proxy/0.log"
Dec 06 06:46:00 crc kubenswrapper[4733]: I1206 06:46:00.784696 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-55c85496f58g9vl_bfb7e815-5af6-428e-bfca-d47d2a7a3022/manager/0.log"
Dec 06 06:46:01 crc kubenswrapper[4733]: I1206 06:46:01.174893 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-55b6fb9447-kmhl4_208b1115-af84-42b4-8425-a576457b38d2/operator/0.log"
Dec 06 06:46:01 crc kubenswrapper[4733]: I1206 06:46:01.264207 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-r7q28_ed9fa40d-152c-4e12-8ac4-ccf89c50ade2/registry-server/0.log"
Dec 06 06:46:01 crc kubenswrapper[4733]: I1206 06:46:01.388170 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-mcr2h_352c73f3-4cd8-4c2b-a5ba-52c5bc1f78ad/kube-rbac-proxy/0.log"
Dec 06 06:46:01 crc kubenswrapper[4733]: I1206 06:46:01.506486 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-h9stb_d46b59ae-938e-49f6-a9aa-2f78495634c3/kube-rbac-proxy/0.log"
Dec 06 06:46:01 crc kubenswrapper[4733]: I1206 06:46:01.584708 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-mcr2h_352c73f3-4cd8-4c2b-a5ba-52c5bc1f78ad/manager/0.log"
Dec 06 06:46:01 crc kubenswrapper[4733]: I1206 06:46:01.629981 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-h9stb_d46b59ae-938e-49f6-a9aa-2f78495634c3/manager/0.log"
Dec 06 06:46:01 crc kubenswrapper[4733]: I1206 06:46:01.919193 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-pv7dl_5b3eaa67-83e3-4c9a-bfeb-c315e4f5ac7c/kube-rbac-proxy/0.log"
Dec 06 06:46:01 crc kubenswrapper[4733]: I1206 06:46:01.952871 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-54bdf956c4-ckqkj_4f778c13-06e7-4b71-98b8-28e3165cdf8b/manager/0.log"
Dec 06 06:46:01 crc kubenswrapper[4733]: I1206 06:46:01.973514 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-cqrpd_4f2d4dbb-c7fb-46b3-8baf-fb1ac61a12ed/operator/0.log"
Dec 06 06:46:02 crc kubenswrapper[4733]: I1206 06:46:02.044896 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-pv7dl_5b3eaa67-83e3-4c9a-bfeb-c315e4f5ac7c/manager/0.log"
Dec 06 06:46:02 crc kubenswrapper[4733]: I1206 06:46:02.129066 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-2m7gg_eef61090-130b-4d9d-99e8-6cc4bff0b467/kube-rbac-proxy/0.log"
Dec 06 06:46:02 crc kubenswrapper[4733]: I1206 06:46:02.173969 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-2m7gg_eef61090-130b-4d9d-99e8-6cc4bff0b467/manager/0.log"
Dec 06 06:46:02 crc kubenswrapper[4733]: I1206 06:46:02.228203 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-nbzjs_e744adbb-1e4c-4461-8892-799f8a42976f/kube-rbac-proxy/0.log"
Dec 06 06:46:02 crc kubenswrapper[4733]: I1206 06:46:02.322058 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-nbzjs_e744adbb-1e4c-4461-8892-799f8a42976f/manager/0.log"
Dec 06 06:46:02 crc kubenswrapper[4733]: I1206 06:46:02.328093 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-9p759_62d11d7c-5132-4e22-9780-2ff475c07618/kube-rbac-proxy/0.log"
Dec 06 06:46:02 crc kubenswrapper[4733]: I1206 06:46:02.358449 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-9p759_62d11d7c-5132-4e22-9780-2ff475c07618/manager/0.log"
Dec 06 06:46:17 crc kubenswrapper[4733]: I1206 06:46:17.290196 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-v42cq_71229745-aa94-4aa5-90c8-95d65fcca563/control-plane-machine-set-operator/0.log"
Dec 06 06:46:17 crc kubenswrapper[4733]: I1206 06:46:17.405847 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-6l9dt_d330f5cc-abab-4367-902f-97e41685007f/kube-rbac-proxy/0.log"
Dec 06 06:46:17 crc kubenswrapper[4733]: I1206 06:46:17.411199 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-6l9dt_d330f5cc-abab-4367-902f-97e41685007f/machine-api-operator/0.log"
Dec 06 06:46:27 crc kubenswrapper[4733]: I1206 06:46:27.168261 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-fr5kh_0e8869e6-7869-47c0-a412-4a4cfa676164/cert-manager-controller/0.log"
Dec 06 06:46:27 crc kubenswrapper[4733]: I1206 06:46:27.321206 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-wt468_69538b45-07e5-4c3d-a653-b10e62688290/cert-manager-cainjector/0.log"
Dec 06 06:46:27 crc kubenswrapper[4733]: I1206 06:46:27.370469 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-ptmkp_5d9f07d4-edb9-4fba-8043-dd05fe08afbb/cert-manager-webhook/0.log"
Dec 06 06:46:38 crc kubenswrapper[4733]: I1206 06:46:38.279881 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-8rcsc_9574c53f-a694-4fe7-a2f3-c2292bf727c1/nmstate-console-plugin/0.log"
Dec 06 06:46:38 crc kubenswrapper[4733]: I1206 06:46:38.422100 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-lwn4x_a88cd06a-4aaf-4fcd-984e-9839be379e86/nmstate-handler/0.log"
Dec 06 06:46:38 crc kubenswrapper[4733]: I1206 06:46:38.453179 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-qbfxz_4b3f3780-1188-4d8c-b369-f01efb0060ae/kube-rbac-proxy/0.log"
Dec 06 06:46:38 crc kubenswrapper[4733]: I1206 06:46:38.483924 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-qbfxz_4b3f3780-1188-4d8c-b369-f01efb0060ae/nmstate-metrics/0.log"
Dec 06 06:46:38 crc kubenswrapper[4733]: I1206 06:46:38.671447 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-gw2n8_20f6b3c1-8c56-4603-8efc-c5aa7e3420cb/nmstate-webhook/0.log"
Dec 06 06:46:38 crc kubenswrapper[4733]: I1206 06:46:38.673753 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-98ngj_e54c5e3d-c07f-408d-85a8-83eee0ccfc79/nmstate-operator/0.log"
Dec 06 06:46:42 crc kubenswrapper[4733]: I1206 06:46:42.989692 4733 patch_prober.go:28] interesting pod/machine-config-daemon-g7qjx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 06 06:46:42 crc kubenswrapper[4733]: I1206 06:46:42.990201 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 06 06:46:50 crc kubenswrapper[4733]: I1206 06:46:50.419542 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-vkbjg_bdd47a64-10b1-4dce-bec3-88d302bf60e7/kube-rbac-proxy/0.log"
Dec 06 06:46:50 crc kubenswrapper[4733]: I1206 06:46:50.569196 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-vkbjg_bdd47a64-10b1-4dce-bec3-88d302bf60e7/controller/0.log"
Dec 06 06:46:50 crc kubenswrapper[4733]: I1206 06:46:50.624175 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8l7xm_b6c5bf84-f86b-4a51-bd80-0a23163dd42b/cp-frr-files/0.log"
Dec 06 06:46:50 crc kubenswrapper[4733]: I1206 06:46:50.745832 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8tpxm"]
Dec 06 06:46:50 crc kubenswrapper[4733]: E1206 06:46:50.746227 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36725673-7334-432e-8258-88368dc2e84a" containerName="collect-profiles"
Dec 06 06:46:50 crc kubenswrapper[4733]: I1206 06:46:50.746246 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="36725673-7334-432e-8258-88368dc2e84a" containerName="collect-profiles"
Dec 06 06:46:50 crc kubenswrapper[4733]: I1206 06:46:50.746486 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="36725673-7334-432e-8258-88368dc2e84a" containerName="collect-profiles"
Dec 06 06:46:50 crc kubenswrapper[4733]: I1206 06:46:50.747780 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8tpxm"
Dec 06 06:46:50 crc kubenswrapper[4733]: I1206 06:46:50.768000 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8tpxm"]
Dec 06 06:46:50 crc kubenswrapper[4733]: I1206 06:46:50.793766 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8l7xm_b6c5bf84-f86b-4a51-bd80-0a23163dd42b/cp-reloader/0.log"
Dec 06 06:46:50 crc kubenswrapper[4733]: I1206 06:46:50.798118 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8l7xm_b6c5bf84-f86b-4a51-bd80-0a23163dd42b/cp-frr-files/0.log"
Dec 06 06:46:50 crc kubenswrapper[4733]: I1206 06:46:50.802238 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8l7xm_b6c5bf84-f86b-4a51-bd80-0a23163dd42b/cp-reloader/0.log"
Dec 06 06:46:50 crc kubenswrapper[4733]: I1206 06:46:50.860269 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8l7xm_b6c5bf84-f86b-4a51-bd80-0a23163dd42b/cp-metrics/0.log"
Dec 06 06:46:50 crc kubenswrapper[4733]: I1206 06:46:50.908334 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb08e0c1-0355-47c6-be10-cbd039ded687-utilities\") pod \"certified-operators-8tpxm\" (UID: \"cb08e0c1-0355-47c6-be10-cbd039ded687\") " pod="openshift-marketplace/certified-operators-8tpxm"
Dec 06 06:46:50 crc kubenswrapper[4733]: I1206 06:46:50.908400 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5pvl\" (UniqueName: \"kubernetes.io/projected/cb08e0c1-0355-47c6-be10-cbd039ded687-kube-api-access-v5pvl\") pod \"certified-operators-8tpxm\" (UID: \"cb08e0c1-0355-47c6-be10-cbd039ded687\") " pod="openshift-marketplace/certified-operators-8tpxm"
Dec 06 06:46:50 crc kubenswrapper[4733]: I1206 06:46:50.908474 4733
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb08e0c1-0355-47c6-be10-cbd039ded687-catalog-content\") pod \"certified-operators-8tpxm\" (UID: \"cb08e0c1-0355-47c6-be10-cbd039ded687\") " pod="openshift-marketplace/certified-operators-8tpxm" Dec 06 06:46:50 crc kubenswrapper[4733]: I1206 06:46:50.970328 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8l7xm_b6c5bf84-f86b-4a51-bd80-0a23163dd42b/cp-frr-files/0.log" Dec 06 06:46:50 crc kubenswrapper[4733]: I1206 06:46:50.977800 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8l7xm_b6c5bf84-f86b-4a51-bd80-0a23163dd42b/cp-reloader/0.log" Dec 06 06:46:51 crc kubenswrapper[4733]: I1206 06:46:51.010286 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb08e0c1-0355-47c6-be10-cbd039ded687-utilities\") pod \"certified-operators-8tpxm\" (UID: \"cb08e0c1-0355-47c6-be10-cbd039ded687\") " pod="openshift-marketplace/certified-operators-8tpxm" Dec 06 06:46:51 crc kubenswrapper[4733]: I1206 06:46:51.010374 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5pvl\" (UniqueName: \"kubernetes.io/projected/cb08e0c1-0355-47c6-be10-cbd039ded687-kube-api-access-v5pvl\") pod \"certified-operators-8tpxm\" (UID: \"cb08e0c1-0355-47c6-be10-cbd039ded687\") " pod="openshift-marketplace/certified-operators-8tpxm" Dec 06 06:46:51 crc kubenswrapper[4733]: I1206 06:46:51.010463 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb08e0c1-0355-47c6-be10-cbd039ded687-catalog-content\") pod \"certified-operators-8tpxm\" (UID: \"cb08e0c1-0355-47c6-be10-cbd039ded687\") " pod="openshift-marketplace/certified-operators-8tpxm" Dec 06 06:46:51 crc 
kubenswrapper[4733]: I1206 06:46:51.010862 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb08e0c1-0355-47c6-be10-cbd039ded687-utilities\") pod \"certified-operators-8tpxm\" (UID: \"cb08e0c1-0355-47c6-be10-cbd039ded687\") " pod="openshift-marketplace/certified-operators-8tpxm" Dec 06 06:46:51 crc kubenswrapper[4733]: I1206 06:46:51.010998 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb08e0c1-0355-47c6-be10-cbd039ded687-catalog-content\") pod \"certified-operators-8tpxm\" (UID: \"cb08e0c1-0355-47c6-be10-cbd039ded687\") " pod="openshift-marketplace/certified-operators-8tpxm" Dec 06 06:46:51 crc kubenswrapper[4733]: I1206 06:46:51.032771 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8l7xm_b6c5bf84-f86b-4a51-bd80-0a23163dd42b/cp-metrics/0.log" Dec 06 06:46:51 crc kubenswrapper[4733]: I1206 06:46:51.037353 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5pvl\" (UniqueName: \"kubernetes.io/projected/cb08e0c1-0355-47c6-be10-cbd039ded687-kube-api-access-v5pvl\") pod \"certified-operators-8tpxm\" (UID: \"cb08e0c1-0355-47c6-be10-cbd039ded687\") " pod="openshift-marketplace/certified-operators-8tpxm" Dec 06 06:46:51 crc kubenswrapper[4733]: I1206 06:46:51.062883 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8l7xm_b6c5bf84-f86b-4a51-bd80-0a23163dd42b/cp-metrics/0.log" Dec 06 06:46:51 crc kubenswrapper[4733]: I1206 06:46:51.067720 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8tpxm" Dec 06 06:46:51 crc kubenswrapper[4733]: I1206 06:46:51.343220 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8l7xm_b6c5bf84-f86b-4a51-bd80-0a23163dd42b/cp-frr-files/0.log" Dec 06 06:46:51 crc kubenswrapper[4733]: I1206 06:46:51.471297 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8l7xm_b6c5bf84-f86b-4a51-bd80-0a23163dd42b/cp-reloader/0.log" Dec 06 06:46:51 crc kubenswrapper[4733]: I1206 06:46:51.477704 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8l7xm_b6c5bf84-f86b-4a51-bd80-0a23163dd42b/cp-metrics/0.log" Dec 06 06:46:51 crc kubenswrapper[4733]: I1206 06:46:51.534413 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8l7xm_b6c5bf84-f86b-4a51-bd80-0a23163dd42b/controller/0.log" Dec 06 06:46:51 crc kubenswrapper[4733]: I1206 06:46:51.554082 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8tpxm"] Dec 06 06:46:51 crc kubenswrapper[4733]: I1206 06:46:51.695852 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8l7xm_b6c5bf84-f86b-4a51-bd80-0a23163dd42b/frr-metrics/0.log" Dec 06 06:46:51 crc kubenswrapper[4733]: I1206 06:46:51.699369 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8l7xm_b6c5bf84-f86b-4a51-bd80-0a23163dd42b/kube-rbac-proxy/0.log" Dec 06 06:46:51 crc kubenswrapper[4733]: I1206 06:46:51.788698 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8l7xm_b6c5bf84-f86b-4a51-bd80-0a23163dd42b/kube-rbac-proxy-frr/0.log" Dec 06 06:46:51 crc kubenswrapper[4733]: I1206 06:46:51.881859 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8l7xm_b6c5bf84-f86b-4a51-bd80-0a23163dd42b/reloader/0.log" Dec 06 06:46:51 crc kubenswrapper[4733]: 
I1206 06:46:51.981945 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-mqg7r_cd3104a0-fb18-4c26-9049-19967b2d5060/frr-k8s-webhook-server/0.log" Dec 06 06:46:52 crc kubenswrapper[4733]: I1206 06:46:52.119237 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-58dbb79b86-frmc6_652ac185-5c87-4354-9f1d-0c103702a926/manager/0.log" Dec 06 06:46:52 crc kubenswrapper[4733]: I1206 06:46:52.290099 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-69cb7d5cf9-6jpn6_ad526f45-2ff7-4236-8ed5-161860544782/webhook-server/0.log" Dec 06 06:46:52 crc kubenswrapper[4733]: I1206 06:46:52.290423 4733 generic.go:334] "Generic (PLEG): container finished" podID="cb08e0c1-0355-47c6-be10-cbd039ded687" containerID="6498d3fbf5ebe2b3f5e8aa1c07259442c29c9473d925b68eee9b99710f197f29" exitCode=0 Dec 06 06:46:52 crc kubenswrapper[4733]: I1206 06:46:52.290451 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8tpxm" event={"ID":"cb08e0c1-0355-47c6-be10-cbd039ded687","Type":"ContainerDied","Data":"6498d3fbf5ebe2b3f5e8aa1c07259442c29c9473d925b68eee9b99710f197f29"} Dec 06 06:46:52 crc kubenswrapper[4733]: I1206 06:46:52.290688 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8tpxm" event={"ID":"cb08e0c1-0355-47c6-be10-cbd039ded687","Type":"ContainerStarted","Data":"9ccad64ab8932cc2e5611e0131350b52000f6a0969325b39f02a6042b7a3df5f"} Dec 06 06:46:52 crc kubenswrapper[4733]: I1206 06:46:52.292672 4733 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 06 06:46:52 crc kubenswrapper[4733]: I1206 06:46:52.437562 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-l9xh2_6361843f-1584-4663-840b-7891442d913f/kube-rbac-proxy/0.log" Dec 06 
06:46:52 crc kubenswrapper[4733]: I1206 06:46:52.969191 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-l9xh2_6361843f-1584-4663-840b-7891442d913f/speaker/0.log" Dec 06 06:46:53 crc kubenswrapper[4733]: I1206 06:46:53.162060 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8l7xm_b6c5bf84-f86b-4a51-bd80-0a23163dd42b/frr/0.log" Dec 06 06:46:53 crc kubenswrapper[4733]: I1206 06:46:53.299320 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8tpxm" event={"ID":"cb08e0c1-0355-47c6-be10-cbd039ded687","Type":"ContainerStarted","Data":"1402b42ec61a41cddeef6e17f6841d6048c49379b967d9e39d367650dd32fbbc"} Dec 06 06:46:54 crc kubenswrapper[4733]: I1206 06:46:54.317158 4733 generic.go:334] "Generic (PLEG): container finished" podID="cb08e0c1-0355-47c6-be10-cbd039ded687" containerID="1402b42ec61a41cddeef6e17f6841d6048c49379b967d9e39d367650dd32fbbc" exitCode=0 Dec 06 06:46:54 crc kubenswrapper[4733]: I1206 06:46:54.317223 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8tpxm" event={"ID":"cb08e0c1-0355-47c6-be10-cbd039ded687","Type":"ContainerDied","Data":"1402b42ec61a41cddeef6e17f6841d6048c49379b967d9e39d367650dd32fbbc"} Dec 06 06:46:55 crc kubenswrapper[4733]: I1206 06:46:55.327514 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8tpxm" event={"ID":"cb08e0c1-0355-47c6-be10-cbd039ded687","Type":"ContainerStarted","Data":"f67efbd834713d775e9b24d34fc10c44746dea7e50fb42e0c230a717a2bff17a"} Dec 06 06:46:55 crc kubenswrapper[4733]: I1206 06:46:55.346166 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8tpxm" podStartSLOduration=2.887399736 podStartE2EDuration="5.346155083s" podCreationTimestamp="2025-12-06 06:46:50 +0000 UTC" firstStartedPulling="2025-12-06 06:46:52.29243824 +0000 UTC 
m=+3796.157649351" lastFinishedPulling="2025-12-06 06:46:54.751193587 +0000 UTC m=+3798.616404698" observedRunningTime="2025-12-06 06:46:55.342583297 +0000 UTC m=+3799.207794408" watchObservedRunningTime="2025-12-06 06:46:55.346155083 +0000 UTC m=+3799.211366194" Dec 06 06:47:01 crc kubenswrapper[4733]: I1206 06:47:01.068012 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8tpxm" Dec 06 06:47:01 crc kubenswrapper[4733]: I1206 06:47:01.068376 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8tpxm" Dec 06 06:47:01 crc kubenswrapper[4733]: I1206 06:47:01.105610 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8tpxm" Dec 06 06:47:01 crc kubenswrapper[4733]: I1206 06:47:01.407861 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8tpxm" Dec 06 06:47:01 crc kubenswrapper[4733]: I1206 06:47:01.455882 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8tpxm"] Dec 06 06:47:02 crc kubenswrapper[4733]: I1206 06:47:02.708847 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwp2wx_1537e842-7b52-4886-81ea-989848fc3407/util/0.log" Dec 06 06:47:02 crc kubenswrapper[4733]: I1206 06:47:02.891466 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwp2wx_1537e842-7b52-4886-81ea-989848fc3407/util/0.log" Dec 06 06:47:02 crc kubenswrapper[4733]: I1206 06:47:02.919098 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwp2wx_1537e842-7b52-4886-81ea-989848fc3407/pull/0.log" Dec 06 06:47:02 
crc kubenswrapper[4733]: I1206 06:47:02.942404 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwp2wx_1537e842-7b52-4886-81ea-989848fc3407/pull/0.log" Dec 06 06:47:03 crc kubenswrapper[4733]: I1206 06:47:03.052455 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwp2wx_1537e842-7b52-4886-81ea-989848fc3407/util/0.log" Dec 06 06:47:03 crc kubenswrapper[4733]: I1206 06:47:03.080828 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwp2wx_1537e842-7b52-4886-81ea-989848fc3407/pull/0.log" Dec 06 06:47:03 crc kubenswrapper[4733]: I1206 06:47:03.093656 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwp2wx_1537e842-7b52-4886-81ea-989848fc3407/extract/0.log" Dec 06 06:47:03 crc kubenswrapper[4733]: I1206 06:47:03.202860 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wbs6g_cbcb89dd-b5ba-4b72-9a34-24048c6b7275/util/0.log" Dec 06 06:47:03 crc kubenswrapper[4733]: I1206 06:47:03.350579 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wbs6g_cbcb89dd-b5ba-4b72-9a34-24048c6b7275/util/0.log" Dec 06 06:47:03 crc kubenswrapper[4733]: I1206 06:47:03.353357 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wbs6g_cbcb89dd-b5ba-4b72-9a34-24048c6b7275/pull/0.log" Dec 06 06:47:03 crc kubenswrapper[4733]: I1206 06:47:03.385734 4733 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/certified-operators-8tpxm" podUID="cb08e0c1-0355-47c6-be10-cbd039ded687" containerName="registry-server" containerID="cri-o://f67efbd834713d775e9b24d34fc10c44746dea7e50fb42e0c230a717a2bff17a" gracePeriod=2 Dec 06 06:47:03 crc kubenswrapper[4733]: I1206 06:47:03.395927 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wbs6g_cbcb89dd-b5ba-4b72-9a34-24048c6b7275/pull/0.log" Dec 06 06:47:03 crc kubenswrapper[4733]: I1206 06:47:03.535388 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wbs6g_cbcb89dd-b5ba-4b72-9a34-24048c6b7275/pull/0.log" Dec 06 06:47:03 crc kubenswrapper[4733]: I1206 06:47:03.558221 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wbs6g_cbcb89dd-b5ba-4b72-9a34-24048c6b7275/extract/0.log" Dec 06 06:47:03 crc kubenswrapper[4733]: I1206 06:47:03.590243 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wbs6g_cbcb89dd-b5ba-4b72-9a34-24048c6b7275/util/0.log" Dec 06 06:47:03 crc kubenswrapper[4733]: I1206 06:47:03.748043 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8tpxm_cb08e0c1-0355-47c6-be10-cbd039ded687/extract-utilities/0.log" Dec 06 06:47:03 crc kubenswrapper[4733]: I1206 06:47:03.783100 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8tpxm" Dec 06 06:47:03 crc kubenswrapper[4733]: I1206 06:47:03.880995 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb08e0c1-0355-47c6-be10-cbd039ded687-catalog-content\") pod \"cb08e0c1-0355-47c6-be10-cbd039ded687\" (UID: \"cb08e0c1-0355-47c6-be10-cbd039ded687\") " Dec 06 06:47:03 crc kubenswrapper[4733]: I1206 06:47:03.881252 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb08e0c1-0355-47c6-be10-cbd039ded687-utilities\") pod \"cb08e0c1-0355-47c6-be10-cbd039ded687\" (UID: \"cb08e0c1-0355-47c6-be10-cbd039ded687\") " Dec 06 06:47:03 crc kubenswrapper[4733]: I1206 06:47:03.881277 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5pvl\" (UniqueName: \"kubernetes.io/projected/cb08e0c1-0355-47c6-be10-cbd039ded687-kube-api-access-v5pvl\") pod \"cb08e0c1-0355-47c6-be10-cbd039ded687\" (UID: \"cb08e0c1-0355-47c6-be10-cbd039ded687\") " Dec 06 06:47:03 crc kubenswrapper[4733]: I1206 06:47:03.882067 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb08e0c1-0355-47c6-be10-cbd039ded687-utilities" (OuterVolumeSpecName: "utilities") pod "cb08e0c1-0355-47c6-be10-cbd039ded687" (UID: "cb08e0c1-0355-47c6-be10-cbd039ded687"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:47:03 crc kubenswrapper[4733]: I1206 06:47:03.885699 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8tpxm_cb08e0c1-0355-47c6-be10-cbd039ded687/extract-utilities/0.log" Dec 06 06:47:03 crc kubenswrapper[4733]: I1206 06:47:03.887373 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb08e0c1-0355-47c6-be10-cbd039ded687-kube-api-access-v5pvl" (OuterVolumeSpecName: "kube-api-access-v5pvl") pod "cb08e0c1-0355-47c6-be10-cbd039ded687" (UID: "cb08e0c1-0355-47c6-be10-cbd039ded687"). InnerVolumeSpecName "kube-api-access-v5pvl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:47:03 crc kubenswrapper[4733]: I1206 06:47:03.907614 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8tpxm_cb08e0c1-0355-47c6-be10-cbd039ded687/extract-content/0.log" Dec 06 06:47:03 crc kubenswrapper[4733]: I1206 06:47:03.915652 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8tpxm_cb08e0c1-0355-47c6-be10-cbd039ded687/extract-content/0.log" Dec 06 06:47:03 crc kubenswrapper[4733]: I1206 06:47:03.984538 4733 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb08e0c1-0355-47c6-be10-cbd039ded687-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 06:47:03 crc kubenswrapper[4733]: I1206 06:47:03.984576 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5pvl\" (UniqueName: \"kubernetes.io/projected/cb08e0c1-0355-47c6-be10-cbd039ded687-kube-api-access-v5pvl\") on node \"crc\" DevicePath \"\"" Dec 06 06:47:04 crc kubenswrapper[4733]: I1206 06:47:04.021254 4733 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-8tpxm_cb08e0c1-0355-47c6-be10-cbd039ded687/extract-utilities/0.log" Dec 06 06:47:04 crc kubenswrapper[4733]: I1206 06:47:04.036359 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb08e0c1-0355-47c6-be10-cbd039ded687-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cb08e0c1-0355-47c6-be10-cbd039ded687" (UID: "cb08e0c1-0355-47c6-be10-cbd039ded687"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:47:04 crc kubenswrapper[4733]: I1206 06:47:04.041697 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8tpxm_cb08e0c1-0355-47c6-be10-cbd039ded687/extract-content/0.log" Dec 06 06:47:04 crc kubenswrapper[4733]: I1206 06:47:04.067270 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8tpxm_cb08e0c1-0355-47c6-be10-cbd039ded687/registry-server/0.log" Dec 06 06:47:04 crc kubenswrapper[4733]: I1206 06:47:04.085147 4733 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb08e0c1-0355-47c6-be10-cbd039ded687-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 06:47:04 crc kubenswrapper[4733]: I1206 06:47:04.174335 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-g8pgx_db6bf699-eb17-41f2-a2be-e30f7a341840/extract-utilities/0.log" Dec 06 06:47:04 crc kubenswrapper[4733]: I1206 06:47:04.340058 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-g8pgx_db6bf699-eb17-41f2-a2be-e30f7a341840/extract-content/0.log" Dec 06 06:47:04 crc kubenswrapper[4733]: I1206 06:47:04.344583 4733 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-g8pgx_db6bf699-eb17-41f2-a2be-e30f7a341840/extract-content/0.log" Dec 06 06:47:04 crc kubenswrapper[4733]: I1206 06:47:04.356630 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-g8pgx_db6bf699-eb17-41f2-a2be-e30f7a341840/extract-utilities/0.log" Dec 06 06:47:04 crc kubenswrapper[4733]: I1206 06:47:04.395330 4733 generic.go:334] "Generic (PLEG): container finished" podID="cb08e0c1-0355-47c6-be10-cbd039ded687" containerID="f67efbd834713d775e9b24d34fc10c44746dea7e50fb42e0c230a717a2bff17a" exitCode=0 Dec 06 06:47:04 crc kubenswrapper[4733]: I1206 06:47:04.395373 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8tpxm" event={"ID":"cb08e0c1-0355-47c6-be10-cbd039ded687","Type":"ContainerDied","Data":"f67efbd834713d775e9b24d34fc10c44746dea7e50fb42e0c230a717a2bff17a"} Dec 06 06:47:04 crc kubenswrapper[4733]: I1206 06:47:04.395406 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8tpxm" event={"ID":"cb08e0c1-0355-47c6-be10-cbd039ded687","Type":"ContainerDied","Data":"9ccad64ab8932cc2e5611e0131350b52000f6a0969325b39f02a6042b7a3df5f"} Dec 06 06:47:04 crc kubenswrapper[4733]: I1206 06:47:04.395426 4733 scope.go:117] "RemoveContainer" containerID="f67efbd834713d775e9b24d34fc10c44746dea7e50fb42e0c230a717a2bff17a" Dec 06 06:47:04 crc kubenswrapper[4733]: I1206 06:47:04.395558 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8tpxm" Dec 06 06:47:04 crc kubenswrapper[4733]: I1206 06:47:04.418295 4733 scope.go:117] "RemoveContainer" containerID="1402b42ec61a41cddeef6e17f6841d6048c49379b967d9e39d367650dd32fbbc" Dec 06 06:47:04 crc kubenswrapper[4733]: I1206 06:47:04.427891 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8tpxm"] Dec 06 06:47:04 crc kubenswrapper[4733]: I1206 06:47:04.442093 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8tpxm"] Dec 06 06:47:04 crc kubenswrapper[4733]: I1206 06:47:04.449932 4733 scope.go:117] "RemoveContainer" containerID="6498d3fbf5ebe2b3f5e8aa1c07259442c29c9473d925b68eee9b99710f197f29" Dec 06 06:47:04 crc kubenswrapper[4733]: I1206 06:47:04.472256 4733 scope.go:117] "RemoveContainer" containerID="f67efbd834713d775e9b24d34fc10c44746dea7e50fb42e0c230a717a2bff17a" Dec 06 06:47:04 crc kubenswrapper[4733]: E1206 06:47:04.472777 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f67efbd834713d775e9b24d34fc10c44746dea7e50fb42e0c230a717a2bff17a\": container with ID starting with f67efbd834713d775e9b24d34fc10c44746dea7e50fb42e0c230a717a2bff17a not found: ID does not exist" containerID="f67efbd834713d775e9b24d34fc10c44746dea7e50fb42e0c230a717a2bff17a" Dec 06 06:47:04 crc kubenswrapper[4733]: I1206 06:47:04.472818 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f67efbd834713d775e9b24d34fc10c44746dea7e50fb42e0c230a717a2bff17a"} err="failed to get container status \"f67efbd834713d775e9b24d34fc10c44746dea7e50fb42e0c230a717a2bff17a\": rpc error: code = NotFound desc = could not find container \"f67efbd834713d775e9b24d34fc10c44746dea7e50fb42e0c230a717a2bff17a\": container with ID starting with f67efbd834713d775e9b24d34fc10c44746dea7e50fb42e0c230a717a2bff17a not 
found: ID does not exist" Dec 06 06:47:04 crc kubenswrapper[4733]: I1206 06:47:04.472846 4733 scope.go:117] "RemoveContainer" containerID="1402b42ec61a41cddeef6e17f6841d6048c49379b967d9e39d367650dd32fbbc" Dec 06 06:47:04 crc kubenswrapper[4733]: E1206 06:47:04.473164 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1402b42ec61a41cddeef6e17f6841d6048c49379b967d9e39d367650dd32fbbc\": container with ID starting with 1402b42ec61a41cddeef6e17f6841d6048c49379b967d9e39d367650dd32fbbc not found: ID does not exist" containerID="1402b42ec61a41cddeef6e17f6841d6048c49379b967d9e39d367650dd32fbbc" Dec 06 06:47:04 crc kubenswrapper[4733]: I1206 06:47:04.473200 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1402b42ec61a41cddeef6e17f6841d6048c49379b967d9e39d367650dd32fbbc"} err="failed to get container status \"1402b42ec61a41cddeef6e17f6841d6048c49379b967d9e39d367650dd32fbbc\": rpc error: code = NotFound desc = could not find container \"1402b42ec61a41cddeef6e17f6841d6048c49379b967d9e39d367650dd32fbbc\": container with ID starting with 1402b42ec61a41cddeef6e17f6841d6048c49379b967d9e39d367650dd32fbbc not found: ID does not exist" Dec 06 06:47:04 crc kubenswrapper[4733]: I1206 06:47:04.473224 4733 scope.go:117] "RemoveContainer" containerID="6498d3fbf5ebe2b3f5e8aa1c07259442c29c9473d925b68eee9b99710f197f29" Dec 06 06:47:04 crc kubenswrapper[4733]: E1206 06:47:04.473503 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6498d3fbf5ebe2b3f5e8aa1c07259442c29c9473d925b68eee9b99710f197f29\": container with ID starting with 6498d3fbf5ebe2b3f5e8aa1c07259442c29c9473d925b68eee9b99710f197f29 not found: ID does not exist" containerID="6498d3fbf5ebe2b3f5e8aa1c07259442c29c9473d925b68eee9b99710f197f29" Dec 06 06:47:04 crc kubenswrapper[4733]: I1206 06:47:04.473526 4733 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6498d3fbf5ebe2b3f5e8aa1c07259442c29c9473d925b68eee9b99710f197f29"} err="failed to get container status \"6498d3fbf5ebe2b3f5e8aa1c07259442c29c9473d925b68eee9b99710f197f29\": rpc error: code = NotFound desc = could not find container \"6498d3fbf5ebe2b3f5e8aa1c07259442c29c9473d925b68eee9b99710f197f29\": container with ID starting with 6498d3fbf5ebe2b3f5e8aa1c07259442c29c9473d925b68eee9b99710f197f29 not found: ID does not exist"
Dec 06 06:47:04 crc kubenswrapper[4733]: I1206 06:47:04.492590 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb08e0c1-0355-47c6-be10-cbd039ded687" path="/var/lib/kubelet/pods/cb08e0c1-0355-47c6-be10-cbd039ded687/volumes"
Dec 06 06:47:04 crc kubenswrapper[4733]: I1206 06:47:04.548555 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-g8pgx_db6bf699-eb17-41f2-a2be-e30f7a341840/extract-utilities/0.log"
Dec 06 06:47:04 crc kubenswrapper[4733]: I1206 06:47:04.558042 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-g8pgx_db6bf699-eb17-41f2-a2be-e30f7a341840/extract-content/0.log"
Dec 06 06:47:04 crc kubenswrapper[4733]: I1206 06:47:04.741298 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nrhdv_a08a3e05-bf85-4e28-bbe1-9a9675b9efd9/extract-utilities/0.log"
Dec 06 06:47:04 crc kubenswrapper[4733]: I1206 06:47:04.873830 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-g8pgx_db6bf699-eb17-41f2-a2be-e30f7a341840/registry-server/0.log"
Dec 06 06:47:04 crc kubenswrapper[4733]: I1206 06:47:04.912109 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nrhdv_a08a3e05-bf85-4e28-bbe1-9a9675b9efd9/extract-utilities/0.log"
Dec 06 06:47:04 crc kubenswrapper[4733]: I1206 06:47:04.929545 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nrhdv_a08a3e05-bf85-4e28-bbe1-9a9675b9efd9/extract-content/0.log"
Dec 06 06:47:04 crc kubenswrapper[4733]: I1206 06:47:04.970588 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nrhdv_a08a3e05-bf85-4e28-bbe1-9a9675b9efd9/extract-content/0.log"
Dec 06 06:47:05 crc kubenswrapper[4733]: I1206 06:47:05.065292 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nrhdv_a08a3e05-bf85-4e28-bbe1-9a9675b9efd9/extract-utilities/0.log"
Dec 06 06:47:05 crc kubenswrapper[4733]: I1206 06:47:05.137349 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nrhdv_a08a3e05-bf85-4e28-bbe1-9a9675b9efd9/extract-content/0.log"
Dec 06 06:47:05 crc kubenswrapper[4733]: I1206 06:47:05.286747 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-xq44c_477154e1-6166-41c9-beb3-1248e1583324/marketplace-operator/3.log"
Dec 06 06:47:05 crc kubenswrapper[4733]: I1206 06:47:05.321323 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-xq44c_477154e1-6166-41c9-beb3-1248e1583324/marketplace-operator/2.log"
Dec 06 06:47:05 crc kubenswrapper[4733]: I1206 06:47:05.450665 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nrhdv_a08a3e05-bf85-4e28-bbe1-9a9675b9efd9/registry-server/0.log"
Dec 06 06:47:05 crc kubenswrapper[4733]: I1206 06:47:05.479470 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-76k6r_87626d39-c79f-487c-819c-95eec3d5d5a3/extract-utilities/0.log"
Dec 06 06:47:05 crc kubenswrapper[4733]: I1206 06:47:05.632799 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-76k6r_87626d39-c79f-487c-819c-95eec3d5d5a3/extract-utilities/0.log"
Dec 06 06:47:05 crc kubenswrapper[4733]: I1206 06:47:05.674604 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-76k6r_87626d39-c79f-487c-819c-95eec3d5d5a3/extract-content/0.log"
Dec 06 06:47:05 crc kubenswrapper[4733]: I1206 06:47:05.678591 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-76k6r_87626d39-c79f-487c-819c-95eec3d5d5a3/extract-content/0.log"
Dec 06 06:47:05 crc kubenswrapper[4733]: I1206 06:47:05.847804 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-76k6r_87626d39-c79f-487c-819c-95eec3d5d5a3/extract-content/0.log"
Dec 06 06:47:05 crc kubenswrapper[4733]: I1206 06:47:05.863201 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-76k6r_87626d39-c79f-487c-819c-95eec3d5d5a3/extract-utilities/0.log"
Dec 06 06:47:05 crc kubenswrapper[4733]: I1206 06:47:05.920252 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tzhq7_f4be7c3a-dabf-4f6d-8488-17f680198610/extract-utilities/0.log"
Dec 06 06:47:05 crc kubenswrapper[4733]: I1206 06:47:05.978262 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-76k6r_87626d39-c79f-487c-819c-95eec3d5d5a3/registry-server/0.log"
Dec 06 06:47:06 crc kubenswrapper[4733]: I1206 06:47:06.067672 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tzhq7_f4be7c3a-dabf-4f6d-8488-17f680198610/extract-utilities/0.log"
Dec 06 06:47:06 crc kubenswrapper[4733]: I1206 06:47:06.072059 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tzhq7_f4be7c3a-dabf-4f6d-8488-17f680198610/extract-content/0.log"
Dec 06 06:47:06 crc kubenswrapper[4733]: I1206 06:47:06.090143 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tzhq7_f4be7c3a-dabf-4f6d-8488-17f680198610/extract-content/0.log"
Dec 06 06:47:06 crc kubenswrapper[4733]: I1206 06:47:06.219433 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tzhq7_f4be7c3a-dabf-4f6d-8488-17f680198610/extract-utilities/0.log"
Dec 06 06:47:06 crc kubenswrapper[4733]: I1206 06:47:06.251349 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tzhq7_f4be7c3a-dabf-4f6d-8488-17f680198610/extract-content/0.log"
Dec 06 06:47:06 crc kubenswrapper[4733]: I1206 06:47:06.646528 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tzhq7_f4be7c3a-dabf-4f6d-8488-17f680198610/registry-server/0.log"
Dec 06 06:47:12 crc kubenswrapper[4733]: I1206 06:47:12.989051 4733 patch_prober.go:28] interesting pod/machine-config-daemon-g7qjx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 06 06:47:12 crc kubenswrapper[4733]: I1206 06:47:12.989400 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 06 06:47:42 crc kubenswrapper[4733]: I1206 06:47:42.989036 4733 patch_prober.go:28] interesting pod/machine-config-daemon-g7qjx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 06 06:47:42 crc kubenswrapper[4733]: I1206 06:47:42.989991 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 06 06:47:42 crc kubenswrapper[4733]: I1206 06:47:42.990063 4733 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx"
Dec 06 06:47:42 crc kubenswrapper[4733]: I1206 06:47:42.991243 4733 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f47cb59a321dfcb0efc8c6909ad3380200d5a7a5ab1d87ad8d40fc651e322820"} pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 06 06:47:42 crc kubenswrapper[4733]: I1206 06:47:42.991316 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerName="machine-config-daemon" containerID="cri-o://f47cb59a321dfcb0efc8c6909ad3380200d5a7a5ab1d87ad8d40fc651e322820" gracePeriod=600
Dec 06 06:47:43 crc kubenswrapper[4733]: E1206 06:47:43.048583 4733 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9ab6d12_6a30_4bf0_a5a1_5a661b82f448.slice/crio-conmon-f47cb59a321dfcb0efc8c6909ad3380200d5a7a5ab1d87ad8d40fc651e322820.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9ab6d12_6a30_4bf0_a5a1_5a661b82f448.slice/crio-f47cb59a321dfcb0efc8c6909ad3380200d5a7a5ab1d87ad8d40fc651e322820.scope\": RecentStats: unable to find data in memory cache]"
Dec 06 06:47:43 crc kubenswrapper[4733]: E1206 06:47:43.114682 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448"
Dec 06 06:47:43 crc kubenswrapper[4733]: I1206 06:47:43.764263 4733 generic.go:334] "Generic (PLEG): container finished" podID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" containerID="f47cb59a321dfcb0efc8c6909ad3380200d5a7a5ab1d87ad8d40fc651e322820" exitCode=0
Dec 06 06:47:43 crc kubenswrapper[4733]: I1206 06:47:43.764352 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" event={"ID":"b9ab6d12-6a30-4bf0-a5a1-5a661b82f448","Type":"ContainerDied","Data":"f47cb59a321dfcb0efc8c6909ad3380200d5a7a5ab1d87ad8d40fc651e322820"}
Dec 06 06:47:43 crc kubenswrapper[4733]: I1206 06:47:43.764406 4733 scope.go:117] "RemoveContainer" containerID="43f30aa5837a5bee8e7ecac27f1d786b0dee6b7686b928f4dd37a1272c540bbf"
Dec 06 06:47:43 crc kubenswrapper[4733]: I1206 06:47:43.765204 4733 scope.go:117] "RemoveContainer" containerID="f47cb59a321dfcb0efc8c6909ad3380200d5a7a5ab1d87ad8d40fc651e322820"
Dec 06 06:47:43 crc kubenswrapper[4733]: E1206 06:47:43.765568 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448"
Dec 06 06:47:53 crc kubenswrapper[4733]: I1206 06:47:53.391103 4733 scope.go:117] "RemoveContainer" containerID="e6f71074fc3f4f61c10810c3c4c1d8bcacbd18086dca1a1e12cf29ce7b768574"
Dec 06 06:47:53 crc kubenswrapper[4733]: I1206 06:47:53.431555 4733 scope.go:117] "RemoveContainer" containerID="168f5f31f6da5a5d9956b2df046d81cc09f14b700f13214e342d5cfbb397133f"
Dec 06 06:47:53 crc kubenswrapper[4733]: I1206 06:47:53.460207 4733 scope.go:117] "RemoveContainer" containerID="965e782160691f6e533c2d9c6d8137c71f97a32b98ddba0445795f569675c628"
Dec 06 06:47:58 crc kubenswrapper[4733]: I1206 06:47:58.484693 4733 scope.go:117] "RemoveContainer" containerID="f47cb59a321dfcb0efc8c6909ad3380200d5a7a5ab1d87ad8d40fc651e322820"
Dec 06 06:47:58 crc kubenswrapper[4733]: E1206 06:47:58.485372 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448"
Dec 06 06:48:12 crc kubenswrapper[4733]: I1206 06:48:12.488902 4733 scope.go:117] "RemoveContainer" containerID="f47cb59a321dfcb0efc8c6909ad3380200d5a7a5ab1d87ad8d40fc651e322820"
Dec 06 06:48:12 crc kubenswrapper[4733]: E1206 06:48:12.489910 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448"
Dec 06 06:48:27 crc kubenswrapper[4733]: I1206 06:48:27.486552 4733 scope.go:117] "RemoveContainer" containerID="f47cb59a321dfcb0efc8c6909ad3380200d5a7a5ab1d87ad8d40fc651e322820"
Dec 06 06:48:27 crc kubenswrapper[4733]: E1206 06:48:27.487439 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448"
Dec 06 06:48:32 crc kubenswrapper[4733]: I1206 06:48:32.207267 4733 generic.go:334] "Generic (PLEG): container finished" podID="c91fb140-5b4b-4e6a-bc61-a5d5bdbe0317" containerID="e4f8730860ebf8435f05aa060d12bd314597650a36a34ec48eb17c6769d4345a" exitCode=0
Dec 06 06:48:32 crc kubenswrapper[4733]: I1206 06:48:32.207411 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7vm9c/must-gather-77jfp" event={"ID":"c91fb140-5b4b-4e6a-bc61-a5d5bdbe0317","Type":"ContainerDied","Data":"e4f8730860ebf8435f05aa060d12bd314597650a36a34ec48eb17c6769d4345a"}
Dec 06 06:48:32 crc kubenswrapper[4733]: I1206 06:48:32.208832 4733 scope.go:117] "RemoveContainer" containerID="e4f8730860ebf8435f05aa060d12bd314597650a36a34ec48eb17c6769d4345a"
Dec 06 06:48:33 crc kubenswrapper[4733]: I1206 06:48:33.144115 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7vm9c_must-gather-77jfp_c91fb140-5b4b-4e6a-bc61-a5d5bdbe0317/gather/0.log"
Dec 06 06:48:40 crc kubenswrapper[4733]: I1206 06:48:40.485129 4733 scope.go:117] "RemoveContainer" containerID="f47cb59a321dfcb0efc8c6909ad3380200d5a7a5ab1d87ad8d40fc651e322820"
Dec 06 06:48:40 crc kubenswrapper[4733]: E1206 06:48:40.485937 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448"
Dec 06 06:48:43 crc kubenswrapper[4733]: I1206 06:48:43.452353 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7vm9c/must-gather-77jfp"]
Dec 06 06:48:43 crc kubenswrapper[4733]: I1206 06:48:43.453036 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-7vm9c/must-gather-77jfp" podUID="c91fb140-5b4b-4e6a-bc61-a5d5bdbe0317" containerName="copy" containerID="cri-o://1d18abb4f8f8894105554956d8c5d00344fb7a4f424c255f53d76e43184bd33b" gracePeriod=2
Dec 06 06:48:43 crc kubenswrapper[4733]: I1206 06:48:43.458168 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7vm9c/must-gather-77jfp"]
Dec 06 06:48:44 crc kubenswrapper[4733]: I1206 06:48:44.284154 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7vm9c_must-gather-77jfp_c91fb140-5b4b-4e6a-bc61-a5d5bdbe0317/copy/0.log"
Dec 06 06:48:44 crc kubenswrapper[4733]: I1206 06:48:44.285926 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7vm9c/must-gather-77jfp"
Dec 06 06:48:44 crc kubenswrapper[4733]: I1206 06:48:44.312414 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7vm9c_must-gather-77jfp_c91fb140-5b4b-4e6a-bc61-a5d5bdbe0317/copy/0.log"
Dec 06 06:48:44 crc kubenswrapper[4733]: I1206 06:48:44.312991 4733 generic.go:334] "Generic (PLEG): container finished" podID="c91fb140-5b4b-4e6a-bc61-a5d5bdbe0317" containerID="1d18abb4f8f8894105554956d8c5d00344fb7a4f424c255f53d76e43184bd33b" exitCode=143
Dec 06 06:48:44 crc kubenswrapper[4733]: I1206 06:48:44.313044 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7vm9c/must-gather-77jfp"
Dec 06 06:48:44 crc kubenswrapper[4733]: I1206 06:48:44.313055 4733 scope.go:117] "RemoveContainer" containerID="1d18abb4f8f8894105554956d8c5d00344fb7a4f424c255f53d76e43184bd33b"
Dec 06 06:48:44 crc kubenswrapper[4733]: I1206 06:48:44.337830 4733 scope.go:117] "RemoveContainer" containerID="e4f8730860ebf8435f05aa060d12bd314597650a36a34ec48eb17c6769d4345a"
Dec 06 06:48:44 crc kubenswrapper[4733]: I1206 06:48:44.393135 4733 scope.go:117] "RemoveContainer" containerID="1d18abb4f8f8894105554956d8c5d00344fb7a4f424c255f53d76e43184bd33b"
Dec 06 06:48:44 crc kubenswrapper[4733]: E1206 06:48:44.393646 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d18abb4f8f8894105554956d8c5d00344fb7a4f424c255f53d76e43184bd33b\": container with ID starting with 1d18abb4f8f8894105554956d8c5d00344fb7a4f424c255f53d76e43184bd33b not found: ID does not exist" containerID="1d18abb4f8f8894105554956d8c5d00344fb7a4f424c255f53d76e43184bd33b"
Dec 06 06:48:44 crc kubenswrapper[4733]: I1206 06:48:44.393779 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d18abb4f8f8894105554956d8c5d00344fb7a4f424c255f53d76e43184bd33b"} err="failed to get container status \"1d18abb4f8f8894105554956d8c5d00344fb7a4f424c255f53d76e43184bd33b\": rpc error: code = NotFound desc = could not find container \"1d18abb4f8f8894105554956d8c5d00344fb7a4f424c255f53d76e43184bd33b\": container with ID starting with 1d18abb4f8f8894105554956d8c5d00344fb7a4f424c255f53d76e43184bd33b not found: ID does not exist"
Dec 06 06:48:44 crc kubenswrapper[4733]: I1206 06:48:44.393812 4733 scope.go:117] "RemoveContainer" containerID="e4f8730860ebf8435f05aa060d12bd314597650a36a34ec48eb17c6769d4345a"
Dec 06 06:48:44 crc kubenswrapper[4733]: E1206 06:48:44.394173 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4f8730860ebf8435f05aa060d12bd314597650a36a34ec48eb17c6769d4345a\": container with ID starting with e4f8730860ebf8435f05aa060d12bd314597650a36a34ec48eb17c6769d4345a not found: ID does not exist" containerID="e4f8730860ebf8435f05aa060d12bd314597650a36a34ec48eb17c6769d4345a"
Dec 06 06:48:44 crc kubenswrapper[4733]: I1206 06:48:44.394214 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4f8730860ebf8435f05aa060d12bd314597650a36a34ec48eb17c6769d4345a"} err="failed to get container status \"e4f8730860ebf8435f05aa060d12bd314597650a36a34ec48eb17c6769d4345a\": rpc error: code = NotFound desc = could not find container \"e4f8730860ebf8435f05aa060d12bd314597650a36a34ec48eb17c6769d4345a\": container with ID starting with e4f8730860ebf8435f05aa060d12bd314597650a36a34ec48eb17c6769d4345a not found: ID does not exist"
Dec 06 06:48:44 crc kubenswrapper[4733]: I1206 06:48:44.474500 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c91fb140-5b4b-4e6a-bc61-a5d5bdbe0317-must-gather-output\") pod \"c91fb140-5b4b-4e6a-bc61-a5d5bdbe0317\" (UID: \"c91fb140-5b4b-4e6a-bc61-a5d5bdbe0317\") "
Dec 06 06:48:44 crc kubenswrapper[4733]: I1206 06:48:44.474708 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2kgv\" (UniqueName: \"kubernetes.io/projected/c91fb140-5b4b-4e6a-bc61-a5d5bdbe0317-kube-api-access-x2kgv\") pod \"c91fb140-5b4b-4e6a-bc61-a5d5bdbe0317\" (UID: \"c91fb140-5b4b-4e6a-bc61-a5d5bdbe0317\") "
Dec 06 06:48:44 crc kubenswrapper[4733]: I1206 06:48:44.480143 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c91fb140-5b4b-4e6a-bc61-a5d5bdbe0317-kube-api-access-x2kgv" (OuterVolumeSpecName: "kube-api-access-x2kgv") pod "c91fb140-5b4b-4e6a-bc61-a5d5bdbe0317" (UID: "c91fb140-5b4b-4e6a-bc61-a5d5bdbe0317"). InnerVolumeSpecName "kube-api-access-x2kgv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 06:48:44 crc kubenswrapper[4733]: I1206 06:48:44.577747 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2kgv\" (UniqueName: \"kubernetes.io/projected/c91fb140-5b4b-4e6a-bc61-a5d5bdbe0317-kube-api-access-x2kgv\") on node \"crc\" DevicePath \"\""
Dec 06 06:48:44 crc kubenswrapper[4733]: I1206 06:48:44.587785 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c91fb140-5b4b-4e6a-bc61-a5d5bdbe0317-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "c91fb140-5b4b-4e6a-bc61-a5d5bdbe0317" (UID: "c91fb140-5b4b-4e6a-bc61-a5d5bdbe0317"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 06:48:44 crc kubenswrapper[4733]: I1206 06:48:44.679704 4733 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c91fb140-5b4b-4e6a-bc61-a5d5bdbe0317-must-gather-output\") on node \"crc\" DevicePath \"\""
Dec 06 06:48:46 crc kubenswrapper[4733]: I1206 06:48:46.496403 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c91fb140-5b4b-4e6a-bc61-a5d5bdbe0317" path="/var/lib/kubelet/pods/c91fb140-5b4b-4e6a-bc61-a5d5bdbe0317/volumes"
Dec 06 06:48:54 crc kubenswrapper[4733]: I1206 06:48:54.484606 4733 scope.go:117] "RemoveContainer" containerID="f47cb59a321dfcb0efc8c6909ad3380200d5a7a5ab1d87ad8d40fc651e322820"
Dec 06 06:48:54 crc kubenswrapper[4733]: E1206 06:48:54.485456 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448"
Dec 06 06:49:05 crc kubenswrapper[4733]: I1206 06:49:05.485262 4733 scope.go:117] "RemoveContainer" containerID="f47cb59a321dfcb0efc8c6909ad3380200d5a7a5ab1d87ad8d40fc651e322820"
Dec 06 06:49:05 crc kubenswrapper[4733]: E1206 06:49:05.486091 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448"
Dec 06 06:49:16 crc kubenswrapper[4733]: I1206 06:49:16.490790 4733 scope.go:117] "RemoveContainer" containerID="f47cb59a321dfcb0efc8c6909ad3380200d5a7a5ab1d87ad8d40fc651e322820"
Dec 06 06:49:16 crc kubenswrapper[4733]: E1206 06:49:16.491691 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448"
Dec 06 06:49:30 crc kubenswrapper[4733]: I1206 06:49:30.484707 4733 scope.go:117] "RemoveContainer" containerID="f47cb59a321dfcb0efc8c6909ad3380200d5a7a5ab1d87ad8d40fc651e322820"
Dec 06 06:49:30 crc kubenswrapper[4733]: E1206 06:49:30.485466 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448"
Dec 06 06:49:42 crc kubenswrapper[4733]: I1206 06:49:42.485200 4733 scope.go:117] "RemoveContainer" containerID="f47cb59a321dfcb0efc8c6909ad3380200d5a7a5ab1d87ad8d40fc651e322820"
Dec 06 06:49:42 crc kubenswrapper[4733]: E1206 06:49:42.485977 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448"
Dec 06 06:49:56 crc kubenswrapper[4733]: I1206 06:49:56.489501 4733 scope.go:117] "RemoveContainer" containerID="f47cb59a321dfcb0efc8c6909ad3380200d5a7a5ab1d87ad8d40fc651e322820"
Dec 06 06:49:56 crc kubenswrapper[4733]: E1206 06:49:56.490213 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448"
Dec 06 06:50:11 crc kubenswrapper[4733]: I1206 06:50:11.485342 4733 scope.go:117] "RemoveContainer" containerID="f47cb59a321dfcb0efc8c6909ad3380200d5a7a5ab1d87ad8d40fc651e322820"
Dec 06 06:50:11 crc kubenswrapper[4733]: E1206 06:50:11.486114 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448"
Dec 06 06:50:23 crc kubenswrapper[4733]: I1206 06:50:23.485070 4733 scope.go:117] "RemoveContainer" containerID="f47cb59a321dfcb0efc8c6909ad3380200d5a7a5ab1d87ad8d40fc651e322820"
Dec 06 06:50:23 crc kubenswrapper[4733]: E1206 06:50:23.486061 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448"
Dec 06 06:50:37 crc kubenswrapper[4733]: I1206 06:50:37.485030 4733 scope.go:117] "RemoveContainer" containerID="f47cb59a321dfcb0efc8c6909ad3380200d5a7a5ab1d87ad8d40fc651e322820"
Dec 06 06:50:37 crc kubenswrapper[4733]: E1206 06:50:37.486112 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448"
Dec 06 06:50:50 crc kubenswrapper[4733]: I1206 06:50:50.485704 4733 scope.go:117] "RemoveContainer" containerID="f47cb59a321dfcb0efc8c6909ad3380200d5a7a5ab1d87ad8d40fc651e322820"
Dec 06 06:50:50 crc kubenswrapper[4733]: E1206 06:50:50.486494 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448"
Dec 06 06:51:02 crc kubenswrapper[4733]: I1206 06:51:02.484992 4733 scope.go:117] "RemoveContainer" containerID="f47cb59a321dfcb0efc8c6909ad3380200d5a7a5ab1d87ad8d40fc651e322820"
Dec 06 06:51:02 crc kubenswrapper[4733]: E1206 06:51:02.485703 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448"
Dec 06 06:51:13 crc kubenswrapper[4733]: I1206 06:51:13.485738 4733 scope.go:117] "RemoveContainer" containerID="f47cb59a321dfcb0efc8c6909ad3380200d5a7a5ab1d87ad8d40fc651e322820"
Dec 06 06:51:13 crc kubenswrapper[4733]: E1206 06:51:13.486368 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448"
Dec 06 06:51:27 crc kubenswrapper[4733]: I1206 06:51:27.484919 4733 scope.go:117] "RemoveContainer" containerID="f47cb59a321dfcb0efc8c6909ad3380200d5a7a5ab1d87ad8d40fc651e322820"
Dec 06 06:51:27 crc kubenswrapper[4733]: E1206 06:51:27.486788 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448"
Dec 06 06:51:38 crc kubenswrapper[4733]: I1206 06:51:38.484956 4733 scope.go:117] "RemoveContainer" containerID="f47cb59a321dfcb0efc8c6909ad3380200d5a7a5ab1d87ad8d40fc651e322820"
Dec 06 06:51:38 crc kubenswrapper[4733]: E1206 06:51:38.485622 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448"
Dec 06 06:51:50 crc kubenswrapper[4733]: I1206 06:51:50.484796 4733 scope.go:117] "RemoveContainer" containerID="f47cb59a321dfcb0efc8c6909ad3380200d5a7a5ab1d87ad8d40fc651e322820"
Dec 06 06:51:50 crc kubenswrapper[4733]: E1206 06:51:50.485500 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448"
Dec 06 06:52:02 crc kubenswrapper[4733]: I1206 06:52:02.484570 4733 scope.go:117] "RemoveContainer" containerID="f47cb59a321dfcb0efc8c6909ad3380200d5a7a5ab1d87ad8d40fc651e322820"
Dec 06 06:52:02 crc kubenswrapper[4733]: E1206 06:52:02.486332 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448"
Dec 06 06:52:09 crc kubenswrapper[4733]: I1206 06:52:09.140056 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2cfgt"]
Dec 06 06:52:09 crc kubenswrapper[4733]: E1206 06:52:09.140852 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb08e0c1-0355-47c6-be10-cbd039ded687" containerName="extract-utilities"
Dec 06 06:52:09 crc kubenswrapper[4733]: I1206 06:52:09.140868 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb08e0c1-0355-47c6-be10-cbd039ded687" containerName="extract-utilities"
Dec 06 06:52:09 crc kubenswrapper[4733]: E1206 06:52:09.140881 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c91fb140-5b4b-4e6a-bc61-a5d5bdbe0317" containerName="gather"
Dec 06 06:52:09 crc kubenswrapper[4733]: I1206 06:52:09.140887 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="c91fb140-5b4b-4e6a-bc61-a5d5bdbe0317" containerName="gather"
Dec 06 06:52:09 crc kubenswrapper[4733]: E1206 06:52:09.140907 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb08e0c1-0355-47c6-be10-cbd039ded687" containerName="extract-content"
Dec 06 06:52:09 crc kubenswrapper[4733]: I1206 06:52:09.140913 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb08e0c1-0355-47c6-be10-cbd039ded687" containerName="extract-content"
Dec 06 06:52:09 crc kubenswrapper[4733]: E1206 06:52:09.140934 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c91fb140-5b4b-4e6a-bc61-a5d5bdbe0317" containerName="copy"
Dec 06 06:52:09 crc kubenswrapper[4733]: I1206 06:52:09.140940 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="c91fb140-5b4b-4e6a-bc61-a5d5bdbe0317" containerName="copy"
Dec 06 06:52:09 crc kubenswrapper[4733]: E1206 06:52:09.140950 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb08e0c1-0355-47c6-be10-cbd039ded687" containerName="registry-server"
Dec 06 06:52:09 crc kubenswrapper[4733]: I1206 06:52:09.140955 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb08e0c1-0355-47c6-be10-cbd039ded687" containerName="registry-server"
Dec 06 06:52:09 crc kubenswrapper[4733]: I1206 06:52:09.141134 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb08e0c1-0355-47c6-be10-cbd039ded687" containerName="registry-server"
Dec 06 06:52:09 crc kubenswrapper[4733]: I1206 06:52:09.141160 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="c91fb140-5b4b-4e6a-bc61-a5d5bdbe0317" containerName="gather"
Dec 06 06:52:09 crc kubenswrapper[4733]: I1206 06:52:09.141174 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="c91fb140-5b4b-4e6a-bc61-a5d5bdbe0317" containerName="copy"
Dec 06 06:52:09 crc kubenswrapper[4733]: I1206 06:52:09.142468 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2cfgt"
Dec 06 06:52:09 crc kubenswrapper[4733]: I1206 06:52:09.150348 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2cfgt"]
Dec 06 06:52:09 crc kubenswrapper[4733]: I1206 06:52:09.187016 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5lsn\" (UniqueName: \"kubernetes.io/projected/959463fe-0a5b-4748-bee2-5d1a5d2b3a9e-kube-api-access-q5lsn\") pod \"community-operators-2cfgt\" (UID: \"959463fe-0a5b-4748-bee2-5d1a5d2b3a9e\") " pod="openshift-marketplace/community-operators-2cfgt"
Dec 06 06:52:09 crc kubenswrapper[4733]: I1206 06:52:09.187061 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/959463fe-0a5b-4748-bee2-5d1a5d2b3a9e-catalog-content\") pod \"community-operators-2cfgt\" (UID: \"959463fe-0a5b-4748-bee2-5d1a5d2b3a9e\") " pod="openshift-marketplace/community-operators-2cfgt"
Dec 06 06:52:09 crc kubenswrapper[4733]: I1206 06:52:09.187087 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/959463fe-0a5b-4748-bee2-5d1a5d2b3a9e-utilities\") pod \"community-operators-2cfgt\" (UID: \"959463fe-0a5b-4748-bee2-5d1a5d2b3a9e\") " pod="openshift-marketplace/community-operators-2cfgt"
Dec 06 06:52:09 crc kubenswrapper[4733]: I1206 06:52:09.288952 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/959463fe-0a5b-4748-bee2-5d1a5d2b3a9e-utilities\") pod \"community-operators-2cfgt\" (UID: \"959463fe-0a5b-4748-bee2-5d1a5d2b3a9e\") " pod="openshift-marketplace/community-operators-2cfgt"
Dec 06 06:52:09 crc kubenswrapper[4733]: I1206 06:52:09.289135 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5lsn\" (UniqueName: \"kubernetes.io/projected/959463fe-0a5b-4748-bee2-5d1a5d2b3a9e-kube-api-access-q5lsn\") pod \"community-operators-2cfgt\" (UID: \"959463fe-0a5b-4748-bee2-5d1a5d2b3a9e\") " pod="openshift-marketplace/community-operators-2cfgt"
Dec 06 06:52:09 crc kubenswrapper[4733]: I1206 06:52:09.289159 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/959463fe-0a5b-4748-bee2-5d1a5d2b3a9e-catalog-content\") pod \"community-operators-2cfgt\" (UID: \"959463fe-0a5b-4748-bee2-5d1a5d2b3a9e\") " pod="openshift-marketplace/community-operators-2cfgt"
Dec 06 06:52:09 crc kubenswrapper[4733]: I1206 06:52:09.289612 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/959463fe-0a5b-4748-bee2-5d1a5d2b3a9e-utilities\") pod \"community-operators-2cfgt\" (UID: \"959463fe-0a5b-4748-bee2-5d1a5d2b3a9e\") " pod="openshift-marketplace/community-operators-2cfgt"
Dec 06 06:52:09 crc kubenswrapper[4733]: I1206 06:52:09.289618 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/959463fe-0a5b-4748-bee2-5d1a5d2b3a9e-catalog-content\") pod \"community-operators-2cfgt\" (UID: \"959463fe-0a5b-4748-bee2-5d1a5d2b3a9e\") " pod="openshift-marketplace/community-operators-2cfgt"
Dec 06 06:52:09 crc kubenswrapper[4733]: I1206 06:52:09.307621 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5lsn\" (UniqueName: \"kubernetes.io/projected/959463fe-0a5b-4748-bee2-5d1a5d2b3a9e-kube-api-access-q5lsn\") pod
\"community-operators-2cfgt\" (UID: \"959463fe-0a5b-4748-bee2-5d1a5d2b3a9e\") " pod="openshift-marketplace/community-operators-2cfgt" Dec 06 06:52:09 crc kubenswrapper[4733]: I1206 06:52:09.457824 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2cfgt" Dec 06 06:52:09 crc kubenswrapper[4733]: I1206 06:52:09.845178 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2cfgt"] Dec 06 06:52:09 crc kubenswrapper[4733]: I1206 06:52:09.956372 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2cfgt" event={"ID":"959463fe-0a5b-4748-bee2-5d1a5d2b3a9e","Type":"ContainerStarted","Data":"1a970c37e3e1ada316e00f843a10391536408a493cd80565d0b8e65bc37a26e9"} Dec 06 06:52:10 crc kubenswrapper[4733]: I1206 06:52:10.965178 4733 generic.go:334] "Generic (PLEG): container finished" podID="959463fe-0a5b-4748-bee2-5d1a5d2b3a9e" containerID="f960902dd8e1e8353ab408a9d6f58b22fa0e6d0fdb8f381c074996d8b49cee1a" exitCode=0 Dec 06 06:52:10 crc kubenswrapper[4733]: I1206 06:52:10.965436 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2cfgt" event={"ID":"959463fe-0a5b-4748-bee2-5d1a5d2b3a9e","Type":"ContainerDied","Data":"f960902dd8e1e8353ab408a9d6f58b22fa0e6d0fdb8f381c074996d8b49cee1a"} Dec 06 06:52:10 crc kubenswrapper[4733]: I1206 06:52:10.967078 4733 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 06 06:52:11 crc kubenswrapper[4733]: I1206 06:52:11.975360 4733 generic.go:334] "Generic (PLEG): container finished" podID="959463fe-0a5b-4748-bee2-5d1a5d2b3a9e" containerID="2ed81cdd1bb7710f7d71c2ce66c23a330b1d0afe45ae02a68955c193b9338b87" exitCode=0 Dec 06 06:52:11 crc kubenswrapper[4733]: I1206 06:52:11.975446 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2cfgt" 
event={"ID":"959463fe-0a5b-4748-bee2-5d1a5d2b3a9e","Type":"ContainerDied","Data":"2ed81cdd1bb7710f7d71c2ce66c23a330b1d0afe45ae02a68955c193b9338b87"} Dec 06 06:52:12 crc kubenswrapper[4733]: I1206 06:52:12.986491 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2cfgt" event={"ID":"959463fe-0a5b-4748-bee2-5d1a5d2b3a9e","Type":"ContainerStarted","Data":"397f80e1b4a78484e8b991bcdd894d295a743b8b28f50377445b7e8211fac4d9"} Dec 06 06:52:13 crc kubenswrapper[4733]: I1206 06:52:13.011768 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2cfgt" podStartSLOduration=2.524495726 podStartE2EDuration="4.011746867s" podCreationTimestamp="2025-12-06 06:52:09 +0000 UTC" firstStartedPulling="2025-12-06 06:52:10.966833827 +0000 UTC m=+4114.832044938" lastFinishedPulling="2025-12-06 06:52:12.454084968 +0000 UTC m=+4116.319296079" observedRunningTime="2025-12-06 06:52:13.002487357 +0000 UTC m=+4116.867698468" watchObservedRunningTime="2025-12-06 06:52:13.011746867 +0000 UTC m=+4116.876957978" Dec 06 06:52:15 crc kubenswrapper[4733]: I1206 06:52:15.486236 4733 scope.go:117] "RemoveContainer" containerID="f47cb59a321dfcb0efc8c6909ad3380200d5a7a5ab1d87ad8d40fc651e322820" Dec 06 06:52:15 crc kubenswrapper[4733]: E1206 06:52:15.486827 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12-6a30-4bf0-a5a1-5a661b82f448)\"" pod="openshift-machine-config-operator/machine-config-daemon-g7qjx" podUID="b9ab6d12-6a30-4bf0-a5a1-5a661b82f448" Dec 06 06:52:19 crc kubenswrapper[4733]: I1206 06:52:19.458763 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2cfgt" Dec 06 06:52:19 crc 
kubenswrapper[4733]: I1206 06:52:19.459071 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2cfgt" Dec 06 06:52:19 crc kubenswrapper[4733]: I1206 06:52:19.496716 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2cfgt" Dec 06 06:52:20 crc kubenswrapper[4733]: I1206 06:52:20.075357 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2cfgt" Dec 06 06:52:20 crc kubenswrapper[4733]: I1206 06:52:20.120747 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2cfgt"]
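The run above interleaves the community-operators-2cfgt pod lifecycle with repeated CrashLoopBackOff "Error syncing pod" entries for machine-config-daemon-g7qjx. As a minimal sketch of how such entries can be filtered out of a journal dump like this one (the `SAMPLE` lines are abbreviated copies of entries above; `pods_in_backoff` and the regex are illustrative helpers, not part of any tool shown here):

```python
import re

# Abbreviated copies of two journal entries from the log above.
SAMPLE = (
    'Dec 06 06:51:50 crc kubenswrapper[4733]: E1206 06:51:50.485500 4733 '
    'pod_workers.go:1301] "Error syncing pod, skipping" err="failed to '
    '\\"StartContainer\\" for \\"machine-config-daemon\\" with CrashLoopBackOff: '
    '\\"back-off 5m0s restarting failed container=machine-config-daemon '
    'pod=machine-config-daemon-g7qjx_openshift-machine-config-operator(b9ab6d12)\\"" '
    'pod="openshift-machine-config-operator/machine-config-daemon-g7qjx"\n'
    'Dec 06 06:52:09 crc kubenswrapper[4733]: I1206 06:52:09.140056 4733 '
    'kubelet.go:2421] "SyncLoop ADD" source="api" '
    'pods=["openshift-marketplace/community-operators-2cfgt"]\n'
)

# On CrashLoopBackOff entries the last pod="..." field carries the
# namespaced pod name; capture it with a named group.
BACKOFF = re.compile(r'CrashLoopBackOff.*pod="(?P<pod>[^"]+)"')

def pods_in_backoff(journal_text):
    """Return the set of pods reported in CrashLoopBackOff back-off."""
    return {
        m.group("pod")
        for line in journal_text.splitlines()
        if (m := BACKOFF.search(line))
    }
```

Running `pods_in_backoff(SAMPLE)` surfaces only the machine-config-daemon pod; plain SyncLoop entries are ignored.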